Drugs and faith

When a drug has an effect on the way you feel, you don't need someone to tell you it works.
All the medical drugs designed to alter bodily conditions without affecting how you feel must be taken on faith.
We cannot feel them working and so their effects are unclear to our consciousness.
Instead we take the word of a doctor who says it has this (desired) effect, but to look out for that unwanted effect.
Whether this information is good or just a marketing spiel is anyone's guess!
How trustworthy is the process from lab to pharmacy? Who controls the industry, and is their #1 motive improving people's health or something else, such as profit?