Pharmaceutical Companies In America Are For Profit...BIG PROBLEM

Pharmaceutical companies in America, or anywhere else, should not be "for profit" businesses. For profit means the more you sell, the more you make financially. Most of the medications we take have substantial side effects. It makes sense to me: you take one prescription for a specific problem, and the next thing you know, you need another prescription because you now have to treat a side effect of the medication you're taking to fix the prior health issue.

What happens is that money rules the American political system. Our own country knows exactly what's going on. Unfortunately, politicians care more about getting into office than taking care of the citizens of America. The pharmaceutical companies use their financial strength when it comes time to vote someone into the White House. One hand washes the other. The natural result is that the pharmaceutical companies own America. Why? Because they make so much money keeping us sick. If they didn't, they would go out of business. That's what happens to for-profit companies. What do you think? Remember that Americans take more prescription drugs than people anywhere else in the world.

Very true

That is one side of the coin. On the other side is the research and development that goes into creating necessary, lifesaving drugs, the ones keeping many of our members across multiple networks here with us.