I think one of the most interesting points from our
discussion yesterday was the issue of some doctors being paid to promote drugs
to patients. My initial questions were how many doctors take part in
this and to what extent I or my family members have been affected.
To give doctors the benefit of the doubt, my initial
reaction was that only a select few gained financially from this type of
practice. However, I found an article online that referenced a survey
stating that as many as three quarters of doctors have some kind of financial
relationship with pharmaceutical companies. I was a little taken aback by this! That is a lot of doctors!
Relating this all to the text, I now wonder what the “good”
is in the healthcare industry! For me personally, the good is my own well-being
and personal health along with the health of my family members. To the
pharmaceutical companies, the good seems to be making money. The “good” for the
doctors seems to be a combination of both. When doctors enter into financial
relationships with pharmaceutical companies, it sends the message that their
own financial gain matters more than my well-being. My trust in them is
greatly diminished because money is getting in the way. I’m sure not all
doctors take money, but it certainly sends a negative message about the
entire healthcare industry to the general public.
I also wonder the same thing. The good was always something I struggled to get a good grasp of, but in terms of health care it’s even harder to understand. A doctor’s job is to protect and promote good health for others, but at the same time their responsibilities also lie with earning money for their families. So I pose the question: is it ethically wrong for doctors to take incentives from pharmaceutical companies for prescribing medications they would give out anyway, the only difference being that they use a particular brand? Who’s to say the meds they give won’t do the same thing for the patients; it just benefits the doctor as well. Do you think it’s right under those circumstances?
I have no idea if it is wrong. I suppose it is very dependent on the situation.
Do doctors need the extra money for their families to the point where they need to take money from pharmaceutical companies? Normally I would say no, because being a doctor is usually associated with a high income, but every situation is very different.
I would say that it is still ethically wrong. Assuming the drugs do exactly the same thing, the only differences are the brand name and possibly the price. Should I be buying a more expensive drug just because the doctor got paid to prescribe me that brand? I don’t think so...
Doctors have left a very sour taste in my mouth. I believe that even if they do not make money from prescription drugs like we talked about, they are still neglectful about how drugs are given to older populations. Of course doctors have to watch students on campus; that is the way the law has been set up in hopes of combating a very dirty drug issue in America. Doctors just hand out drugs without understanding whether they will counteract what we are going through. I also feel we can look at alternatives other than a drug that makes you lose a kidney or causes other problems. From what I have seen in my personal life, doctors do not look out for our good. I think Shareef is on to something here. Is it right? And who pays for it when patients end up worse off?