I recently went to see my doctor for a quick and easy procedure. She gave me multiple prescriptions, including Vicodin and Xanax. I wondered to myself, "Are these drugs necessary?" That's a lot of money if they're not, so I decided against filling them. And, as it turns out, they weren't necessary.
I'm not sure if I'm going to go back to that same doctor. I don't like doctors who are willing to just hand you drugs (i.e., make you spend money) when they're not needed. It makes me think she didn't have my best interests at heart.
So, I was interested in reading the answers to the question, "Do you trust everything your doctors say?" Here are some of the answers:
Anonymous: I read up on what the doctor is saying, because I've encountered a lot of doctors who come in, talk really fast, quickly write you an Rx, and send you on your way without addressing any concerns. And then they go out into the hallway and resume a friendly meeting with the pharmaceutical sales rep, who gives perks to the doctors.
tropicalmama: I trust my dr, but I still do my own research. Doctors don't know everything; new treatments, discoveries, meds, etc. are constantly being made. You might find something that your dr hasn't heard of yet, or that they don't use as a first resort but that you would prefer to try first. So, I trust my dr far enough to take their opinion of what the problem is and, unless the treatment is really out there, their treatment plan as well. But then I go home and start researching to see what I find. Or, if I'm pretty sure of what's wrong, I research before I go in so that I already have some knowledge and ideas to discuss with them.
SabrinaMBowen: I do my own research!!! I never trust the doctor at face value unless it's a common cold, and even then I rarely get the meds they prescribe until I talk things over with the pharmacist and look things up.
Do you always trust what your doctor tells you?