Published Oct 14, 2005
MUCstudent
1 Post
Hi everyone!
I'm a first year nursing student in an RN program and I'm doing a paper on the nurse-doctor relationship. I would love some opinions from other nurses. Is it good/bad? Is it improving? What steps do we need to take in order to become true COLLEAGUES with doctors? Etc....
I have a lot of info about trends in history, but I really want real people's opinions. Thanks so much.
Jenny:rolleyes:
ShayRN
1,046 Posts
I think for the most part I have a fantastic relationship with the doctors I work WITH, as opposed to work FOR.... Some doctors still think that they can treat nurses like we are idiots, but really they just make themselves look bad in the end. The ones who start on me readily back down when I take the same tone with them that they are taking with me.
realnurse
7 Posts
Hi, I would like to give my opinion on this. The doctor-nurse relationship is very important. At work they should function as a team, and the professional nurse should discuss the patient's treatment with the doctor. Some doctors still think that nurses only take orders, and that view needs to change.
CoffeeRTC, BSN, RN
3,734 Posts
Just watch ER. :rotfl:
I think it depends on the area you work in.
I work LTC. Some docs are great, trust you and your nursing judgment, etc., and will ask what we think. Others will just play God and bark orders.