The way nurses are portrayed in Hollywood absolutely drives me nuts. No wonder our patients don't know a thing about "real" nurses, because on Grey's Anatomy, House, Scrubs, ER, it's the docs who do everything.
How nurses preach to our patients to be healthy, yet many of us are extremely overweight, smoke, eat the horrible food our patients and doctors leave us, and do nothing to keep our minds and BODIES healthy.
How we are expected to wipe the doctors' butts when they do something wrong, and WE have to point it out to them and make multiple suggestions regarding the plan of care.
How we are expected to watch our patients be absolutely miserable to us because the doctor won't let them leave, yet they treat the doc with the utmost respect. (I just had a pt with a 60-beat run of V-tach who couldn't understand WHY he couldn't go home?!?!)
How we are expected to take verbal and physical abuse, sexual remarks, and other inappropriate behavior, and not "judge" these people.
Those are just some of the things that are wrong within the world of nursing. I agree with everyone else's statements.