I am not even a nurse yet, but my short time working in the hospital as an aide showed me how much nurses do, how smart they are about patient care, and how little the doctors are even around. Honestly, sometimes they seem to just not get what is actually going on, and they can come across as pretty clueless...
Yet somehow medical shows continue to portray the situation unfairly. I actually got into a small disagreement with someone about the show Scrubs. They said that, all in all, it was one of the better ones, but one of the nurses makes a comment about never going to college (in which case she couldn't have been a nurse), there is a particularly irritating scene where a doctor fires a nurse, and you also see the doctors sending the nurses to new assignments.
Anyway, I know it is just TV, but it upsets me because so many people are just ignorant of what nursing is all about.
I know I was. I wanted to be a doctor mostly because I wanted to challenge gender stereotypes, but one stay in the hospital turned my perspective a full 180. The nurses are the ones taking care of the patients!
Anyway, I don't want to bash doctors, but I think maybe we need a new outlook on the way we view the healthcare team.