I'm a high school senior interested in pursuing nursing or another health care profession (such as speech therapy), and I'm planning on getting my BSN. I'll admit that because of my recent interest in nursing, I have learned a LOT about the career. Before, I wasn't even quite sure what nurses did; I knew the stereotypes, but I never actually thought about it. Now that I see how difficult it is to get into nursing school, the performance demanded of nursing students to get through it, and the hours it takes to become a nurse...it's a lot more training than the general public realizes. Hell, I'm sure there are people out there who think becoming a nurse is easy.
And that's the problem. As awesome a job as I think nursing is, and as much as I respect nurses for everything they do, I'm worried about how the general public perceives nursing. I read so much online about how nurses are disrespected, how people don't understand what nurses do, and how the media portrays nurses (either as handmaidens or as naughty nurses). I probably won't be able to watch Scrubs anymore, one of my favorite shows, simply because of the way it tends to portray nursing! I'm very sensitive to this, especially because I can't stand being disrespected. I'm afraid that if I became a nurse, I wouldn't be able to just "shrug off" any disrespect I received for being one. I'm afraid it would really get to me, hurt me, make me defensive, or even push me into another job.
So I would really like to know what you nurses think. Do you often feel disrespected, or do you think nursing has a more positive image now? Do more people seem to understand how much work it takes to become, and be, a nurse?