Hi everyone. I'm very frustrated with how little the general public understands about what being a nurse means. My own mother doesn't even seem to get it. When I told my father that I had been accepted into the nursing program
he seemed excited, but before getting off the phone he asked, "So do you think you'll become a doctor some day?" If I wanted to be a doctor, I would go to med school. Being a doctor and being a nurse are two totally different things. A nurse is not a step down from a doctor.

Don't get me wrong, I'm not saying that doctors aren't necessary or that they don't deserve respect for their knowledge, because they absolutely are necessary, and they do deserve every bit of respect they've earned. I just think that nurses deserve respect too, so much more than they get. People just have no idea what being a nurse is all about. I hate the perception that all nurses do is change bedpans, bring patients something to drink, and follow doctors' orders. I mean, yes, these things are part of the job and I don't mind doing them, but there is so much more to it.

The problem is, I don't even know how to put the job of a nurse into words. How can I explain to people what nurses do and the amount of knowledge required to do the job? Any suggestions? Is anyone else frustrated by this, or is it just me? Should I just not care what everyone else thinks? Thanks for letting me vent.