I had a heated debate with my fiancé last night that turned into an argument about how the average person doesn't really know or appreciate the knowledge an RN has. He's not the type to bash anyone, and what he was saying was true, but I just got so upset. I don't like that, like him, most people think the RN doesn't know much and basically has to follow what the doc says, and that the doc has the final word. I explained the RN's responsibilities to him, for example, catching it when a doctor prescribes meds the patient can't tolerate. I explained that there are nurse practitioners too, but his argument was still that to the average Joe, an RN doesn't know much and the doc gets all the credit. I hate to admit it, but I'm starting to believe this. Before I entered the field, I had the same perception and thought that the medical assistant, the CNA, and so on were all just "nurses." I even went to a clinic once and the doctor referred to the medical assistant as "the nurse."

I am not happy about entering a field where I feel I always have to explain to people that our importance is more than giving a bedpan and blindly following doctors' orders! Even one of my instructors bluntly said that 90% of the time, the nurse just passes medications. My jaw dropped.
On the other hand, I hear other instructors say that because they were RNs, banks looked at them more favorably during a mortgage application, and that in some cases a bank may only ask you to put down $1,000! I said this to my fiancé and he said nurses don't get that much respect in our society. What are your thoughts and experiences?
It'd be nice to be recognized and appreciated for my work, but at the end of the day, the important things to me are performing my job well, getting paid decently, and having flexibility.