The image of nursing is so bad. While I was touring a hospital in the area where I live, the staff, nurses and non-nurses alike, repeated, "You don't want to continue with nursing. Get out while you still can."
I hear this frequently, even when I am out in public, and I find that I must defend my career. What kind of image do we, as nurses, send to the public, to our patients, and to their families?
I worked as a care assistant for two years during nursing school, and throughout those two years I was told to get out while I could. The nurses told me to go to PA school or medical school instead, where the real money is.
I'm proud to say that I will have my BSN in May and will continue on for a Master's within a year or two. I am discouraged, though, by the image that is being set forth not only by the media, but by many of the nurses I have encountered in clinicals and in the workplace.
Mar 6, '03
Yes, you are right--sometimes we are our own worst enemies. I feel that many, if not most, nurses feel powerless when it comes to working to change our working conditions or our patients' conditions, often because they have been beaten down by years of poor working conditions and consistent disrespect from doctors and administrators. Many healthcare facilities are all about profit, not patients (even though these are often the ones that make the most noise about how patient-focused they are).
Stick to your plans of continuing on with school and making a difference. It is possible to make a difference in the right place--just be very selective about where you work, and stay away from facilities where nurses are treated poorly. (I'm looking for such a place myself!)
Last edit by spineCNOR on Mar 6, '03