As a second-career nurse, I have over 20 years of work experience, and this past month at my new RN job I've been rudely slapped in the face by just how horrible nursing culture is. What is up with this? I had some inkling of what was to come in nursing school, but I'm still shocked. As a feminist, it saddens me that I can't help thinking this has to do with the fact that the overwhelming majority of nurses are women. My complaints so far:
1. Paranoia
2. Coworkers writing each other up or chewing each other out
3. Gossiping about others' mistakes or lack of skills/experience
4. Unfriendliness to new nurses that borders on rudeness
5. Looking down on CNAs, reinforcing a hierarchy that nurses themselves hate when they deal with doctors
[Just last week a nurse manager at a staff-wide meeting told the CNAs that there were plenty of people lining up for their job. I was shocked!]
But maybe this kind of brutality happens at other "blue-collar" jobs, too, and I've just been oblivious because I've always worked pink-collar or white-collar jobs.
Why is this happening? What's the dynamic here? Is this workplace hostility coming down from management?
Every time I say something or make a suggestion, something I would do at any other job, I'm looked at like I'm from outer space or like I'm some kind of idiot. For instance, I'm on orientation, and I asked if I could take one day to rotate through other parts of our hospital so I could get a handle on the institution as a whole. I was shut down and given an eye roll by my manager.