I work alongside nurses and am in the process of becoming a nurse myself.
While I have great respect for the profession and my coworkers, as well as a passion for it (hence my decision to attend RN school), I've noticed that many nurses (though not all) carry a superiority attitude, as if theirs is the only job in the healthcare field that matters. They talk down to the other professionals (respiratory therapists, social workers, occupational therapists, etc.), disregard the hard work those colleagues do, and expect a pat on the back for every little thing they do themselves. I've even heard some nurses say they should be paid more than the doctors because their work is more important.
Also, I hear many nurses complain that they have too much to do, yet when anyone tries to lend a hand, they turn it into a turf battle, acting as though everyone is trying to take over their job and isn't competent enough to do it, even when it's something as simple as helping bathe a patient.
What's your opinion?