Published Mar 21, 2016
tnbutterfly - Mary, BSN
83 Articles; 5,923 Posts
I just saw this question posted on a blog.
One of the responses was:
I don't think the issue is that the word "Nurse" is somehow sexist, it's that the profession remains gendered. As a consolation prize, men who enter Nursing are more likely (statistically) to be given administrative roles, and more likely (statistically) to be given academic leadership roles.
What are your thoughts?
Horseshoe, BSN, RN
5,879 Posts
The title "Nurse" is not gendered, so no. It's also a legal title in most states.
brownbook
3,413 Posts
I don't know about the "statistically" information. I'm old and still occasionally think (assume) male when I hear the word doctor! But I am finally getting over that! So yes, a lot of people think female when they hear the word nurse, but that will change.
I don't understand how, or why, we suddenly stopped using the word "actress"?
TheCommuter, BSN, RN
102 Articles; 27,612 Posts
The word 'nurse' is not sexist by itself. However, nursing remains a female-dominated profession just like elementary education, social work and librarianship.
The glass ceiling is also evident in nursing. Males in this profession tend to be promoted to management faster, rise through the ranks more swiftly, and earn more money than their female counterparts.