I am a new nurse, a male nurse, and a recent graduate from nursing school.
Although I have only been in the profession a short time, it is clear to me that the term "nurse" is not politically correct. I know there are people who will argue against this, but that happens whenever people suggest looking for politically correct alternatives. So, the question is, what word will replace "nurse" as a more politically correct job title? It seems that "RN" has the most traction, except that it isn't really a title so much as an acronym. I can't really think of any other candidates; things like "caregiver" seem too general.
I have searched these forums and found a couple of discussions, but no clear suggestions for real alternatives.
And, for the record, this is why I think the word "nurse" needs to be reimagined:
- The definition of nursing as "breastfeeding."
- The definition of nursing as "to hold closely and carefully or caressingly."
- The definition of nursing as "to hold (a cup or glass) in one's hands, drinking from it occasionally."
- The historical implication of nurses as submissive, non-autonomous, and following a doctor's orders. (They changed "waiter" to "server," right? Because their job is to serve food, not wait on customers. Likewise, a nurse's job is much more than passively sitting with patients and "nursing" them while the doctor does all the work.)
- The stereotype of a woman in a white outfit with a cute white hat.
- Most of all, because when I get home from 12 hours of work, I have absolutely no feeling that "I nursed my patients today."
Obviously, much of this revolves around the issue of gender and the fact that "nurse" is a somewhat gender-biased word, but even for women I don't think "nursing" accurately describes what we do each day as RNs (or LPNs, for that matter).