today i was chatting with another rn who believes the title "nurse" is outdated and may not represent our field as well as it used to. perhaps the title even turns men off from pursuing a nursing degree, sounding way too feminine for a man to own. not to say we don't nurse patients back to health and nurse our drinks while griping about work, but we're not all nursing babies either!
what are your thoughts on this? do you have any suggestions for a better title for us nurses than "nurse"?
i think "direct care provider" as a title could curb our patients'/residents'/clients' habit of calling out "nurse!" as if it were a curse word.