Every time this subject comes up, I'm reminded of a conversation I had way back in nursing school in the Dark Ages (early '80s, actually) with the husband of a friend. He was an attorney with no connection to nursing or the larger healthcare community. I had originally met this friend in the local NOW chapter, and all three of us were ardent feminists. I was burbling on happily about the value and importance of recruiting more men into nursing, which was a hot topic at the time (and there were four male students in my freshman class of nearly 100!), when this male attorney acquaintance cut me off to explain that nurses were crazy not only to let men into nursing, but to actively encourage them to enter -- "Don't you realize, the men will take all the good jobs, they'll all get the promotions and the cushy positions, and you (the female nurses) will still be hustling up and down the halls, emptying bedpans! Nursing and teaching are the only professions in which women have the advantage and run everything; why on earth would you want to give that up?" His opinion was that we should be fighting tooth and nail to keep men out
would you want to give that up?" His opinion was that we should be fighting tooth and nail to keep men out
of nursing. It sounded like crazy talk to me at the time, but I've reflected on it many times over the years since then -- and it doesn't sound nearly as crazy to me anymore ...