Discussion overheard at lunch today: an ER RN (white, female) "discussing" her opinion with a Respiratory Therapist (white, male) and an EMT (black, male) that within 10 years white women will no longer be the majority in bedside nursing. Her argument was that women have more opportunities to go into other professions than when she was young (guessing she's in her late 50s), and that since we are importing nurses from "third-world countries," no white woman will be a bedside RN in the future. The EMT agreed and said it's not just white women; he wouldn't want his daughter or son to be an RN either. The RT kept quiet and continued to eat his tuna sandwich.
So, I wonder, do any of the rest of you feel that way? That is, that women (white, black, brown, etc.) in America have more opportunities and more career choices now than ever before, and that nursing will become the immigrant's first job in America?