The recognition of nursing as a profession has gained awareness in our current culture, somewhat piggybacking off the Women's Rights movement. Even though many nurses are now male, the stigma associated with nursing has its roots in a time when women, who made up the nursing workforce, were thought of as hysterical at worst and not equal to men at best. Doctors, by contrast, predominantly male, have commanded respect, and been paid well for that respect, for years.