After working for many years in another medical profession, I decided, when the opportunity presented itself (tuition assistance, grants, etc.), to get into the profession I had wanted right after high school. You see, my father was from the generation (and some still are) that believed men did not become nurses. And since he was the one paying for college, my father and brother enrolled me in a medical profession they perceived as more male-oriented.

Now that I am an RN, there is a lot I like: talking to people, the autonomy (although limited), the bonuses, more patient contact, giving compassion, empathy, and understanding, the public respect, imparting information to others, the job security (far better than in other professions), and the interaction with doctors, respiratory therapy, and other supportive and technical departments.
P.S. I read on this website that there is now a national nurses union. I have NO experience working under a union. Let me know what you think. At work, it appears the majority are pro-union. I get comments like "Under a union, patient safety will improve" and "We will get better working conditions," etc.
Thanks for any input.