Hello All,
Does anyone else feel like there is a stigma that comes along with nursing? Like, do you ever feel like people are judging you based on your career choice? Do you feel like nursing isn't as well respected in society as it should be?
My parents and a lot of my friends are being pretty critical of my decision to become a nurse. I can't even count the number of times I've been asked, "But wouldn't you rather be a doctor? Why not try for med school instead?"
Just wondering if anyone else has experience with this sort of thing. How do you handle it?
Love, -NR :heartbeat: