I keep hearing so much negativity about nursing school, the first year out, and the profession in general. WHY THEN BE A NURSE?
I have just quit teaching after 10+ years, and I see how easy it is to get caught up in the negativity of an "industry" (i.e. education), but if we all quit, then who would be left to teach our children? There are things that are amazing about teaching, but it has its downsides, too. That's why they have to pay teachers...duh.
So now that I'm fulfilling a desire to become a nurse that I had to put on the back burner years ago, I'm wondering why I'm doing it when I read so much negativity.
I don't know if the horror stories about nursing school come from students who have never really struggled with anything before, so it's a major shock, or if it's genuinely horrible for everyone.
I don't know if things actually get better after you get your license or not. From a lot of stories, it just gets "worse." WHY THEN would anyone want to enter this profession? If you're just doing it for the money...don't get me started.
It's even made me question myself and my own goals. So please, tell me the best and worst of nursing school, the first year, and the career. I've seen plenty of arguments on WHY NOT to do it, plenty of reality checks...I'm over 35...got plenty of those.
SOMEONE PLEASE GIVE ME AN ARGUMENT FOR WHY ONE SHOULD CONSIDER NURSING! PLEASE TELL ME THERE'S SOMETHING GOOD OUT THERE!!