Is nursing as bad as some say?

I am currently working on getting into a nursing program. I had been so darn excited until I started reading some posts about how bad nursing is and how Obamacare will destroy the entire health field. Is all this really true? I know Obamacare isn't going to help anything and will affect nursing, but will it completely destroy it? Is there something I am missing? Is the nursing profession really like high school with all the drama? Basically, I want to know that there are nurses who LOVE it and are proud to be called a nurse. Or is everyone upset that they chose this path?

To address the original question, I fall into quite a few of the camps mentioned. It was a calling, but I was also stuck in a dead-end job paying barely above minimum wage. I was a high school dropout with a GED who had kids way too soon. The opportunity for school arose and I grabbed it: for a better education, more money, job stability, and the calling.

That was almost 10 years ago and I don't regret it. I firmly believe that in nursing you need to find your niche or specialty that makes you happy. For me, it's definitely NOT in a hospital. I enjoy being out in the community, so I chose hospice (or rather it chose me). I tried many areas before hospice and while I was not miserable in any of them, they never felt "right". Hospice feels right.

You will always have employers and conditions that will make you question the sanity of choosing nursing. It's important that you try to separate that from the actual tasks of nursing.

Specializes in NICU, ER, OR.

Yes. It's all that and worse.

If every nurse stood together, nursing could be even more amazing for everybody, not just for those who found a decent job where they don't have to sell their soul to make a difference.

Yep, still thinking of ways we could make it better.
