I am currently working on getting into a nursing program. I had been so darn excited until I started reading posts about how bad nursing is and how Obama Care will destroy the entire health field. Is all of this really true? I know Obama Care isn't going to help anything and will affect nursing, but will it completely destroy it? Is there something I am missing? Is the nursing profession really like high school, with all the drama? Basically, I want to know that there are nurses out there who LOVE it and are proud to be called a nurse. Or is everyone upset that they chose this path?