Nursing without corporate influence


I've been a nurse for six years and, since becoming a nurse, I've worked for either a corporate-owned hospital or a home health agency. I'm tired of the corporate influence, with money being more important than patient care and employee satisfaction. I'm racking my brain trying to come up with something I can do in nursing that doesn't involve working for a corporation. I love taking care of patients, but I'd like the luxury of having the time to do it right without the hospital constantly being short-staffed and assigning more and more responsibilities to nurses with fewer and fewer resources. Any suggestions?

I don't want to sound mean.

But every job out there involves money, financial responsibility, and all that.

Not just nursing.

Your best bet would just be ending up somewhere with a strong, positive culture.

And if you're able, learning and understanding that financial responsibility can be positive and improve quality and the patient experience.

I understand that financial responsibility is necessary, but the bottom line seems to be the only concern where I've worked, even at the faith-based, not-for-profit hospitals. I'd love to find a place with a positive culture, but I don't know where to look, and I hate to keep leaving jobs after only a few months or a year to find it. I'm thinking about traveling so I can look for good places without a long commitment.