I'm really feeling burnt out on taking care of people who are self-destructive and have no interest in doing anything to make their lives and health better. I'm sick of a system that acts like a big, enabling wet nurse to people who go right back out and continue with all the same bad habits, only to land back in the hospital and suck on more of the healthcare titty that puts them back together again so they can go out and screw up their health some more. I'm sick of the total lack of responsibility I see every day I work.
I was talking to my stepbrother, who is a chiropractor. His clients are a different group: they're interested in health maintenance and motivated to improve their health through their own efforts. He's really doing something for people. I'm not. Sure, I have a good bedside manner and manage to connect with my patients and gain their trust. But basically I hate the American healthcare system, totally and completely. At this point I'm a nurse only for the money.