I read a lot of posts, and my aunt, who has been an LPN for over twenty years, has told me to stay far away from nursing. She works in LTC, I believe, mostly nursing homes. I don't think I have ever heard her say anything good about the field. I want to be a nurse. I admire the field and the men/women who do it every day. I think if I'm going to work, and work hard, it should be doing something that benefits mankind as a whole. I can't think of anything better than helping sick people. Corny, maybe, but I'm a corny person. My question is...does nursing turn you into an angry, bitter, burnt-out person? Does that only happen if you let it, or is it just inevitable given how the whole system is set up?
My goal is to make a career out of nursing. I want to learn as much as I can and advance as far as I can go. I come from nothing: high school dropout, married young, kids young, and now I'm one class away from being accepted into the LPN program. I want to make my children proud of me. What are your opinions, nurses? Does your job make you a hateful person?