Hey guys, after reading a ton of these articles and listening to people talk, it sounds like nursing is not very rewarding and not worth the struggle of schooling. I've wanted to be a nurse for years, but I'm starting to get discouraged. I hear so many people complain that they don't get paid well, that they're treated badly by doctors and patients, that it isn't worth it, or even that they CAN'T FIND JOBS, even though I was told nursing is in high demand. I don't want to struggle and fight through all the hard work for something that won't pay well, won't feel rewarding at the end of the day, or won't even lead to a job when I graduate. It's making me nervous to pursue this career now. Are there any nurses who can say something positive, or share how they actually enjoy what they do?