So I am VERY excited about nursing. I will find out in two weeks if I am accepted into the RN program. It really interests me and seems like something I will enjoy. The only thing that bothers me is that EVERY nurse I have spoken to tells me they hate their job and that if they could take it back they would have gone into a different profession. They tell me not to do it. Has anyone else come across this? It really does bother me; it's discouraging to hear people who have the job I am so eager to have speak so negatively about it. I want to become an RN and eventually go back to school for my Bachelor's or Master's, possibly to become a Nurse Practitioner. Do any RNs here have any input on why you like or dislike your job? Thanks for any info!