Is nursing as bad as others say it is? I just got through reading a thread where nurses said they would not recommend the field: too much stress, too much dirty work, and feeling unappreciated by coworkers. Is this true for an RN? Many also said the money is not up to par. Is the salary for RNs overstated by salary.com? I understand that entry-level pay would not be great, but I would expect to start making more after 2-3 years of experience. With that experience, do the stress and pressure ease up a bit? Any opinions and suggestions are appreciated, as I am trying to figure out whether this field is right for me.