I constantly hear stories of how doctors, fellow employees, supervisors, and disgruntled patients alike use nurses as a punching bag. Given all of that, do nurses really get respect for anything they do, from anyone? And a second question: are nurses actually viewed as "professionals," or are they just lap dogs and ass wipers for those above them?