Why does it suck to be an RN these days? It's everywhere, in every area, and I am so tired of being treated like dirt! We nurses do most of the work, yet the docs and the big hospital systems treat us like peons. When did we lose the professionalism? When did being an RN become a negative profession? Where did all the jobs go?