I am about to begin nursing school in a few weeks, and everywhere I turn I seem to hear nothing but negativity toward nurses. It sounds like nurses are at the bottom of the food chain, but is that really true? People keep saying to me, "Why would you only want to be a nurse?" - as if it's a bad thing. I realize that you have to earn your own respect, and that it isn't easy. Maybe it's just a mix of media and too many fears swimming through my head. Are nurses really as disrespected and disliked as everyone makes it sound?