How nursing is portrayed in the media means a lot to me.
I think one of the main reasons I almost chose NOT to become a nurse when I was in high school was that I kept seeing medical shows where the doctors did nearly everything.
Shows like "Nurse Jackie," "Mercy," and "HawthoRNe" have been somewhat helpful (in my opinion) at getting nursing into mainstream media, but they still play off some annoying stereotypes. Nurse Jackie snorts drugs. "HawthoRNe" has the male nurse who COULD have been a doctor but "choked" on his MCATs and is "stuck" being a nurse, plus Candy, the typical Barbie nurse.
Okay, so I know that part of a show is all about the storytelling, and storytelling only works when the characters have flaws. But do you think these portrayals help or hinder the profession? Are they realistic, or are they just playing off established cultural ideas?
What do YOU want to see in the media about nurses that isn't there yet? What do you LIKE about these dramas? What do you NOT like? What do you think the impact is on society's concept of nursing?