So there's Grey's Anatomy, House, and ER on mainstream TV...then you've got stuff on cable like Strong Medicine, Life in the ER, etc. Most of these shows revolve around physicians or are little better than soap operas, and on another thread a letter-writing campaign is getting started to advocate for better portrayal of nurses on these shows.
WHY is there not a show that is about NURSES??? At least on the main networks: FOX, ABC, CBS, and NBC? It would be a great way to promote the profession and educate the public on what we actually do! It could become just as big a hit as those shows and have plenty of drama of its own. It could even explain the various roles of CNAs, LPNs, RNs, nurse managers, CRNAs, etc. Anybody have any thoughts on this?