I just watched the show "Hawthorne" on the TNT channel. It's a new drama about nurses. I liked it... It seems to be a realistic portrayal of what nurses do. I'm glad there's finally a show about nurses. Nurses deserve immense respect for what they do... They're just as important as doctors imo. Did anyone else watch it? What did you think?