I saw a nurse post this last night on another site I frequent. She said nursing needs to get back into the hospitals and out of the universities.
Why would she say that? For any of you older nurses who might know, was the training more hands-on back then?
And when they did teach in the hospitals, were classes held in the hospital as well?
I was gonna ask her directly, but she'd probably never see my question since that site isn't really a message board like this one.