Ok, easy question. I've read course descriptions, and I've definitely worked around nurses. However, I can't really get a feel for what nursing school actually teaches. Course descriptions mention things like "care for, empathize, nursing theory," etc., but I'm not seeing what that means in practice.
Are you actually learning to recognize, understand, diagnose, and treat disease? I see a lot of information about "nursing theory" and that "nurses don't practice medicine," but I'm not concerned with verbiage. It seems most RNs come out of school unable to interpret ECGs, labs, and simple stuff like that, so what does the education really consist of if nurses have to continuously go and pick up random certifications to be able to know or do anything? I just want to know what you're really learning out there, i.e., what knowledge you've gained and what you can do now as a result of it.
Note: none of this is intended as a stab, but I seriously am not getting what nursing school teaches. Thanks for any replies.