We go to college for an education. At the time, we often think it's mostly about our degree and passing classes. But once the classes are over and we've graduated, become "Real Nurses," and worked on our own? Turns out, some of our educators did more than teach us course content - whether we realized it at the time or not.