Why is it that nurses are told, from the beginning of their education and throughout their careers, that nursing is a higher calling? Is this being taught to management and administrators as well? Those are rhetorical questions. As a man in a female-dominated profession who came from the corporate world, I find the "higher calling" framing rather disingenuous, and it would never be tolerated in male-dominated professions. I'm expected to take care of you or your family, and in return I get to work in sub-par environments for sub-par compensation. Why am I treated as nothing more than a liability on the balance sheet?