I originally entered college for my nursing degree in the late '80s, changed majors because of all the negative feedback I received from those around me, and have now changed careers to go back to school for nursing.
I am still getting tremendous grief about the horrors of nursing (though not as much!). For those of you who have been in the field for the past 20 years, what are the biggest changes you've seen in nursing since the late '80s?
I really appreciate it. Thanks!