I am in my first semester of nursing school and am currently writing my first paper for theories. It is an adaptation of Notes on Nursing in which we clarify our own definitions of what nursing is and what it is not.
While writing about what nursing is not, my mind was flooded with ideas. For example, nursing is not female-only, nurses are not wanna-be doctors (worded differently, of course), nursing is not glorified babysitting, etc.
I am curious what your thoughts are (as nurses) regarding what nursing is not. I love reading stuff like this and hearing people's opinions.
Thanks in advance for your thoughts! I respect all of you so much and am excited to read about what you think!