Has nursing changed you as a person?


This is a question I've been thinking about recently. I wanted to post it here because, as a future male RN, I suspect other male RNs will have the most relevant experiences.

So, have you changed since becoming an RN? Was it a positive or a negative change, or a little of both? Or have you largely stayed the same?

I'm asking because I think nursing would change me. I presently work in research, where things tend to be cool and intellectual. In nursing you're caring for others, and there are far more emotions involved.

Plus, I suspect that caring is not a one-way street; if in your work you care for others and ask them to trust you, I would think that in other areas of your life you may find yourself opening up to others, asking for their help, and trusting them.

Which is, let's face it, scary for most men, who generally tend toward being stoic and independent. That goes double for me; going through school as one of the "nerds" has left me uncomfortable with self-revelation, even to family and friends. But that's what I may face as I proceed down this path...

So, is my theory totally off base? Or have you found that your relationships with others have changed since working as a nurse?

Specializes in Oncology, ID, Hepatology, Occy Health.

I think we're shaped by all of our experiences, and any job experience is bound to influence you. Nursing may have changed me, but so have the other jobs I've done outside of nursing.

Nursing has probably made me harder. I think I'm a much colder person than I was when I started out, but I don't actually think that's a bad thing.
