This is a question I've been thinking about recently. I wanted to post it here because, as a future male RN, I suspect other male RNs' experiences would be most relevant.
So, have you changed since becoming an RN? Was it a positive or a negative change, or a little of both? Or have you largely stayed the same?
I'm asking because I think that nursing would change me. I presently work in research, where things tend to be cool and intellectual. In nursing you're caring for others, and there are more emotions involved.
Plus, I suspect that caring is not a one-way street: if your work involves caring for others and asking them to trust you, I would think that in other areas of your life you might find yourself opening up to others, asking for their help, and trusting them.
Which is, let's face it, scary for most men, who generally tend toward being stoic and independent. It's especially so for me; going through school as one of the "nerds" has left me particularly uncomfortable with self-revelation, even to family and friends. But that may be what I face as I proceed down this path...
So, is my theory totally off base? Or have you found that your relationships with others have changed since you began working as a nurse?