Of course nursing has evolved and many men are in the field. In my short career as a nurse, I've worked in places that are about 95% women. Right now it's just me and 3 other guys in my workplace.
Now I get these weird feelings sometimes. Certain female staff will approach me and talk to me a lot, and the other guys say it's because "they wanna show other women that they can talk to a guy." I just find that weird.
Others say, "you're working with women, so enjoy the time." I don't know why they're telling me this stuff; I'm just here to help out patients. I'm utterly confused. For the record, I'm not gay, and I am single. Many of the workers have asked me about that. No one at any of the other places I've worked behaved like this.
Maybe it's because I'm young at 23? A lot of the senior staff call me "handsome," which I think is inappropriate, and they ask me if I have a date tonight, etc. When I don't answer, or if I answer sarcastically, somehow I'm the dick.