Okay, I have a question for some of you guys (and any gals who are looking around who might have an opinion on this): I'm a woman who spent several years in a male-dominated field (military), and as such, I'm pretty accustomed to off-color jokes, etc. During that time, I got used to having to take charge of things in my unit, and basically act as a man would in order to gain the respect of my peers. Now that I'm entering a female-dominated field, I'm wondering if the traits I developed in the military to deal with my all-male coworkers are going to become a hindrance. There are certain characteristics that I feel will be beneficial to my patients, but at the same time, I don't want to tread on any toes by seeming disrespectful. To make the situation worse, I live in the South, where women are supposed to be meek and mild and stay at home (at least in the small town where I live).
So, guys (and gals): do you think you would rather work with someone who is straightforward and to the point, or someone who is more feminine (in the sense of being mild-mannered)? Mind you, there are only a handful of males in my class of 40 (around 7, I think), but many of the women are the epitome of the Southern belle.
Any input is appreciated...