My mom, who is a nurse, recently told me a story about a doctor who became rather sarcastic toward her and her female coworkers, and she said his god-complex attitude leads him to act that way often. It makes me wonder how, when I graduate and finally settle into a job, the doctors will treat me and my male counterparts. I know it is probably no big deal, but I wonder if male nurses tend to be treated differently on the job, whether better, worse, or about the same.