After spending a lot of time on this forum reading various posts and speaking to RNs I know in real life, I am utterly AMAZED that there seems to be a generally territorial nature in the nursing profession. Again, I'll emphasize the word "generally" because there are no absolutes when dealing with the attributes of a whole population of people. Although I am strongly considering pursuing RN/APN, I have never seen in my current field (male-dominated) as many negative attitudes, judgments, or tearing down of folks as I have seen here and heard about in the real world in nursing. I believe that for any profession to thrive, it needs mentorship. I mentor interns and fellows in my current field and I really enjoy it. I love watching them develop and grow into their own, regardless of where they came from and where they aspire to be. What are your personal experiences and thoughts? I'd like to hear non-politically-correct but respectful responses. Quite a few of the people I have spoken to said they believed the problem stems from the fact that the field is primarily women, and women = cattiness. I won't lie and say I completely disagree based on my own personal working experiences, but I still have to feel there is more to it. Or is the whole situation blown out of proportion and a non-factor in real life?
I have given a lot of thought to the RN-to-APN route versus going direct-entry to APN and had settled on the former, but now I am concerned that I'll get in as an RN (one day) and, if my leadership learns of my plans to further my career, I'll be sabotaged. I know that is the extreme, but I have read stories of it, have seen it happen to one of my good friends, and am not sure that is the kind of environment I really want to work in, as much as I would love to use my skills and compassion to help others.
Thoughts?