Rex, that was a very interesting and profound post. You helped make me think of a few things...
After making multiple visits to this site, I fear that the bottom line is that nursing will NEVER be a profession. In response to your question, Rex, "why can't we raise leaders within nursing to, um, lead us to the promised land" — the answer is simple: when someone rises to a position where they could lead, they are instantly stabbed by their peers. "They don't understand us anymore... they forgot what it was like to be a REAL NURSE" are the common laments of the masses.
I am sure others will say I am sexist, but oh well — I have never seen a female-dominated profession that enjoyed high levels of autonomy or respect. Nurses tend to be very malicious to each other, very 'catty,' and very back-biting. Christ, at least 10 different conversations on this board drew lines in the sand over entry-level education into nursing.
Case in point: I went into the nursing home to see some patients yesterday and sat on one unit for about 2 hours. In those 2 hours, do you want to take a guess at how much B.S. I heard? "If *NAME* thinks I am going to bust my a** while she sits down in that room kissing Mr. *NAME*'s family's a**, she's crazy." Then I heard, "*NAME* hasn't done this or that yet, and it's almost 11:00!" Followed by, "*NAME* is such an idiot... do you want to hear what she said?" BUT THEN, when *NAME* showed up, they were sooooo sweet to her, and so nice to her face. "Backstabbing wenches" was all I could think.
After 2 hours of that, I was in need of Compazine. I couldn't wait to get away from that place.
My thoughts on the matter — no, nursing will never receive the respect and admiration it deserves. Why? SOME nurses [mind you, some, not all] should look in the mirror, because that's who is to blame.
'Tis with our judgments as our watches, none go just alike, yet each believes his own.