I've read on several threads now that nursing is not respected as a profession. I find this mystifying, as in my experience nursing and nurses ARE respected. So maybe what I'm not getting is what "respect" really means to nurses.
So what do you think it is? How do you think nursing will achieve it? And why do you think that particular way?
(I'm thinking of the multiple times I've seen it posted that a BSN as the entry level will command respect for the profession. Is that really true? Why? And what about those professions that already require a bachelor's degree and more, and still aren't respected by the public? Lawyers, for one...)
I'd love to hear what you think!
Melissa
(edited to make thread title actually resemble the post...LOL)