Hello to all! I recently read a blog questioning whether or not nurses are professionals. There is not an easier question in the world to answer: YES! All nurses are governed by a set of rules that guide our practice, we all go through very specific and extensive education and training, and we perform in a professional manner both inside and outside of work. Nursing is not a mere job; it is a career. BIG DIFFERENCE. If a nurse is not taken seriously or viewed as a professional by his or her peers or patients, it is most likely due to a poor attitude or appearance. To me, just the title of nurse commands respect. I definitely never realized what a great responsibility nurses had until I started practicing. As for calling us "doctor's assistants," that is just absurd. Yes, we do assist the doctors in taking care of patients, but we also make independent nursing judgments and interventions every day without a doctor's order. In closing, I would like to end with a truthful quote I just love: "People are always blaming their circumstances for what they are. I don't believe in circumstances. The people who get on in this world are the people who get up and look for the circumstances they want, and if they can't find them, make them." -George Bernard Shaw