I'm a first-year nursing student in an RN program, and I'm writing a paper on the nurse-doctor relationship. I would love some opinions from other nurses. Is it good or bad? Is it improving? What steps do we need to take to become true COLLEAGUES with doctors? Etc.
I have plenty of information on the historical trends, but I really want real people's opinions. Thanks so much.