Thought about posting this on Facebook but didn't think it would be well received, so I figured maybe my fellow nurses could relate. Yesterday I entered a patient's room with high hopes of reviewing the care plan, working on pain and nausea control, and going over the surgeon's post-op instructions, but instead I was met with one complaint and snarky remark after another, without so much as being able to get a word in.
Eventually she apologized, and I tried to be understanding because I know nobody wants to have an illness and be in the hospital, but one thing she said stuck with me: "I thought I came here so you guys could help me." That was before I had even gotten an assessment in, by the way. There is just this notion in America that anytime someone experiences pain, is diagnosed with a disease, or has any type of discomfort, the doctors and nurses will fix it. We are a death- and disease-denying culture. I wish healthcare weren't advertised as a cure-all but as a chance at a better quality of life. It just left me feeling frustrated...