I was out at a restaurant with some friends tonight, and I'm not sure whether the man at the next table overheard me mention nursing or it was just a coincidence, but I overheard this comment...
"I think nurses are the most arrogant group of people..."
It seemed like he was having a discussion with his daughter about her future career plans, maybe? I don't know, but that's the first time I've heard anything like that, and it angered me! Any similar experiences?