Hello everyone. I recently had a situation at work that left me feeling so angry, and I don't know if this has ever happened to any of you. At the hospital where I was working, people started complaining about me, not about patient care but about little things. What really bothers me is that not a single patient complained about me; it was the nurses. I'm usually very reserved, I don't get involved in the gossip, and I keep to myself. I got called in and was told that I need to pretend and smile more, because it looks like I don't care about patients.

For this reason, among other little things they won't say to my face but only behind my back, I can't work there anymore. What is nursing all about these days? What has happened to us? Why do we eat our young? Why do we have to be fake and pretend instead of being ourselves, so that patients get to know us just the way we are? Why is everything about money? Just need some advice. Thanks.