A nurse who is retiring at the end of this year cornered me. She's worked in our ER for most of her career. She became close to tearful, sounding bitter about how much the department has changed: "I don't have ties to any of these people anymore!" Then she proceeded to don her rose-colored glasses about yesteryear, when the whole department had picnics and parties, went on rafting trips, and so forth.
Now, between you and me, I heard that the department was a snake pit back in the day, with plenty of cruel and difficult nurses who put newbies through the wringer. But that's all a matter of perspective, I'm sure.
But it got me thinking about the importance (or not) of social ties at work. Personally, yes, I need to feel a sense of belonging and teamwork, but I in no way want to socialize with the whole gang outside of work. I've certainly made some wonderful friends at work, but those friendships haven't depended on staying in the same unit or workplace. For me, true friends are few and far between.
What do you think?