I recently had a discussion with someone who views nursing as "just a job". But I feel it's more than that. They are using the fact that I am a new nurse against me, saying that eventually I will come to see it as "just a job" too. I never want to have that mentality, though. I know you get caregiver strain here and there and it's hard work. But I could never picture saving lives as "just a job". Thoughts?