Coming into nursing from a different field, I'm curious whether the name of the college/university you attended has ever mattered after graduation.
I've seen how, in law for example, the school you attend has a strong bearing on your employment opportunities later on. While I'm sure opportunities in nursing aren't as tightly pegged to alma mater, I do wonder if there is any stigma or benefit attached to certain schools.
Has anyone found this to be the case after hitting the "real world"?