I've got a significant other going for her nursing degree in Wisconsin. We're looking at moving in together, but her mother has said she needs to finish school in Wisconsin because nurses from Wisconsin get hired more. When I asked why, I was told that nurses in Wisconsin are trained better and are therefore preferred over nurses from the South.
Is this just a prejudice spawned from the whole North-versus-South thing? I don't like calling people liars, but this seems a bit far-fetched, and I've only got word of mouth to go on, not actual experiences or written statements backing it up. I was hoping people could chime in and share their experiences/preferences.