I have been nursing in Florida for 2 years now and I am sick of these Florida patients. The snowbird season. The geriatric patients. The entitled, rich snobs. But I am mostly sick of how healthcare here is all 'customer service' and Press Ganey scores.
Which states have you traveled to that had the most patient-centered care? The least politics? The least BS?
Thanks for the input.