I haven't started nursing school
yet. I see nurses explain diseases I've never heard of like it's nothing, and I was wondering: do you learn these diseases on the job or in nursing school? Do they teach you a certain number of diseases and how to treat them, and you learn the rest on your own? Also, as a nurse, when you receive a new patient, do you go and research the diagnosis on a website or in a book? Just trying to learn as much as I can about the field I plan to be in for the long haul.
Thanks a lot for your help!