What do you do when you have a nursing degree but don't believe in Western medicine or our health care system?

I recently received my BSN, and through my schooling I came to the realization that I strongly disagree with our health care system and Western medicine. I don't want to work in a hospital; I'm more interested in holistic/alternative nursing, but I have no idea where to begin in that field. Has anyone else had these feelings? I would appreciate any advice, because I feel completely lost right now.

I'm still interested in helping people and being a nurse, but I don't feel that Western medicine is really that helpful. No one in my nursing school cared to try to understand my feelings and beliefs, so now that I'm done I'm trying to find others who do, and it's been hard. I don't want to be a typical nurse in a hospital, because I feel that giving medications and other Western medicine treatments would go against what I believe.

I know I'm a compassionate, caring person who is great with people. I just need to find the right place for me, and I don't know where to begin!