Someone I know mentioned that she had dropped out of nursing school because she didn't believe in Western medicine. I'm trying to get a handle on what this is all about, and I've just started reading the book Denialism by Michael Specter. Personally, I don't blindly put my faith in Western medicine, and I realize that many things are done without solid rationales or evidence, or for liability reasons, but there is certainly more scientific evidence for these practices than for alternative medicine. And let's face it, I'm not going to a chiro if I've got leukemia, and I don't think acupuncture is going to help with a severed arm. What's up with the haters? Does anyone have insight into this? Your opinions?