I just watched the documentary Food Matters, and it reminded me of some thoughts I've had about the medical system. I'm all for evidence-based medicine, but it doesn't happen that way very often.

I'm a new graduate nurse working in a large hospital, and I wonder: do any other nurses hate the hospital environment as a place for healing? Personally, I would never want to spend time in one except as a necessary evil.

Just one example... research shows us the incredible benefits of movement and the deadly effects of immobility, and yet, let's be honest, the whole hospital room/unit setup encourages patients to stay in bed most of the day. It's just easier on everyone. When will exercise be a prescription (and not just 15 minutes with Physical Therapy once a day)?

The diet offered is definitely deficient too. The food should be as healing as the medicine (if not more so). And there is very little stimulation, fresh air, connection with nature, or interaction with others.

Why have we set up this horrible system? I know it's a product of our entire culture, and that all our values are out of whack, but can anyone else relate? Not to generalize too much, but it seems to me that from birth to death, staying out of hospitals as much as possible is far more conducive to health and wellness. (No, I'm not talking about acute conditions that need specific treatments unavailable anywhere else; I'm just saying the whole environment needs deconstruction.)