I was on the Dr. Phil site (I am a chronic Phil watcher) reading the replies to the show about sex education in schools. I was SHOCKED at how many people believe that sex education should not be taught in school at all, but instead left entirely up to the parents. From a nursing point of view, I found this disturbing. Sex ed is about more than just a moral decision; it also covers medical consequences such as STDs, pregnancy, etc.
What are your opinions?