Sex is completely natural. It's how we're all born (well... minus the test tube babies, but even that still involves reproductive cells). It's something that needs to be taught properly rather than just picked up on the streets, and I don't know why it's so taboo. What needs to be taught is safe sex, how to use a condom (to reduce the chances of STI transmission and pregnancy), and that it's okay to have sex.
It seems most people lose their virginity between the ages of 16 and 19: http://en.wikipedia.org/wiki/Adolescent_sexuality_in_the_United_States (yeah, it's wiki, but I'm not going to sift through hundreds of sites). Sometimes it's urges, sometimes it's peer pressure; regardless, it happens. Despite how much knowledge is readily available, many people still don't know how to use a condom, feel awkward buying them, etc. With proper education, at least the youth would be practicing safe sex.