It's amazing how much build-up sex gets on here and in our society. Is having sex the be-all and end-all? How much of the importance of sex in your life comes from external expectations? Do you feel bad if you aren't getting any because you're made to feel odd for not having sex or not wanting to? Why does it seem like every bit of popular culture and media is focused on it? Has something that's supposed to be natural become so sensationalized and blown out of proportion that it makes the other important things in your life seem smaller?