The opinions expressed in this article are the writer’s own and do not reflect the views of Her Campus.
This article is written by a student writer from the Her Campus at Manhattan chapter.

By Samantha Keating

Sex. A word that can inspire discomfort in many people. It is something we are trained to keep highly private and not discuss; unless you are talking to your partner or your doctor, you are typically raised not to speak openly on the topic. But why? If we tell our friends everything else that is good in our lives, why do we hide from sex? Why is a natural life process framed as shameful and secret?

Recently, I have been reflecting on the stigma around sex and how it harms women. Society pushes women to take a stance on sex; whether it is pro or con, there is pressure to label themselves and have an opinion. Oftentimes women are pushed into the realm of sexual awareness at an extremely young age. Whether it comes from hearing an inappropriate comment or being the victim of catcalling or harassment, women are forced to bear witness to the world of sex whether they want to or not. Yet somewhere along the way, we lose sight of the fact that becoming aware of sex is hardly ever a choice. If we are aware of sex, then why are we taught to stay silent and not discuss it? Why are young girls taught that sex is a dirty word that should be whispered in confidence?

When I brought the topic up with close friends, we came up with a multitude of possible reasons. One is that girls feel uncomfortable talking about sex because of a lack of experience and do not want to be made to feel inadequate. I can’t help but place blame on the sexual education system for this. It is no secret that sex education is meant to terrorize and instill fear, not actually educate. I can’t help but wonder how many more women would feel comfortable discussing sex and asking the questions they want to ask if sex were normalized early on. Other women felt that talking about sex would cause others to judge them and think less of them. The irony of how differently men are treated is almost comical. Men are often taught that the more overtly sexual they are, the more respect they will earn. How can we as a society blindly ignore the disconnect between teaching boys that having sex is the way to earn respect and instilling fear in girls that if they have sex they will be judged and lose the respect of others? It is a tired, patriarchal ideal that harms women and girls. We should be teaching women that their body is theirs and that whatever they choose to do with it is completely fine as long as they are comfortable and consenting.

We should be teaching sex education correctly: reassuring young adults that as long as they practice safe sex and consent, they are not doing anything wrong. If sex were talked about more, it is possible fewer young people would end up with STDs or unwanted pregnancies. If sex were talked about more, it would empower young people to feel confident and comfortable in their choices. Sex is not shameful, and we need to stop teaching young girls otherwise. This is the age of empowerment and ownership. Women can do anything they want, and if that includes freely asking the questions about their bodies that they should always have been entitled to answers to, they deserve that. If it includes guilt-free girl talk, they deserve that as well. The first step in destigmatizing sex is recognizing that it exists, that it is not going anywhere, and that how you feel about it is completely valid as long as you’re respectful of the feelings of others.

Samantha Keating

Manhattan '24

Junior at Manhattan College studying English and 5-year elementary and special education!