FSU | Culture

When Algorithms Decide What Feels True

Fernanda Ieffet Student Contributor, Florida State University
This article is written by a student writer from the Her Campus at FSU chapter and does not reflect the views of Her Campus.

In a world of endless information, effortless scrolling, and carefully packaged media, it's often difficult to exercise discernment.

Let's be honest: even artificial intelligence (AI) is getting trickier to recognize. I can't be the only one who sometimes needs a second watch of a video before realizing what I'm looking at isn't real. At times, I even feel embarrassed by the slip.

Yet the worst of all evils isn't failing to do our homework or ignoring our English professor's advice on verifying sources. The real problem hides beneath a layer of our identities we pretend not to struggle with: our pride.

Pride determines which information we bother to check. Think about it: you spend hours researching what interests you to stay informed, but you do so through platforms that align with what you're already inclined to believe.

The reality is that people, especially the opinionated kind, share a particularly annoying and dangerous red flag: we never want to be wrong. Or maybe we just adore being right.

That's precisely what prompts you to turn off the TV when a commentator from the opposite political party starts to speak, or to tune in every morning to that podcast whose hosts you could swear take the words right out of your mouth.

The difficult part is admitting that many of us love being told what we already believe. We love it because it's what we want to hear, not because it challenges us. This tendency to absorb only information that confirms our prior beliefs is known as confirmation bias.

It's when people curate and retain information that supports their preexisting ideas while actively disregarding anything that contradicts them, and it happens because we badly want those ideas to be true. Some of us fail to recognize this in our daily lives, or have never heard the term before. Social media, on the other hand, knows it well, and that's why platforms are built to reinforce these patterns and keep you hooked.

You know that stubborn person whose attention span gets conveniently short the moment you present evidence that backs your side instead of theirs? Well, algorithms are feeding that character in you.

They want you to be that person! They'll keep facilitating confirmation bias because it's the easiest way to get you to engage: it's what makes you stop what you're doing to watch, what you'll like and comment on, what you'll repost.

But the price is too high to keep paying: you never get the three-dimensional outlook, just the angle that makes you feel smart. So force yourself to be uncomfortable. Follow the people whose opinions make you angry. That way, the algorithms can't let you sit comfortably on your own sheltered, yet incomplete, set of perspectives.

Want to see more HCFSU? Be sure to follow us on Instagram, Twitter, YouTube, and Pinterest!

Fernanda Ieffet is a junior majoring in Political Science at Florida State University, who recently transferred from the University of Connecticut. She was born and raised in Brazil and is excited to be a part of Her Campus.

She loves reading and spends her free time writing poetry and fiction—usually with a coffee right next to her and a book not too far away. She’s always drawn to stories, words, and ideas that make you think (or feel a little too much).