
What On Earth Is Character.ai?

Sìana Baker Student Contributor, King's College London
This article is written by a student writer from the Her Campus at KCL chapter and does not reflect the views of Her Campus.

You’ve heard of ChatGPT. You’ve heard of DeepSeek. Maybe you’re even a student who bagged a free year of Gemini Pro. These engines can answer your simplest and greatest questions—what’s seven times six? What’s the average height and weight of a Pomeranian in LA? How do I integrate logarithms? Can you write my essay for me? Each of these generative artificial intelligence (AI) models can serve you like an informational butler, if that’s what you want.

But what if you want something else?

In between Byler theories and Hit Me Hard and Soft tour videos (and that one guy who does Hugh Jackman impressions), my TikTok feed recently became littered with mentions of something called Character.ai. After scrolling through a few comment sections, I got the gist: it’s an app where you can ‘converse’ with customizable characters, all powered by generative AI.

Unlike ChatGPT, though, you’re not just talking to an indiscriminate robot; you’re talking to a fine-tuned imitation of somebody or something. I decided I needed to do some investigating, so I went to the App Store and clicked ‘download.’ (Let it be known that I’m aggressively anti-AI, and that this was strictly for research purposes!) After opening Character.ai and finding the search bar, I typed in a famous name. Hundreds, if not thousands, of simulations surfaced; I clicked on the topmost one. The bot ‘talks’ first, setting up the kind of roleplay scene you might find on Wattpad or AO3 (Archive of Our Own), except it’s all formatted to look like a text conversation—exposition in italics, speech in normal text.

Having grown up before the AI moment we’re currently in, I still find it very strange when a computer addresses me like a person. This ‘Billie’ wasn’t part of a narrative constructed by a teenager with English as their second language whose house burnt down twice a month (as was typical of the average Wattpad writer). This thing was talking to me, and it felt a little too real for my liking. I ‘texted back’ some general things—“hello,” “how are you,” et cetera—to see the reply. It was very generic and very obviously a bot; in my opinion, nowhere near as immersive as reading someone else’s prose. Closing the tab, I figured I was just a little too old to understand (after all, fandoms are built for teenagers, and I’m not one). For the next few weeks, the app lay dormant on my phone.

However, that didn’t stop me from thinking about the repercussions of an app like that. You hear a lot of stories now about people falling in love with AI chatbots, about people neglecting their real-life friendships or relationships to pursue a fake one. Earlier this year, I listened to The New York Times interview a woman who claimed ChatGPT was her boyfriend. These stories are presented as anomalies: absurd, rare manifestations. I’d like to believe that—that AI is only a threat to the wellbeing of a very small handful of people—but the very existence of an app like Character.ai had me questioning this notion. After all, it boasts over 20 million users worldwide, so I decided to get in touch with the culture.

On October 25th, I posted a TikTok asking people to share their opinions and experiences. Most people left brief comments: “hate it,” said one user. Another simply put “pretty cringe sometimes” with a thumbs-up emoji. Some people were more elaborate: “people [forget they’re] literally talking to an AI bot,” claimed one user, though they noted that the bots became less realistic as the app developed. Another user concurred, expressing that they “got frustrated” when the bots didn’t accurately reflect the personalities of the people or characters they were based on. Apart from that, though, nobody had much to say about it—the video didn’t get many views—so I planned to ditch this article.

Until I found a message request.

“Hiii,” they wrote, “I wanted to tell you about a friend’s experience with [Character.ai] but I didn’t want to risk him seeing my comment.” Of course, my interest was piqued. Andy (they/them) continued: “we both started using Character.ai when it started getting big… around early 2023, we were both 16. … I started getting bored rather quickly … [but] he started texting bots instead of answering my messages, he would go AWOL for thirty-six hours.”

I thought back to myself at sixteen: I was pretty impulsive, apathetic towards consequences; I was naively passionate and, honestly, felt quite lonely.

“Now, two and a half years later,” Andy confesses, “I think he still uses it to fill some sort of void he feels… the friends I have who used it felt pretty alone when they did.”

If AI had been around when I was sixteen, I think I would’ve fallen into it like Andy’s friend did. People argue that ‘what’s on the other end is not human,’ but I think that might be its strength. To be ostracised and isolated is to be vulnerable to these kinds of technologies, not only because they reciprocate communication but because they are robots—devoid of judgement—allowing you to be yourself in a way other (real) people have often berated you for. As an isolated, disconnected mid-teen, I would’ve loved it, which leads me to ask: what effect is this having on young people now?

“I think many people my age use it if they are very invested in fandoms,” Andy recalls.

The concept of being able to emotionally interact with a universe that was once distinctly separate from our real world is quite scary to me. Teens anything like me relate to characters who aren’t real because real people don’t relate to us, but I can’t imagine that the blurring of this boundary is much of a good thing. For me, being unable to distinguish between fantasy and reality resulted in a prescription for antipsychotics, so I wonder what might happen to young people who are interacting with artificial intelligence tailored in this way. I wonder if it’s cruel—if it’s enabling a bad habit, a maladaptive daydream or an unconventional fantasy.

Even though traditional forms of fan media—fan art, fanfiction and fan edits—are very much alive, times are changing. The ability to interact with fake personas in a tactile, affectionate way indistinguishable from talking to real people is bound to have side effects—but while the rest of the world wants to point and laugh at those who fall for such schemes, I can only imagine what my life would have been like if that tech had been available to teenage me.

Sìana Baker is one of four resident writers for the life section of Her Campus KCL.
Currently, she is reading for a BA in English (third year). Her areas of focus include modernist literature, poetry and interdisciplinary theory. She is intrigued by the avant-garde, the experimental and the innovative; the first work she fell in love with as an adult was Samuel Beckett’s Waiting For Godot, which she owns in second edition.
Aside from her academic commitments, Sìana has always enjoyed creative writing, most notably poetry—her diaries have been filled to the brim with rhyming couplets since she was six years old. Her most recent pieces can be found on her dedicated Instagram page, @shaztheticism.
Overall, she hopes to bring her academic work and her passion for creative writing together in her contributions to Her Campus KCL.