
AI Is Writing the News — But Can You Tell?

Kalia Lindsey Student Contributor, Delaware State University
This article is written by a student writer from the Her Campus at DESU chapter and does not reflect the views of Her Campus.


Artificial intelligence isn’t just behind your TikTok “For You” page or Spotify playlists anymore; it’s now shaping the headlines you scroll past every morning. Newsrooms from The Washington Post to small campus papers are experimenting with AI to write, edit, and even design stories.

That raises a big question for our generation: Can we actually recognize and trust AI‑generated content compared to human writing?

Think of journalism like a timeline of tech upgrades: 

printing press → radio → TV → internet → AI. 

Each leap made information faster and more accessible, but also forced society to rethink ethics. AI is the most complicated leap yet.

· Pro: Speed, efficiency, and coverage of niche topics.

· Con: Bias, filter bubbles, and the risk of losing the human heart of storytelling.

How Newsrooms Are Using AI

Heliograf at The Washington Post: This robot reporter pumps out quick election and sports updates in seconds. It’s efficient, but critics say it misses the emotional depth that makes human reporting feel alive.

OpenAI in Editorial Strategy: AI predicts trending topics and tailors stories to your feed. Convenient, yes, but it risks trapping you in an echo chamber where you only see what algorithms think you want.

Small Newsrooms & Freelancers: Tools like Otter.ai and Trint help reporters transcribe and research faster. For student journalists, that’s a huge time‑saver, but without oversight, errors and bias can sneak in.

Andy, editor‑in‑chief for Delaware News, says AI tools help him move faster: “Otter.ai helps with notes, Grammarly speeds up editing, and Stacker automates data like gas prices. Readers may not notice or care as long as the information is correct.”

But he’s quick to add: “Hands-on experience matters more than streaming videos into AI and letting it create stories.”

Holmes, a retired journalist, is even more blunt: “Before AI, everything was factual because you went into the field and talked to people. If you didn’t get quotes, you didn’t get a byline. AI can’t replicate curiosity or emotional depth.”

Why Students Should Pay Attention

AI‑curated feeds feel personalized, but they can also limit what you see. One student I interviewed, Tayna Roberts, said: “It feels like I’m only hearing half the story.” That’s the danger of letting algorithms decide your worldview.

And then there’s the rise of deepfakes, synthetic images that blur reality. They can be used to deceive, but also to protect journalists in repressive regimes. Either way, they’re shaping the media landscape we’re inheriting.

Recent surveys reveal the tension at the heart of AI in journalism: 68% of journalists say AI boosts efficiency but risks depth, while 55% of PR professionals worry about reputational damage in AI‑driven media. In other words, the technology is undeniably fast, but the trust it depends on remains fragile, and that fragility could shape how audiences, especially students, consume and believe the news they see every day.

AI is here to stay, but journalism can’t lose its soul. Transparency, diversity, and human oversight are non‑negotiable. As Holmes reminds us: “Curiosity should never be automated.”

For college students, this isn’t just about the future of journalism; it’s about how we learn, share, and trust information in a digital world. Media awareness is a skill set we’ll carry into every career, whether you’re writing headlines, pitching campaigns, or just trying to make sense of your news feed.

How to Spot AI Writing

· Sentences may flow too perfectly but feel generic or repetitive.

· Language often sounds oddly emotionless, missing human nuance.

· Watch for vague phrasing and filler transitions like “however” or “in conclusion.”

· Notice the absence of specific details, quotes, or lived experiences.

· If it feels polished but lacks a genuine human voice, it’s likely AI‑generated.
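For the coders among us, the checklist above can even be turned into a toy script. This is just a sketch for learning purposes, not a real AI detector: the filler-phrase list, the word-length cutoff, and the variance threshold are all illustrative assumptions I picked, and a polished human writer could trip every one of them.

```python
import re
from collections import Counter

# Illustrative filler transitions from the checklist; not an exhaustive list.
FILLER = {"however", "in conclusion", "moreover", "furthermore", "overall"}

def ai_writing_signals(text: str) -> list[str]:
    """Return red flags from the 'How to Spot AI Writing' checklist."""
    flags = []
    lower = text.lower()

    # 1. Filler transitions ("however", "in conclusion", ...)
    found = [p for p in FILLER if p in lower]
    if found:
        flags.append(f"filler transitions: {', '.join(sorted(found))}")

    # 2. Repetitive wording: any longer word used three or more times
    words = re.findall(r"[a-z']+", lower)
    common = [w for w, n in Counter(words).items() if len(w) > 4 and n >= 3]
    if common:
        flags.append(f"repeated words: {', '.join(sorted(common))}")

    # 3. Sentences that all run the same length can feel "too perfect"
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    lengths = [len(s.split()) for s in sentences]
    if len(lengths) >= 3:
        mean = sum(lengths) / len(lengths)
        variance = sum((x - mean) ** 2 for x in lengths) / len(lengths)
        if variance < 4:
            flags.append("sentences are unusually uniform in length")

    # 4. No quotes and no concrete figures: missing lived experience
    if '"' not in text and not re.search(r"\d", text):
        flags.append("no quotes or specific figures")

    return flags
```

Running it on a paragraph stuffed with “however” and “in conclusion” returns several flags, while a sentence with a direct quote and a concrete number comes back clean. The point isn’t accuracy; it’s seeing how fuzzy and gameable these signals really are.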

Interactive Learning: Can You Spot the Difference?

To test this, I created an interactive quiz: “AI vs. Human: Who Wrote This?” Most people struggled to tell the difference. But when asked which they trusted more, the majority leaned toward human writing.

This experiment isn’t just fun — it’s a learning strategy. By engaging with AI content directly, students can sharpen their media literacy skills and become more critical consumers of information. Feel free to take the quiz: https://docs.google.com/forms/d/e/1FAIpQLSco1uM0MTidN32d-1nkMUPqm0MfH9ARRRC5ZHQ15Sh9wtHFGA/viewform

Sources 

· The Washington Post’s use of Heliograf (coverage via Journalists.org)

· OpenAI editorial strategy insights from recent industry analysis reports (2024–2025)

· Otter.ai and Trint transcription tools in small newsroom reporting (industry reports, 2024–2025) https://otter.ai

· Nicholas Diakopoulos, Automating the News (2019) — on bias and algorithmic influence

· Knight Center for Journalism — reporting on AI imagery, deepfakes, and misinformation

· LSJ.com.au – analysis of copyright and AI training issues – https://lsj.com.au

· Survey data from recent industry reports (2024–2025) on journalists and PR professionals

· Interviews:

· Andy, Editor‑in‑Chief for Delaware News on Bay Side

· Holmes, Retired Journalist

· Tayna Roberts, student perspective

My name is Kalia Lindsey, and I am a senior at Delaware State University and an aspiring writer on the editorial team. I enjoy writing and love sharing stories with others. Writing brings out my creative side; it is the art I share with the world.