The opinions expressed in this article are the writer’s own and do not reflect the views of The University of Scranton.
I know it’s completely unpopular, but I don’t use ChatGPT to help me with my writing assignments or in my day-to-day life. This aversion comes not only from the obvious code of conduct rules it breaks at my university, but also from the way it de-skills me, stripping away my own creativity.
I took a very interesting course in the fall called “Science and Society,” which studied the way scientific advancements progressively changed and shaped society. One particularly important topic we touched upon was that as people lean on machines and technology, we begin to lose our personal and societal skillsets. When the sewing machine was invented in the mid-1800s, suddenly many Americans no longer needed to know how to sew or knit. While people still sew and knit today, it is not the household skill it was in the past, and if you gave me a bunch of fabric, I couldn’t sew a shirt.
I think that if people, particularly young students, continue to lean on ChatGPT, we will likewise be de-skilled in writing. If we as students need ChatGPT to write our essays for us, perhaps we have been failed by our teachers and professors, but I think it more often demonstrates a laziness that has become a byproduct of my generation’s need for instant gratification.
Of course, I understand why it could be appealing to just “push through” your classes by using ChatGPT or other AI tools. After all, many STEM and business majors would argue that writing is not essential to their future education. However, STEM leans heavily on research reports and presentations of data; business requires professionalism in emails and presentations. Without an ability to effectively and coherently communicate our ideas, even in these environments, we will look unprofessional. I understand that we live in a society where instant gratification and ease are taken for granted. Students, who are often overwhelmed, don’t want to think or write. And why would we, when we can have everything handed to us?
The problem with simply “using our resources” to write an entire essay is that we are de-skilling ourselves, avoiding the difficult work of learning how to write by writing. Some will counter that people could just use ChatGPT to write all of this, but is that really where society has gone? That we need a robot to think like us and write for us? For the rest of our lives? It’s like having your mother make your bed every morning, even when you’re 40. At some point, it’s just embarrassing.
For English majors like me, however, education is being directly harmed by ChatGPT. Because so many students cheat on quizzes or use ChatGPT to write their essays, and because it is becoming increasingly difficult to tell whether a piece was written by a human or by AI (fun fact: if you give AI a sample of your writing, it can mimic your style), teachers and professors alike are forcing students to write their essays within a single class period, a model that is not only stressful but harmful to our education. Essays take time; even in college, word-vomiting onto a piece of paper in an hour is not conducive to learning, because there is barely enough time to read the essay back over and edit it. If I can’t learn how to fix my own writing, I will never improve my writing abilities.
When students use AI to cheat, students like me—who work through the process of writing, or find answers by combing through our textbooks like we’re supposed to—are unfairly penalized. My grade will inevitably be lower than that of someone with a supercomputer doing the work. I am learning, but I will always score below those who use ChatGPT.
This problem goes beyond traditional essays and quizzes whose answers can simply be copied and pasted; it also applies to creative writing. I have several professors who encourage us to use ChatGPT as a resource for story or character ideas. This is just as troubling, because not only are we not learning how to write, but the machine is also robbing us of our imagination and originality.
These frivolous uses of ChatGPT aren’t just harming us as students and as a society; they also harm the environment. Generative AI models like ChatGPT place significant strain on the electrical grid and on natural resources, requiring large amounts of electricity and water to function. The data centers needed to house AI servers are temperature-controlled so that the large computing infrastructure—servers, data storage drives, and network equipment—can run smoothly. While such data centers have existed since the rise of computers, AI has dramatically increased both the need for data centers and the electricity required to run them and to train generative AI models.
According to Noman Bashir, Computing and Climate Impact Fellow at the MIT Climate and Sustainability Consortium (MCSC) and a postdoc in the Computer Science and Artificial Intelligence Laboratory (CSAIL): “What is different about generative AI is the power density it requires. Fundamentally, it is just computing, but a generative AI training cluster might consume seven or eight times more energy than a typical computing workload” (Zewe).
Scientists have estimated that the power requirements of data centers in North America increased from 2,688 megawatts at the end of 2022 to 5,341 megawatts at the end of 2023, thanks to generative AI (Zewe). Globally, the electricity consumption of data centers rose to 460 terawatt-hours in 2022 (Zewe). This would have made data centers the 11th-largest electricity consumer in the world, between the nations of Saudi Arabia and France, according to the Organization for Economic Co-operation and Development (Zewe).
By 2026, the electricity consumption of data centers is expected to approach 1,050 terawatt-hours, which would bump data centers up to fifth place on the global list, between Japan and Russia (Zewe).
While not all data center computation involves generative AI, the technology has been a major driver of increasing energy demands. “The demand for new data centers cannot be met in a sustainable way. The pace at which companies are building new data centers means the bulk of the electricity to power them must come from fossil fuel-based power plants,” says Bashir (Zewe).
In a 2021 research paper, scientists from Google and the University of California at Berkeley estimated that training GPT-3 alone consumed 1,287 megawatt-hours of electricity, enough to power about 120 average U.S. homes for a year, and generated about 552 tons of carbon dioxide (Zewe).
To prevent the servers from overheating, these data centers use cooling systems that extract heat from the supercomputers, and not with minimal amounts of water but with massive quantities of it.
Data centers use water to carry heat away from hot server components, consuming nearly fifty centiliters of water—roughly a single plastic bottle—for every 20–50 queries on ChatGPT (Gargar). With the rise of generative AI tools like ChatGPT, that water demand has only grown.
ChatGPT even beats Google in the water waste category, consuming an estimated 25 times more water than the ever-famous search engine (Gargar). With all these data centers operating at full capacity to keep pace with the rapid development of artificial intelligence, the strain on our resources is severe.
Adding insult to injury, ChatGPT is only able to function because it takes data off the internet from people like you and me. AI pulls data from Wikipedia, Reddit, popular art, and even research databases, hidden under the guise of “training data,” “unsupervised learning,” and “data exhaust.” The model depends on our words and interactions to feed its artificial intelligence.
As such, we forget that without our content, ChatGPT would not exist, and that by using and further training it, we only help the monster of AI grow more expansive. As a writer, that isn’t something I particularly want to think about, as I believe creations in the arts—movie scripts, books, etc.—should be made by humans. However, publishers and producers alike could cut out the human element entirely, creating films and books written by code instead of people. In fact, when I watched the AI-generated Coca-Cola ad, I was disheartened to see that the company had prioritized cutting costs over real creativity and a good product.
However, AI can’t poorly replicate our writing on its own; it does so by ingesting countless authors’ work. The unauthorized use of creative works to train AI models has become a significant concern among writers and artists. In a notable instance, journalist Alex Reisner revealed that over 139,000 film and television scripts were used without consent to train AI systems developed by major tech companies, including Apple, Meta, and Anthropic (DeepNewz). The dataset encompassed scripts from acclaimed shows such as The Simpsons, Breaking Bad, and The Sopranos, raising alarms about the infringement of intellectual property rights (DeepNewz). And while in this instance the AI systems were trained directly on these works without permission, the models can also scrape anything published online and steal our work that way. The unauthorized use of creative work to train these models underscores a broader issue: not only do the proprietors of these systems take data without permission, but the models themselves have far too much access to the internet, able to pull in nearly anything ever written on it.
In conclusion, ChatGPT and other AI tools may seem beneficial in the short term, but the ease they provide comes at a steep cost: to our learning, to our environment, and to our future as thinkers and creators.
Works Cited
DeepNewz. “Investigation Reveals AI Trained on 139,000 Movies and TV Shows without Writer Consent, Including Apple and Anthropic.” DeepNewz, 20 Nov. 2024, deepnewz.com/ai-modeling/investigation-reveals-ai-trained-on-139000-movies-tv-shows-writer-consent-apple-7eb34849.
Dorfman, JJ. “Gravity Falls Creator Responds to Discovering His Scripts Were Used in AI Training—Using a Beloved Character’s Voice.” CBR, 26 Nov. 2024, www.cbr.com/gravity-falls-creator-ai-training/.
Gargar, Lois. “ChatGPT Is Bad for the Environment: Here’s Exactly Why.” The Teen Magazine, 2022, www.theteenmagazine.com/chatgpt-is-bad-for-the-environment-here-s-exactly-why.
Zewe, Adam. “Explained: Generative AI’s Environmental Impact.” MIT News,
Massachusetts Institute of Technology, 17 Jan. 2025, news.mit.edu/2025/explained-generative-ai-environmental-impact-0117.