KCL | Career

The Impact Of AI On Creative Industries—In Conversation With Dr Michael Collins 

Casey Chan Student Contributor, King's College London
This article is written by a student writer from the Her Campus at KCL chapter and does not reflect the views of Her Campus.

“We go to art because it is evidence of complex, dialectical, uncertain thinking.” 

Since its advent, Artificial Intelligence (AI) has been encroaching upon every industry—but especially our creative sectors. AI actress Tilly Norwood made her debut just last month; disputes over copyright infringement in AI-generated music continue to escalate; and countless photographers and artists report losing work to generative software. As Arts and Humanities students, it’s reasonable that we feel apprehensive about our career prospects (more so than we already were).

I therefore decided to interview Dr Michael Collins from the KCL English Department to ask what he thinks is the future for Arts and Humanities careers, and what insights he can give to current students on how to navigate the career world. 

Q. Before the advent of AI, there was already a sense that creative industries were undervalued compared to STEM fields. For example, during the Covid-19 pandemic, a government-backed advert suggesting a ballet dancer should retrain in cyber security received a lot of backlash. With the rapid development of generative AI, do you think this schism in how society values creativity versus technology has widened? 

Dr Collins: I think this is a great time for us to demonstrate how important our skills are to any developing industries making use of AI. AI assistants, and things like that, are making errors left, right, and centre, without fact-checking. There’s a general sense that AI-generated output is ‘slop’. Any job a person enters is going to have moments of human interaction underpinning it. The arts have always been good at training interpersonal communication. Shared resources of ideas, discussions, seminars—that’s fundamentally how Humanities students learn. That is going to become more important as a tool for verifying, exploring and growing society as AI becomes more dominant. Not only are the arts more useful than other courses, but arts education is also potentially more useful than ever before. There’s evidence from Silicon Valley that the arts are doing well in terms of new idea generation, and there is a job crisis in tech. Because AI ultimately seeks to annihilate coders—they will be the first wave to go. It’s a rich moment for us. 

Q. There is always the lingering question of whether we should “beat them or join them” — do you think there is room for compromise? Currently, YouTube is partnering with musicians to enable users to generate tracks using those artists’ vocals. In the visual arts, the use of generative AI is being compared to artists’ adoption of previous technologies like photography, and researchers at the Royal Northern College of Music are building AI tools and datasets to interrogate its role in creative practices. What do you think about these practices, ethically and realistically? 

Dr Collins: We need to get into the definition of what we mean when we talk about AI. A lot of the generative AI technologies being pushed by large corporations in Silicon Valley are actually seeking AGI—Artificial General Intelligence. Sam Altman’s group is trying to design an AI system that can be applied to any toolset or task imaginable—that is dangerous. That will lead to the disenfranchisement of human labour. Perhaps AI can operate under circumscribed rule sets. If we want to build an economy based on AI, we need regulation that limits AI to certain kinds of tasks. If we want to use AI to produce good stuff, we need rules. And it’s going to be people trained in the humanities who will have the space to define those rules, because they understand what makes good art and they are able to express it. Currently this drive to limitlessness is what will make AI fail as a tool. Limitless growth is dangerous to the core of AI tools. We need to get better, in the arts, at showing how good regulation makes good stuff. 

Q: Tech and education ministers from the UK and Finland have compared the advent of AI to the launch of calculators. They say we should not fear AI, but rather become better than it. Similar to how maths exams became more difficult so that critical skills and approaches were still required of students, how can arts and humanities students upskill themselves beyond AI?

Dr Collins: Firstly, generative AI is, by any definition, incredibly stupid. It has no comprehension whatsoever of what it is producing. No one goes to a work of art because it’s a brilliant execution of an algorithmic principle. We don’t enjoy something just because of the style it has pushed out into the world. We go to art because it is evidence of complex, dialectical, even uncertain thinking. Art is evidence of the intelligence of our brains trying to understand the world. AI has no capacity to do that. There will always be a space that will prioritise understanding over execution. I think our ‘upskilling’ is to lean further into the social qualities of arts education—make our capacities with oral argument better, make rational debate central. Let people learn, grow, understand and develop, rather than just accumulate a dataset. It is only in those areas that we can combat and present an intelligence better than AI. Besides, large language models are at, like, the level of a parrot. They are trying to make a parrot a god. 

Q: So, up to this point I’ve asked you a lot about how AI impacts the creative world. I am curious, though, whether you think it’s a mutually affecting relationship. I understand that science fiction, horror, technology and dystopias are your areas of interest. I have read my fair share of Ray Bradbury. One of his stories, The Veldt, depicts the Hadley family in their Happylife Home—a house that has almost a mind of its own. At first it seems able to remove all of the Hadleys’ inconveniences and anticipate all their needs, but it ultimately causes the demise of the parents. Granted, we’re nowhere near that stage of automation in everyday life just yet, but what do you think is the role of literature in responding to, or even combating, the erosion caused by AI? 

Dr Collins: One thing AI can’t do is prophesy its own doom. It assumes there will be resources it can continually cannibalise, and that it can grow its databases infinitely. These systems have no awareness of the actual limit of things. Literature, by contrast, is often engaged with stories of people encountering impediments, or limits, or having to reroute their thinking and requestion their form of life—because resources and possibilities are not limitless. Literature is very good at understanding how the human brain rethinks its position when it encounters problems and conceptualises ends and limit points—an AI can’t do that. It has no sense of an ending, just a limitlessness. We can’t live in a world that does not have a sense of an ending, because that would lead to the end of the world. You need to have a sense of the impediments to transcendence in order to have a human, meaningful literary output. 

Follow up: So, are you saying that, not only do stories with a conscious sense of ‘limits’ serve as a warning, but they are polemically opposed to AI at their core? 

Dr Collins: I think that’s right. 

Q: I want to end on a hopeful note. My friends and I were discussing possible futures with AI. The best hypothesis we came up with was AI, deepfakes, and technology becoming so seamless that we are no longer able to distinguish the real from the fake online—and everything becomes uninteresting. Society turns away from the internet and turns to books, live performances, and traditional paintings. What’s some encouragement you can give to current, and future, arts and humanities students? 

Dr Collins: I really like the idea that the ultimate end point is just really boring stuff. In that world you cannot question authenticity—but we need to have debates about authenticity and works of art. Is this an accurate depiction of the world? Does the person who makes this art have the right to make this statement? We need to have that—it’s the root of our politics. And when that’s abandoned because everything is partially artificial, we’ll be so bored by that world that we may seek more authentic experiences. There are already examples of that: the back-to-vinyl movement and the rejection of digital music, or the rise of poetry slams and experiential types of spaces. Another possibility is that the AI bubble will burst and there will be an economic collapse, which will force people to back away from it towards different kinds of skillsets. Already, it’s hugely overinvested and over-promising. We’ve got too much resistance in the world and in our economic structure to allow for the future of AI to be inevitable. 

Student at King's College London reading English with Film