For many people, the idea of being able to search a vast database for any topic in the blink of an eye seems like it’s straight out of science fiction. But the reality is that it’s happening right now with artificial intelligence.
Large language models like OpenAI's ChatGPT have exploded in popularity since they were released to the public in 2022. These tools can be used in a multitude of ways; the problem is that, in academic settings, people are using AI instead of their own brains.
Students are using LLMs and generative AI to complete their coursework at an alarming rate. This is an issue that has been plaguing all types of education, from middle school to post-secondary.
For a while, UNC had strict policies in place to restrict AI usage for coursework, outlining unethical uses in the official student handbook for the 2024-2025 school year. However, this year some students are reporting that professors are encouraging them to use AI in the classroom, and are specifically tailoring projects and classes toward being AI-friendly.
I believe this change in UNC policy is a step backward, and that it will hinder students in the long run.
When you use AI to complete a homework assignment or a test, you aren't putting any real thought into the material, and the result is a worse product. The danger lies in what researchers are calling shallow processing.
When a student relies on AI to solve a problem or write an essay, they skip the crucial and oftentimes difficult steps of sourcing information, synthesizing complex ideas and structuring an original argument. These steps are what challenge students and lead to a better overall understanding of the source material.
A study from researchers at MIT claims that ChatGPT users had the lowest brain engagement and “consistently underperformed at neural, linguistic, and behavioral levels.” This fundamental lack of engagement ensures that the core skills needed for academic success are never properly developed.
It’s not difficult to distinguish AI writing from human writing. Professors can tell whether a student used AI, even without a plagiarism checker. Most AI-written assignments lack nuance, specific vocabulary or key terms, and the unique voice that comes from the student.
It’s not just limited to writing, either; the theater department has been using AI for months in many of its productions.
I spoke to Luke Avalos-Gonzales, a senior performance and visual arts student, about how the university's new policies are affecting their major. Turns out, generative AI has been used extensively within some of the mainstage performances, and Avalos-Gonzales is not happy about it.
“It’s quite shameful that we have an artist in charge of things that will not make art, and prefers to AI generate it over create,” said Avalos-Gonzales. “UNC is taking an AI-friendly stance and that is trickling down into the theater department, mainly through technical aspects.”
An example of the AI-generated art featured behind a human actor in “Measure for Measure” (courtesy of the UNC Bears Instagram account).
They told me about how the use of AI is being discussed by students in the major, and the lack of accountability behind its usage.
In a recent UNC production, “Measure for Measure,” an AI-generated projection was displayed behind an actor, which caused a large controversy within the major.
“At least within the student body I've talked to, how blatantly and flippantly AI generations were used as a part of the projection design, and it took away from the fantastic set work and sound design that was all done by humans,” Avalos-Gonzales said.
Using AI in place of human artists in theater productions sacrifices any shred of authenticity the department holds. The program is full of talented artists and designers who would jump at the opportunity to create graphics like these, yet the university ignores them.
“I think AI is going in a very bad direction where it is cheapening a lot of things that we do,” Avalos-Gonzales said. “In an era where everything is becoming faster and more amplified, there’s a power in teaching a student to slow down and how to focus on things that come from their own mind.”
Many students at UNC pride themselves on not needing AI for coursework, wearing it as a badge of honor at a time when roughly half of adults under 30 use AI on a weekly basis.
Avalos-Gonzales is one of those students, and they have placed a ban on AI in any theater productions they develop in the next year.
Gregory Egbert is a senior journalism major at the University of Northern Colorado.