Press "Enter" to skip to content

AI and the future of university studies

Artificial intelligence is a technology that is becoming more accessible for everyone to use, leading to its increased use in a number of fields and professions. This has led to positive reactions from some and negative reactions from others. Universities have found this debate complex as they struggle to find the line between accepting the widespread use of this new technology and ensuring students are still leaving college with the knowledge they need to be successful in the world. 

ChatGPT, created by the research lab OpenAI, is the newest in a long line of AI tools and has generated a flurry of news articles and opinions. One article says the first chatbot program was created in 1966; ChatGPT was released Nov. 30, 2022. 

“[The bot was designed to] answer follow-up questions, admit its mistakes, challenge incorrect premises, and reject inappropriate requests,” OpenAI said about ChatGPT. 

Even though this chatbot is an improvement on older chatbots, researchers and users have noted that it’s not entirely accurate, which is one of its major downsides. 

“[T]he software also fails in a manner similar to other AI chatbots, with the bot often confidently presenting false or invented information as fact. As some AI researchers explain it, this is because such chatbots are essentially ‘stochastic parrots’ — that is, their knowledge is derived only from statistical regularities in their training data, rather than any human-like understanding of the world as a complex and abstract system,” an article by The Verge said. 

As a student on a university campus, one of my biggest concerns is that students will use AI chatbots like ChatGPT to write their papers for them instead of doing the work themselves. A number of articles came out in October and November 2022 about how many students are already using AI tools to write their papers and essays. An article by Vice talked to students who use these tools for their college homework. 

“It would be simple assignments that included extended responses. For biology, we would learn about biotech and write five good and bad things about biotech. I would send a prompt to the AI like, ‘what are five good and bad things about biotech?’ and it would generate an answer that would get me an A,” Reddit-user “innovate_rye” told Vice. 

According to other students, using these types of tools has decreased their anxiety about schoolwork. They say they don’t use AI for assignments where they need to learn the information to pass the class, but for work they don’t find interesting or view as a waste of time. 

“I like to learn a lot [and] sometimes schoolwork that I have done before makes me procrastinate and not turn in the assignment. Being able to do it faster and more efficient seems like a skill to me,” the Reddit-user and student said.  

While this seems like a valid point, it only holds on the surface. A lot of ‘busywork’ likely helps ensure students fully understand the topics they are expected to know for their classes and beyond. Work that students see as pointless is not always so just because it bores them. It’s often said that you need to be exposed to a topic around a dozen times for it to be stored in your brain long-term. 

That being said, perhaps this is also a sign that the education system needs to change. If students are so overwhelmed with schoolwork that they are turning to AI tools just to pass their classes, maybe that shows the amount of work assigned to them is too much. 

Another concern is that plagiarism checkers, something professors have learned to rely on, won’t be able to tell the difference between work written by an AI tool and work written by a human. Part of this is the question of whether using AI tools to complete assignments is itself plagiarism, since the result isn’t in the writer’s own voice, even if nothing is copied directly from someone else’s work. 

A new tool created by Princeton senior Edward Tian is designed specifically to determine whether an essay was written using ChatGPT. He built it not because he is against students using the tool, but as a way to help them learn to use it responsibly. 

“AI is here to stay. AI-generated writing is going to just get better and better. I’m excited about this future, but we have to do it responsibly,” Tian said. 

Even with these tools available, not every student will use them to make their university lives easier. There’s a certain feeling of mental exhaustion after finishing a paper or exam that you can’t get by leaning on AI tools. Just because some students use these tools doesn’t mean every student does, and one bad apple shouldn’t spoil the whole bunch. I think this is ultimately how we should view this new technology, as with any major change: there will always be people who “misuse” it, but that doesn’t mean everyone will, and the assumption shouldn’t be that everyone will. 

