Press "Enter" to skip to content

AI in the classroom and how it may change education and jobs

You may or may not be familiar with the term artificial intelligence, or AI. Even if you are not, you may know the wildly popular AI system ChatGPT. Launched in November 2022, ChatGPT lets users compose emails and essays that read as if a real person wrote them. That is a big reason the program caught on like wildfire among college students: it will complete essays and assignments for you.

ChatGPT seems to be the first wave of artificial intelligence rebuilding the foundation of academics, because much of the work students normally do themselves can now be assisted or completed by AI. Growing awareness of the program among students has started to instill panic in some instructors, while others have begun embracing where the trend is headed. ChatGPT and other AI programs have put academia at a crossroads: instructors can try to ban their use altogether, or they can choose to incorporate AI into their curricula.

As artificial intelligence has become a hotter topic, especially in education, people are discussing and collaborating on it more. As its popularity grows, they want to understand where it is headed and what it means not only for the future of education but also for the future of the workforce, particularly entry-level jobs. Systems like ChatGPT have drawn so much attention because we have to acknowledge that they may perform certain jobs and tasks better than we ever could.

In specialized areas such as computer programming and coding, an artificial intelligence program may make fewer errors and finish a project hours before a human would. That is a frightening prospect to face, but also one we are still unsure of. Oded Netzer, a business school professor at Columbia, believes artificial intelligence will help coders rather than replace them. In that view, AI could be a supportive addition, filling in the gaps so that workers can do their jobs more accurately and more quickly.

The University of Maine has recently been holding a series of artificial intelligence events where anyone interested can become more informed on the budding topic of artificial intelligence in schools and workplaces. On April 6, there was a presentation called “The Future of Work: How Generative AI May Change Professions and Tasks Ranging from Law, Journalism, Software Programmer, Videographer, Medicine, and More.” Led by presenters Peter Schilling and Jon Ippolito, the hour-long session covered the future of AI in classrooms and jobs. In attendance were professors from a wide variety of UMaine departments, including forestry, chemical engineering, journalism and computer science. Each department had a different experience and viewpoint on how artificial intelligence may change its curriculum and how it may affect students entering the workforce.

Not all opinions were negative; some professors saw programs like ChatGPT as a way to supplement learning if used properly. Others, such as those in computer science, noted that it could become harder to get an entry-level job in the field should artificial intelligence start doing the base-level work.

Another interesting perspective was that fields like English and journalism may not be as affected by artificial intelligence, because jobs in those fields require a human personality and human sense. That is AI’s Achilles’ heel: it lacks the true sense that only a human can have. AI can also be a risk rather than an asset in healthcare if a program is not built to be completely unbiased. For example, a dermatology system used to identify malignant skin lesions began misclassifying lesions based on the presence of a ruler. Medical images of cancerous lesions often include a ruler for scale, and the program had learned to treat the ruler itself as an indicator of cancer. It shows how fallible and dangerous such a system can be if it is not built and supervised properly.

 

