With the meteoric rise of artificial intelligence, now accessible through applications to anyone with an internet connection, comes an equally meteoric rise in concerns about the potential of these new technologies and the ethics of using a program such as ChatGPT in an educational setting.
Here at the University of Maine, this issue has sparked heated debate. Many professors, both progressive and traditionalist, have sought to understand how best to situate this new technology within an academic environment, though what that would look like remains highly subjective.
Within the discipline of academic and professional writing, the standards to which student work is held often consider not only the clarity of information but also a student's ability to identify and cite evidence in support of an argument.
It could be claimed that using AI not as a singular solution but as a tool can help students become better equipped to diagnose the ideal structure of argumentative support. On the other hand, crafting and structuring arguments is a skill students should develop on their own rather than outsource to a program.
A liberal education, as defined by the Association of American Colleges and Universities, is "…an approach to undergraduate education that promotes integration of learning across the curriculum and cocurriculum, and between academic and experiential learning, in order to develop specific learning outcomes that are essential for work, citizenship and life."
Artificial intelligence may become a prolific tool that our society integrates at a fundamental level. Even so, as students of the liberal arts and sciences, it is our responsibility to learn and develop ideas freely, to become competent in argumentation, and not to rely on a facsimile of the mind to generate ideas for us.
Another concern with using AI in a university setting is plagiarism and the identification of source material. When a claim is made in an argument, it needs to be supported by evidence; otherwise, it is just an opinion. ChatGPT, the primary generative AI tool used by students, provides limited sources and is often able to identify only an author or book that a piece of evidence comes from. If asked for specific citations such as page numbers, it will fabricate information and render the source unreliable. Unless specifically prompted, the program will not offer sources at all.
A student who misuses ChatGPT by prompting it to write a paper in full will receive a useless piece of writing, devoid of human thought and understanding. That paper may present information well, but without citations, it undermines the academic purpose of professional writing: the creation of original ideas using past research as evidence.
The University of Oxford defines plagiarism as "Presenting work or ideas from another source as your own, with or without consent of the original author, by incorporating it into your work without full acknowledgement. All published and unpublished material, whether in manuscript, printed or electronic form, is covered under this definition, as is the use of material generated wholly or in part through use of artificial intelligence." This definition captures the core concern about AI and plagiarism: generative AI tools draw on information in bulk from the internet without adequately compensating the authors of the original works from which they generate content.
Plagiarism and unoriginality are symptoms of a student who misuses public tools simply to take the easy way out. At UMaine, the impact seems minor compared to universities with greater enrollment: there are no significant student organizations dedicated to AI's use, and the University of Maine System has yet to include a stance on AI in its language about plagiarism in student work. Still, anyone can generate work with relative ease using these tools.
Steps need to be taken to ensure students are properly educated about AI systems so they can use them as tools and not crutches. Intelligent students will work to develop their skills alongside artificial intelligence. They will still generate their own ideas, practice writing and craft arguments, but they will do so alongside the new systems that will inevitably become a large part of our society, economy and education.