Since the rise of ChatGPT and other generative AI tools, educators at all levels have struggled to address a new form of academic dishonesty: computer-generated writing passed off as original work. Nearly 2 ½ years after the launch of ChatGPT, the issue is more prominent than ever.
Just under a year after ChatGPT was created, The Pegasus explored the dangers of AI in an article titled "AI Bears Danger of Stealing Student Voices." Since that article's publication, Eureka has seen growing use of AI across all subjects.
To combat AI-generated assignments, some professors have started assigning more in-class writing, while others have added more bluebook exams to their courses.
However, there has also been a push to use generative AI as a positive tool in certain courses, with guidance from the professor.
Professor Carver, associate professor of special education, uses AI within her Educational Assessment course to assist students. Carver began instructing students to use AI ethically, and only when permitted, after the idea headlined the Illinois Education and Technology Conference.
“We are utilizing AI to help write assessments, build rubrics for those assessments, and for the purposes of generating ideas. We aren’t saying to write the assessment and use it just as AI has generated it, we’re saying to give AI a response and see if we like the response, and if not, adjust to get a different idea,” Carver said.
In education classes where students participate in an off-campus teaching practicum, Carver believes AI can free future educators to spend more time building relationships with students while assisting with idea generation.
“We started with Google Arts and Culture, which has different games on it all the time. One of the games is feeding it a prompt, and from the prompt it has to generate the same picture. We started with practicing our prompts and looking at the vocabulary we are using, making sure we are using the most efficient vocabulary,” Carver said. Through this in-class practice, students learn to create meaningful prompts and then edit the work AI provides.
Although some professors have found ways to incorporate AI into their courses, the fight against AI-generated submissions continues.
Outside of classes where AI is permitted by the professor, students may use AI for a variety of reasons. One student, who was granted anonymity, shared ways they use AI to assist with their coursework.
“I use it for outlining essays when I’m writing and to put thoughts into bullet points. The only other time I would use it is if I was confused by a prompt, I would put the prompt into AI to put it in a simpler form,” the anonymous student said.
Meanwhile, professors work to identify AI-generated writing in submitted assignments. Dr. Cunningham, associate professor of religion, has previously taught a course on the ethics of AI and discourages students from using AI in the classroom. “Most of the time, it’s written assignments and papers where I can see signs of a use of AI, especially in terms of a drastic shift of language or of structure,” Cunningham said.
“When ChatGPT-4 landed, or before there was kind of an explosion of apps or different versions of generative AI, it was relatively unknown or obscure to students. But once everybody started to catch on to what AI was or what the tool could do, I began to see it quite a lot, actually. Some of it has tapered off, actually, but I still see its presence,” Cunningham said.
Students who see their peers using AI in a negative way often feel strongly opposed to generative AI. Senior Hayden Skaggs is vocal about his dislike for AI and believes it has harmed the learning environment at Eureka. “With so many students, it is the way they think now. They don’t think, they let AI think for them, and it’s destructive to the learning capabilities of students,” Skaggs said.
Under Eureka policy, turning in AI-generated assignments is considered academic dishonesty. Penalties vary by course, with many including a failing grade on the assignment or in the course. Beyond assignments, AI has also been used to answer in-class questions and for research purposes.
Professors have begun altering coursework to discourage the use of AI, including expanding self-reflection within assignments. A common theme in the push against AI is the lack of genuine thinking found in AI-assisted work. “AI, at this point, doesn’t really simulate human thought, in the sense that it doesn’t have the ability to reflect in a personal or subjective way in what something means to itself,” Cunningham said. Without thinking and reflecting on their assignments, students miss a crucial part of their learning.
The expansion of AI also brings ethical implications and limitations.
The use of AI to complete coursework has been an issue everywhere, and it has begun affecting learning and employment. An article published by Macalester College explores the ways professors have had to revise their courses to prevent students from using AI, as well as how to strike a balance on where to use AI in teaching. Additionally, there is growing concern about the expansion of AI into the workforce, where it can simulate human activity in research, data processing, art, writing, and more.
Outside of academia, many find positive uses for AI, including web browsing, meal planning, online shopping, and much more. Many social media platforms, including Meta's apps, X, and Snapchat, have built AI into their software, offering conversational bots users can interact with. Still, even these positive uses carry negative consequences. The energy needed to run AI systems is expected to grow, along with the water used to cool them, adding to ever-growing climate concerns across the world.
As AI becomes part of everyday life, it is important to examine the ethical and social implications of its use within academia. Letting a chatbot think and reflect for you may be detrimental to your academic, social, and career success, and widespread misuse could shift culture away from fundamental aspects of human nature.