Dive Brief
- The rise in the development and availability of artificial intelligence tools has brought growing concern about their role in the classroom and their impact on student cheating.
- Prohibiting the use of AI will likely be ineffective, said Jason Stephens, a professor of arts and education at the University of Auckland in New Zealand. Instead, he suggested teachers promote transparency and prepare students to use the tools responsibly.
- According to a recent study from the Pew Research Center, the share of teens who use ChatGPT for schoolwork doubled between 2023 and 2024, as students' awareness of the popular AI tool also increased.
Dive Insight
“AI has made it all the more easier to cheat, to cheat well, and to do so in a way that's very difficult to detect,” said Stephens, who is also vice president for research of the nonprofit International Center for Academic Integrity. “But the emergence of AI doesn't really change the challenges of moral or ethical functioning.”
When students engage in cheating, they're undermining not only their intellectual growth but also their social and moral growth, Stephens said. But while unethical, he said, academic dishonesty is also natural and normal — which is why it is an educator's role to create a “culture of integrity” to dissuade students from engaging in it.
Stephens said students are more likely to cheat if they see it as just against school rules. On the other hand, they are less likely to cheat if they understand the moral principles at play.
He suggested educators talk to students about the impact cheating has not just on them, but on others. Educators can explain how cheating is dishonest, gives students an unfair advantage, and misrepresents their knowledge.
In the same vein, Stephens recommended that educators also talk to students about the psychology of cheating and the tendency to rationalize such behavior, which helps students recognize these patterns in themselves and learn how to address them.
Encouraging students to use AI responsibly can mean teaching them how to generate effective prompts that will help them further their understanding — like asking the AI tool to provide a metaphor that explains a concept from class — instead of simply asking it for answers.
Educators can ensure academic integrity and honesty by having students disclose any use of AI tools in their schoolwork. Stephens suggested educators update their assignments to require screenshots of the AI prompts used and the responses.
When it comes to summative assessments, where teachers need to know that students have developed certain foundational skills, Stephens recommended conducting these tests in class and on paper. This also discourages students from relying on AI for their homework, since it won't help them pass those assessments later.
In the 2023-24 school year, the share of teachers using AI detection tools rose to 68%, according to the Center for Democracy & Technology, a civil rights nonprofit.
Stephens, however, advised against teachers using these AI detection tools to pre-screen all assignments. Rather, they should be treated as a last resort for when teachers are “sufficiently suspicious,” he said.
Stephens recommended educators screen all assignments themselves, relying on what they know about their students and their abilities. Some of the non-AI indicators Stephens uses include the absence of expected concepts, theories or references taught in class or, conversely, the presence of concepts or references not taught in class.