The last year was crucial for K-12 leaders seeking national and state resources to guide their schools and districts in integrating artificial intelligence into the classroom.
But just as schools received more AI support and tips from both the federal government and national organizations, warning signals about the repercussions of misusing the technology also emerged. From burgeoning lawsuits to a proliferation of deepfakes in schools, these challenges may slow schools down in their efforts to roll out AI tools, according to some K-12 technology experts.
Making the future even hazier, experts say they are uncertain how federal support for AI in K-12 will fare during President Donald Trump’s second term.
As schools navigate the complexities around using AI in classrooms, here are four trends to look out for in 2025.
District and state AI guidance will rise
While it’s still unclear what direction federal AI policies for education will take under the second Trump administration, it’s likely that schools will have to rely more on national organizations and nonprofits for high-level guidance in 2025 — and for several years to follow, said Kris Hagel, chief information officer at Peninsula School District in Gig Harbor, Washington.
Over the past two years, the U.S. Department of Education’s Office of Educational Technology developed useful AI resources and guidance, said Pat Yongpradit, chief academic officer of Code.org and lead for TeachAI. But Yongpradit said he doesn’t expect similar federal assistance in the near future.
The Education Department’s AI resources “really set the tone for state education agencies,” Yongpradit said, adding that “regardless of what happens with the Ed Department, state education agencies are going to take it from there.”
With more states expected to continue rolling out their own AI resources for schools, Yongpradit said he foresees more school districts following suit with their own policies.
As of November, 24 states had released guidance for AI in education, according to TeachAI, a national coalition that aims to guide schools on safe and ethical AI use. Code.org is a nonprofit that provides computer science curriculum and programs to schools.
More AI tools will be tailored to special education and English learners
With special education teachers increasingly expressing interest in AI tools, Yongpradit said he’s hopeful “more tailored experiences” will be on the horizon for this sector.
At Peninsula School District, Hagel said, leaders are exploring how to securely analyze students’ Individualized Education Program data through the district’s own AI enterprise system. The goal is to ultimately use AI to help improve IEPs by comparing students’ testing data to their IEP goals, he said.
There are ways to do that “safely and securely,” Hagel said. “I think people haven’t wrapped their heads around the underlying technology to understand.”
Still, Hagel strongly advised against using free, publicly available AI tools like ChatGPT for special education needs. However, districts could explore special education solutions with AI enterprise systems where “you have environments built-out or safe, where you know that the large language model is not saving that data, you know it's not taking it anywhere, and nobody else is storing it,” Hagel said.
Robin Lake, director of the Center on Reinventing Public Education at Arizona State University, said she expects to see more AI tools quickly rolling out this year to support not only special education students, but also multilingual learners.
She agreed that AI-powered tools will likely support IEPs in the future. And for English learners, Lake expects real-time translation tools to be more integrated in classrooms.
Teachers’ reliance on AI detectors will continue to grow
Teachers’ use of AI detection tools has grown in recent years as tech companies tout their software’s abilities to spot text generated or paraphrased by AI.
Yongpradit said he expects to see more teachers opt for AI detection tools in 2025. At the same time, he said, more public pushback is likely against using this software to address cheating and plagiarism.
In fact, Yongpradit said, he often dissuades teachers from using AI detectors. “Even if these tools were perfect — no false positives, no bias,” the detectors are designed for particular generative AI models, which often change and ultimately make the detectors less effective, he said.
“The better thing to do is to figure out why you're teaching what you're teaching, why the kids would be cheating in the first place,” Yongpradit said. “Is what you’re doing just basically in need of a change itself?”
Lake, however, said more teachers will likely go beyond detection tools. Teachers may seek live feedback from an AI coach listening in on their instruction, or more teachers might start using AI for targeted professional development.
Personalized instruction tools, such as AI tutors, also could see growing popularity among educators this year, Lake said.
Some districts will still struggle to integrate AI
Despite the growing visibility of AI in K-12, a sizable number of school districts have yet to start implementing the technology or continue to block its use altogether, Hagel said. Yet he said he is hopeful that most school districts will get on board this year with using AI to some degree.
Yongpradit, meanwhile, said he expects “huge swaths of the education community” still won’t do much with AI. That’s “simply because they have bigger fish to fry, and frankly so,” Yongpradit said.
Lake has heard from several school districts — both rural and urban — that say they don’t have the capacity, money and time to seriously invest in AI — even though the interest is there. But that’s where federal and state officials should step in to provide support and guidance, she said.
Nonetheless, she said, not many states are offering funding to schools looking to innovate with AI tools.
Likewise, some districts continue to struggle with AI implementation because “there’s a lack of understanding fundamentally on how AI works” and they’re fearful of it, Hagel said. That challenge illustrates a need to rethink how to explain AI to school leaders.
“Something’s going to have to happen to get people to understand the underlying technology behind AI so that they can feel more comfortable with moving forward with it,” Hagel said.
Challenges including lawsuits over a school’s plagiarism policies or concerns with student data privacy protections can have a “chilling effect” on districts wanting to move forward with the technology, Lake said. While schools shouldn’t take unnecessary risks involving AI, she said, they should feel comfortable experimenting with these tools in controlled, evidence-based settings to find solutions for students and teachers.