Dive Brief:
- To help schools and colleges ensure their use of artificial intelligence does not violate federal civil rights protections, the U.S. Department of Education’s Office for Civil Rights on Tuesday released its first-ever resource clarifying for school leaders how existing legal requirements apply to the technology.
- The OCR guidance outlines 21 examples of potential discrimination involving AI that could violate civil rights laws barring discrimination based on race, color or national origin, sex, or disability.
- OCR noted that though some scenarios could be considered discriminatory without AI, they may also “be compounded by its use.” Additionally, the resource said the examples provided are “illustrative” and “non-exhaustive” — and would not determine the outcome of any future OCR investigations.
Dive Insight:
This new OCR resource puts school districts and education agencies “on notice now,” said Kristin Woelfel, policy counsel for the Center for Democracy & Technology's Equity in Civic Technology team.
School leaders “can’t say they didn’t know that this was a thing, because we finally have a very clear statement that OCR does plan to view these uses” of AI “in light of civil rights laws,” Woelfel said.
The Center for Democracy & Technology, a nonprofit advocating for civil rights in technology policy, has flagged concerns with the rapid growth of the technology’s use in schools and what that means for students. The nonprofit found in an April report that student discipline tied to AI-related plagiarism suspicions jumped from 48% to 64% between the 2022-23 and 2023-24 school years.
In that same report, CDT warned that historically marginalized students, such as English learners and students with disabilities, are more susceptible to discipline stemming from AI plagiarism concerns.
In March, a coalition of 41 youth justice and civil rights organizations sent a letter calling on the Education Department to ban schools from using federal grants to fund police surveillance technologies, particularly those using AI. The organizations wrote that the use of such technologies in schools marks a “dangerous new chapter in the school-to-prison pipeline and mass criminalization of Black, brown, and Indigenous youth and other marginalized young people.”
Now with the new OCR resource on AI use, school leaders have a comprehensive introduction to “very real concerns” about AI discrimination, Woelfel said.
Woelfel said she hopes this new resource will encourage school leaders to consider looping in their civil rights officer or a school board attorney when thinking about implementing AI technologies.
Woelfel also recommended that school leaders audit their district’s current anti-discrimination policies for students and staff. When procuring technology, districts should ask vendors whether their products can adhere to those nondiscrimination policies, she said. In addition, local civil rights offices or legal experts should be included in procurement conversations to ensure the products don’t harm students, Woelfel said.
The OCR resource comes as school leaders are increasingly adopting AI technology and developing policies for its use. Just last month, the Education Department released a highly anticipated AI toolkit to help school leaders do so.
“A lot of the conversation lately has been about integrating AI in the classroom and how it can be a tool for innovation and for equity,” Woelfel said. “And it’s not like that’s untrue, but I do think that this part of the conversation can sometimes be lost. Maybe even to a point that this is just part of the conversation that administrators haven’t heard yet.”