Dive Brief:
- A coalition of 41 youth justice and civil rights organizations sent a letter Monday urging the U.S. Department of Education to ban schools from using federal grants to purchase police surveillance technologies — especially those that tap into artificial intelligence.
- The letter said the growing use of AI-driven surveillance technologies is expanding police presence in schools and causing students to have more frequent police contact, exclusionary discipline, and school pushout.
- Ultimately, the organizations wrote, the greater use of these technologies in schools establishes a “dangerous new chapter in the school-to-prison pipeline and mass criminalization of Black, brown, and Indigenous youth and other marginalized young people.”
Dive Insight:
Improving school safety and student mental health continues to be top of mind for district leaders. As a result, schools are increasingly turning to surveillance technologies to address these issues.
But this letter — cosigned by organizations including the NAACP Legal Defense and Educational Fund and GLSEN — signals a growing movement to more critically examine and perhaps stop or slow the use of AI and other big data technologies in school surveillance.
More schools, for instance, are relying on AI weapon scanners and AI video surveillance cameras to detect if someone is bringing a gun into the building. This particular technology saw an uptick in interest in the K-12 sector following the May 2022 massacre at Robb Elementary School in Uvalde, Texas, according to AI security company leaders.
But as these technologies grow in popularity, an October report from the American Civil Liberties Union said the sweeping array of school surveillance technologies — including communications monitoring, online monitoring and web filtering, weapon detection systems, and remote video monitoring — creates a false sense of security, with little evidence demonstrating improved school safety.
Guidance from states, industry leaders and researchers has also continuously flagged the risks of using AI in schools, given the implicit biases inherent in these tools. As guidance from the California Department of Education notes, algorithmic bias can potentially produce “unfair and discriminatory outcomes in machine learning algorithms and AI systems due to the data used to train them or the design choices made during their development.”
Monday’s letter also called for the Education Department to divest all of the agency’s discretionary appropriations from funding police surveillance technologies in schools. The groups also suggested the department issue guidance and offer assistance to help districts conduct and disclose assessments and audits of AI technologies to evaluate whether training datasets are harming students due to their pre-existing biases.
Additionally, the letter said the department should study the prevalence of surveillance tools and other high-risk AI technology in public schools.
Pushback against the use of AI in school surveillance is still slow-moving. Though New York in October became the first state to ban facial recognition technology in schools, Utah and other states are signing contracts to use AI-based gun detection video surveillance software in all of their public K-12 schools.