Dive Brief:
- Digital monitoring tools from ed tech companies that use artificial intelligence-based algorithms to detect students at risk of suicide, self-harm and harm to others can serve as one potentially useful way for schools to prevent suicide, according to a recent RAND Corp. report.
- However, the report's interviews with parents, advocates and some school staff found that concerns persist over the technology's ability to protect sensitive student data. The study's respondents also raised flags about the lack of oversight of AI monitoring tools and the limited research on their accuracy.
- RAND recommends that school districts engage their communities and gather feedback on the use of AI monitoring tools, and inform caregivers and students about any suicide risk surveillance technology in use and about opt-out policies. Other suggestions for districts include tracking student outcomes following a suicide risk alert and educating students about mental health issues.
Dive Insight:
Schools are facing mounting pressure to address severe student mental health concerns, just as the AI surveillance technology used by companies like Gaggle, Securly and GoGuardian continues to grow increasingly popular among districts.
There’s good reason for school leaders’ concern — in 2020, the Centers for Disease Control and Prevention reported that suicide was the second leading cause of death for children ages 10-14 and the third leading cause of death for teens and adults ages 15-24. In February, CDC found that nearly 1 in 3 teenage girls had seriously considered attempting suicide, a 60% increase from a decade ago. Among LGBTQ+ students, CDC said over 20% had attempted suicide.
While the risk of student suicide is a real issue schools must grapple with, evidence is still sparse that AI surveillance tools are the best answer to the safety threats students pose to themselves and others.
A recent report by the American Civil Liberties Union flagged that school surveillance technologies, including online monitoring tools, foster a false sense of security without much evidence to demonstrate they actually improve school safety.
The latest RAND report echoes some of that skepticism. Yet researchers also found some benefits to AI-based surveillance tools: in interviews, school staff and health care providers described actual instances in which these tools successfully identified a student at imminent risk of suicide who would not have been detected through the school’s other prevention or mental health programs.
“Given the extent of the mental health challenges among youth and limited resources available in schools and communities to address them, these alerts might provide new information that can allow proactive response and save lives,” the RAND report said.
A November study published by the Journal of the American Medical Association also found a direct link between mental health workforce shortages and increased youth suicide rates.
As schools and their broader communities struggle to find enough mental health supports for students, federal resources are available to help districts address these staffing shortages. For instance, the U.S. Department of Education provided $280 million to two grant programs supporting school mental health. Funding for those programs comes from the Bipartisan Safer Communities Act and annual federal appropriations.