Ignored Warnings: A Year of Silence
In November 2023, a student used a reporting app to flag that photos of 50 classmates had been taken from their social media accounts and altered with AI to depict them nude. What should have triggered immediate action led only to a hastily conducted internal investigation; the school dropped the case after the main suspect denied involvement. The result? Nothing was reported to the police or to child protective services.
It wasn’t until mid-2024 that the police got involved, following the seizure of a student’s phone. The student, accused of creating the images, has since left the school, but no charges have yet been filed against him. For many families, that delay allowed more victims to be targeted.
Angry Parents, Mobilized Students
The families are furious. In November 2024, they filed a lawsuit against the school, accusing its leaders of negligence. In their view, the failure to alert the authorities from the outset was a betrayal of the students. The students themselves voiced their frustration by organizing a silent march. “All they had to do was call the police,” one exasperated mother said.
Under pressure, the board finally announced the departures of Micciche and Ang-Alhadeff. Some found the news a relief; others saw it as too little, too late.
A Necessary Reset
The school has promised measures to regain the trust of parents and students: appointing an interim director, training staff to handle reports properly, and providing psychological support for the victims. But this is only a start, as the trauma runs deep.
This scandal raises a crucial question: are schools prepared to deal with this kind of AI-enabled abuse? With laws still vague and oversight inadequate, young people are on the front line against abuses linked to artificial intelligence. When a problem involves social media and AI, ignoring it only makes matters worse. Perhaps this incident will serve as a lesson for other institutions… provided they listen, for once.
