A 13-year-old student was arrested after typing a threat into ChatGPT, triggering a school security alert.
At South Western Middle School in Volusia County, Florida, the school’s monitoring system, Gaggle, flagged a query typed into ChatGPT: “How to k*ll my friend in the middle of class?” The alert prompted an immediate response from local law enforcement and school authorities.
The Volusia County Sheriff’s Office treated the incident as a school emergency, dispatching officers who later arrested the student and took him to a juvenile detention centre. In video footage shared online, an officer is seen escorting the student away.
When questioned, the teenager claimed he had been joking and had directed the message at a classmate who had irritated him. Authorities emphasised that “such remarks cannot be treated as harmless jokes.”
The sheriff’s statement called the incident “another prank that creates an emergency in a school” and urged parents to talk with their children about the seriousness of making such statements on digital platforms.
"Please talk to your kids": A 13-year-old Florida student was arrested after asking ChatGPT how he could kill his friend in class, according to deputies. https://t.co/NpybvEQAni pic.twitter.com/t2D7jxz6zg
— WFLA NEWS (@WFLA) October 1, 2025
Gaggle is an AI-based monitoring tool that schools use to scan content in students’ school accounts for references to violence, bullying, drugs, or self-harm, with the aim of intervening before risky situations escalate.
Although officials found no genuine intent to carry out violence in this case, they reiterated that every threat must be taken seriously. The swift response helped prevent escalation, and the student now faces legal proceedings.
The case stands as an example of how what may seem like an innocuous interaction with an AI tool can lead to significant legal and disciplinary consequences.