Not Just Respond—Prevent
Kazakhstan's AI-Powered Aman System Takes On School Bullying Crisis
Picture an ordinary school recess. Noise, running, children laughing and arguing. In one corner—a sharp word, a shove, tension building, the kind adults usually notice only when someone shouts or the situation spirals out of control. It's in that precise moment—between "nothing serious yet" and "too late"—that technology is now stepping in.
Aman, a Kazakh startup that just a year ago was a simple SOS button app, has evolved into a full-fledged security system. No longer just a way to call for help, it's now a platform integrating video surveillance, behavioral analysis, direct police communication, and parental tools—all in one ecosystem.
But the project didn't start with technology. The idea grew from personal unease. Aman's founder, Aruzhan Mede, has often said the catalyst was the daily news of violence and the gnawing anxiety of walking down the street unsure whether help would arrive in time if danger struck.
The first version of Aman was built for that exact scenario—immediate protection. An SOS button, alerts to loved ones, security calls, recording what was happening. A solution for when every second counted. But as the team expanded, they confronted a harsher reality: the most devastating stories often begin not on the street, but in school.
According to Kazakhstan's National Center for Public Health, 17.5% of Kazakh children face bullying, while 14.1% of teenagers admit to participating in it. In 2025 alone, official data shows 139 minors died by suicide, and nearly one in five of those cases was linked to bullying. In the first 11 months of that same year, 404 bullying incidents were recorded. But those are only the cases adults found out about. How many more remain hidden in school hallways and in children's private chat groups?
This realization marked a turning point for Aman. The startup shifted from reaction to prevention. Today, its core feature is AI-powered video surveillance. The system analyzes camera footage to detect signs of aggression—fights, sudden movements, dangerous behavior. If a potential conflict is identified, it alerts school administrators.
Another critical upgrade is integration with the Ministry of Internal Affairs' Operational Center and the 102 emergency service. In a real threat scenario, alerts can go straight to police—no middlemen, no delays. Developers emphasize privacy: video data stays within the school or residential complex, never uploaded to external servers.
A separate module is the parent app, which lets users track their child's location, receive incident alerts, and get notifications about potential emotional distress. It's a tech-driven answer to a problem that society has too often overlooked.
But along with high expectations come pressing questions. One of the most sensitive is how to distinguish between genuine bullying and ordinary conflict. Parents and teachers worry that, in some cases, such systems could become tools of pressure or manipulation—especially if reports are taken at face value, without considering the full context.
Then there's a broader concern: Could digital monitoring replace human judgment in understanding these situations? Developers counter that the system doesn't pass verdicts—it simply flags potential warning signs. The final decisions remain with people: school administrators, psychologists, and parents.
In the end, Aman is no longer just a startup or a piece of technology. It's an attempt to answer a critical question: Can we detect and prevent danger before it turns into tragedy? And if so, are we ready to entrust that task not only to people, but to algorithms as well?