
When a company that profits from school shootings publishes a report about how to stop them, you already know what’s not going to be on the list: getting rid of the guns.
Omnilert’s new “School Shootings 2024” report reads like a self-congratulatory white paper from a corporation trying to sound like a public health agency. It lists every possible “solution” to school gun violence, including mental health support, social-emotional learning, threat assessment teams, community engagement, and, of course, “AI-enhanced cameras.” Everything, that is, except the one factor that makes all of it necessary in the first place. Because if we actually tackled gun access, Omnilert wouldn’t have a product to sell.
The report cites 336 school shootings in 2024, with 276 victims and hundreds of thousands more affected by trauma. It calls for mental health programs, training for teachers, community interventions, and more police presence as the path forward. The language is polished and compassionate, carefully designed to sound like progress. But nowhere in Omnilert's lengthy report on "school safety" is there a single mention of restricting firearms, improving background checks, or even requiring safe storage. That silence isn't accidental. It's strategic.
Gun availability is the fuel for the entire ecosystem of “school safety tech” companies. The more shootings there are, the more schools panic-buy new systems that promise to “spot threats in real time.” The worse the violence gets, the better the business case. Omnilert and companies like it have no incentive to solve the problem. They just need to sell the illusion that they can.
Omnilert loves to talk about prevention, at least when it’s profitable. Its AI surveillance system is marketed as a way to detect guns in real time and alert authorities before anyone gets hurt. In reality, it’s already done harm. In Baltimore County, the system mistook a crumpled Doritos bag for a firearm, sending police racing to confront a Black teenager who was simply eating a snack after practice. The officers arrived with guns drawn. The company later told reporters that the system “worked as intended.”
Think about that. A teenager could have been killed, and Omnilert’s conclusion was that its software performed correctly. That is not safety. It is an automated extension of the same fear and bias that already put students of color at risk, now repackaged as cutting-edge technology.
The truth is, Omnilert doesn’t need to prevent school shootings. It just needs to look like it’s helping. Its entire business model depends on schools believing they are safer when they install these systems. Every false alarm, every lockdown, and every gun-drawn encounter keeps the fear alive, and that fear keeps the money flowing.
Omnilert’s report ends with a call for a “comprehensive approach” to school safety. It praises AI cameras, mental health programs, and trained school resource officers. What it will never endorse is the one measure that would actually reduce school shootings: reducing access to the guns themselves. Because if that happened, the need for Omnilert’s technology would vanish overnight.
Companies like this don’t exist to end school shootings. They exist to make them manageable enough for politicians and administrators to feel like they’re doing something. They sell safety as a subscription service, not a solution. Until the gun problem itself is addressed, everything they build is just another layer of security theater, and the next “false alarm” might not end with a bag of chips.
(Source)