
Some stories read like satire even before you write them up. Case in point: the October 20th incident at Kenwood High School, where police drew guns on a teenager because an AI system thought an empty bag of Doritos might be a firearm. Except, we’re now being told it wasn’t the Doritos at all.
No, according to Maryland’s Inspector General for Education, Richard Henry, the real culprit was the student’s hand movement. A pose. A gesture. The angle of a few fingers. A snack-based Rorschach test for law enforcement.
As Henry told WBAL:
“The way that their hand was situated — a couple fingers in, a couple fingers out — that movement was detected.”
Ah yes, the universal sign of imminent danger: holding a bag of chips like a normal human being.
You’d think after police rolled in with guns drawn on a teen who had a snack in his pocket, everyone would stop and wonder what the problem might be. Say, bad software, bad policies, or the age-old American addiction to pointing weapons at children. But no. According to Henry, “Omnilert did its job.” Police did their jobs. Everyone did their jobs. It was just, wait for it, human error.
The kind of human error where the principal couldn’t see the “canceled” message because it didn’t fit on her phone screen. The kind where the school resource officer was off that day and didn’t have his work phone to review the alert, so he just called the precinct. And the kind where the inspector general leans over the mic and insists the AI wasn’t biased, wasn’t mistaken, wasn’t malfunctioning; it just saw fingers.
And here’s the dark irony. Actual school shootings are almost always described afterward as a perfect storm of errors. Warnings go unheeded, parents dismiss red flags, schools miss threats, and police bungle communications. This incident at Kenwood was almost like a reverse school shooting. The same chain of cascading mistakes, except instead of failing to stop an attacker, it nearly got an innocent student killed.
Meanwhile, the school district is locked into a $2.6 million contract with a system that, according to its own data, has sent thousands of alerts, fewer than 1% of which have ever required law enforcement. Because nothing says “effective school safety” like treating every student as a potential silhouette on a target range.
Let’s entertain Henry’s explanation for a moment. If this is really about “a couple fingers in, a couple fingers out,” then we’ve entered new and exciting territory.
Any student could find themselves at police gunpoint for taking a phone, wallet, chapstick, or lint out of their pocket the wrong way.
Because pockets, as we all know, are suspicious. And movement? Even more suspicious. And teenagers? Well, that’s basically Omnilert’s whole business model.
But sure, let’s assure the public this has nothing to do with race. Nothing to do with overreactions. Nothing to do with a school system that responds to ambiguous footage with armed officers.
Just fingers.
This is the part that really sticks. The inspector general’s report and his radio appearance bend over backwards to reassure everyone that Omnilert didn’t do anything wrong. The linguistic gymnastics are almost impressive: arguing that an AI that has now sent thousands of false positives is working perfectly and yet simultaneously needs retraining, new protocols, new app usage rules, and biannual refreshers for staff.
But it’s easier to blame a teenager’s hand position than to question whether we should really be turning schools into surveillance labs run by private companies. It’s easier to say “everyone did their job” than to confront the reality that a kid could have died because a camera misread a Doritos bag.
And it’s a whole lot easier than talking about the real problem. America’s refusal to enact meaningful gun control, forcing schools into expensive, error-prone tech that treats every student as a potential shooter.
Because if we had real gun laws, there would be no demand for AI weapon detectors, no runway for companies to sell fear-based tech, and no incentive for school districts to pay millions for a system that can’t tell a pistol from a pocket snack.
Instead, we get press releases about finger angles.
A boy ends up in handcuffs, police guns aimed at him, because an algorithm didn’t like the way he put a snack in his pocket. The inspector general insists the system “did its job.” And the company claims it was “lighting” on the bag.
Meanwhile, the only people who seem to be taking this seriously are the family whose child now avoids the benches where he was confronted and is in therapy because of the incident.
But sure. Let’s train principals to use the app better. Let’s tweak the protocol. Let’s do everything except the one thing that would make all this unnecessary.
Pass real gun control laws.
Until then, students, consider yourselves warned. Your pockets are a threat, your snacks are suspicious, and your fingers? Well, let’s just say you’d better hope Omnilert likes the way you hold them.