Axon Wants Its Body Cameras To Start Writing Officers' Reports For Them:
Taser long ago locked down the market for "less than lethal" (but still frequently lethal) weapons. It has also written itself into the annals of pseudoscience with its invocation of the not-an-actual medical condition "excited delirium" as it tried to explain away the many deaths caused by its "less than lethal" Taser.
These days Taser does business as Axon. In addition to separating itself from its troubled (and somewhat mythical) past, Axon's focus has shifted to body cameras and data storage. The cameras are the printer and the data storage is the ink. The real money is in data management, and that appears to be where Axon is headed next. And, of course, like pretty much everyone at this point, the company believes AI can take a lot of the work out of police work. Here's Thomas Brewster and Richard Nieva with the details for Forbes.
On Tuesday, Axon, the $22 billion police contractor best known for manufacturing the Taser electric weapon, launched a new tool called Draft One that it says can transcribe audio from body cameras and automatically turn it into a police report. Cops can then review the document to ensure accuracy, Axon CEO Rick Smith told Forbes. Axon claims one early tester of the tool, Fort Collins Colorado Police Department, has seen an 82% decrease in time spent writing reports. "If an officer spends half their day reporting, and we can cut that in half, we have an opportunity to potentially free up 25% of an officer's time to be back out policing," Smith said.
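Axon hasn't published Draft One's internals, but the article describes the shape of the pipeline: transcribe body-camera audio, auto-generate a draft report, then require officer review. Here's a minimal sketch of that flow, with the transcription and drafting steps stubbed out (all function names and data are hypothetical stand-ins), plus Smith's time-savings arithmetic made explicit:

```python
# Illustrative sketch of the report-drafting pipeline described above.
# Transcription and drafting are stubbed so the example is self-contained;
# none of these names come from Axon.

def transcribe(audio_id: str) -> str:
    """Stand-in for a speech-to-text pass over body-camera audio."""
    return "Subject stated the vehicle was borrowed from a friend."

def draft_report(transcript: str, officer: str, incident: str) -> dict:
    """Stand-in for the generative step: transcript -> draft narrative."""
    return {
        "incident": incident,
        "officer": officer,
        "narrative": f"Per recorded audio: {transcript}",
        # Per the article, an officer must review the draft for accuracy
        # before it becomes the report of record.
        "status": "DRAFT - PENDING OFFICER REVIEW",
    }

report = draft_report(transcribe("cam-1234"), "Ofc. Doe", "2024-000123")
print(report["status"])

# Smith's claimed savings: half a shift spent on paperwork, cut in half.
paperwork_fraction = 0.5
reduction = 0.5
freed = paperwork_fraction * reduction
print(f"Time freed: {freed:.0%}")  # 0.5 * 0.5 = 25% of the shift
```

The review step is the load-bearing part of the design: everything upstream of it is automated, so the only check on the AI's narrative is the officer whose conduct the narrative describes.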
If you don't spend too much time thinking about it, it sounds like a good idea. Doing paperwork consumes a large amount of officers' time, and a tool that automates at least part of the process would, theoretically, allow officers to spend more time doing stuff that actually matters, like trying to make a dent in violent crime — the sort of thing cops on TV are always doing but is a comparative rarity in real life.
[...] Then there's the AI itself. Everything in use at this point is still very much in the experimental stage. Auto-generated reports might turn into completely unusable evidence, thanks to the wholly expected failings of the underlying software.
[...] On top of that, there's the garbage-in, garbage-out problem. AI trained on narratives provided by officers may take it upon itself to "correct" narratives that seem to indicate an officer may have done something wrong. It's also going to lend itself to biased policing by tech-washing BS stops by racist cops, portraying them as essential contributions to public safety.
Of course, plenty of officers do these sorts of things already, so there's a possibility it won't make anything worse. But if the process Axon is pitching makes things faster, there's no reason to believe what's already wrong with American policing won't get worse in the future. And, as the tech improves (so to speak), the exacerbation of existing problems and the problems introduced by the addition of AI will steadily accelerate.
That's not to say there's no utility in processes that reduce the amount of time spent on paperwork. But it seems splitting off a clerical division might be a better solution — a part of the police force that handles the paperwork and vets camera footage, staffed by people other than the officers who captured the recordings and participated in the traffic stop, investigation, or dispatch call response.
And I will say this for Axon: at least its CEO recognizes the problems this could introduce and suggests agencies limit automated report creation to things like misdemeanors, never using it in cases where deadly force is deployed. But, like any product, it will be the end users who decide how it's used. And so far, the expected end users are more than willing to streamline things they view as inessential, but are far less interested in curtailing abuse by those using these systems. Waiting to see how things play out just isn't an acceptable option — not when there are actual lives and liberties on the line.
(Score: 4, Insightful) by PiMuNu on Wednesday May 08 2024, @09:08AM (3 children)
There is significant cost in the handling of backups, setting up software to do DB lookups, etc. This is not the core business of police, and it makes sense for a specialist DB firm to handle it, especially in the US, where police in each jurisdiction are pretty independent. Or one could ask the federal government to do it, but that would be "big government," which is politically toxic in the US.
(Score: 5, Interesting) by JoeMerchant on Wednesday May 08 2024, @12:29PM (2 children)
>There is significant cost in the handling of backups,
Oh, yeah. My company charged me (inter-departmental fee) $4K to add 1TB of storage to a VM (which was set up for free 8 years earlier, when our department was handling such things...)
$4K per terabyte sounds high, until you realize that they will have to keep it on a live server, powered, backed up, serviced/maintained indefinitely. $200/yr ROI on the interest should cover all that... though the six months they spent ping-ponging me around various customer service departments to figure out how to add storage to a Linux instance probably cost them more than $2K in time (at least $1K for my time; dunno what India's chargeable rate is) for all that documented ducking of responsibility — I believe I was passed through about four groups before they referred me back to a group that had referred me down the chain, which then qualified me for an escalation...
If every officer on the street comes with a one-time $40K IT fee to keep HD bodycam footage of every second of their interaction with the public, I think that's a hell of a lot cheaper than the time they spend in court these days "swearing to" their conflicting personal observations of contentious interactions.
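The arithmetic behind these figures is easy to check. The $4K/TB fee is from the comment above; the interest rate implied by "$200/yr on $4K" is 5%. The bitrate and shift assumptions below are purely illustrative guesses, not anything from the thread:

```python
# Back-of-envelope check of the storage figures in this comment.
# fee_per_tb comes from the comment; everything else is assumed.

fee_per_tb = 4000          # quoted inter-departmental storage fee, $/TB
interest_rate = 0.05       # $200/yr per $4K implies ~5%
annual_return = fee_per_tb * interest_rate
print(annual_return)       # the $200/yr figure above

# How much HD footage might one officer generate in a year?
mbps = 5                   # assumed HD bodycam bitrate, megabits/s
hours_per_year = 8 * 250   # assumed: 8-hour shifts, 250 shifts/year
mb_per_hour = mbps / 8 * 3600          # megabytes per hour of footage
tb_per_year = mb_per_hour * hours_per_year / 1e6
print(round(tb_per_year, 1))           # TB of footage per officer-year
print(round(tb_per_year * fee_per_tb)) # cost at the quoted $/TB rate
```

At these (assumed) numbers, an officer recording a full shift generates a few terabytes a year, so the real question is whether the $40K is one-time or recurring — continuous recording at the quoted internal rate would burn through it in a couple of years.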
(Score: 3, Funny) by DannyB on Wednesday May 08 2024, @04:11PM (1 child)
What if an AI were to compare officer testimony and reports, in real time in court, with the AI annotated video from the body cam?
Then, just as happens in real life (e.g., Star Trek), the computer could interrupt saying the subject is not relaying an accurate account of what happened.
Santa maintains a database and does double verification of it.
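The interrupting-computer idea above can be sketched in miniature: flag testimony that diverges from the transcript of the recording. A real system would need semantic comparison; this toy version uses naive word overlap purely to illustrate the shape of the check, and all strings are invented:

```python
# Toy consistency check: compare testimony against a recording transcript
# using Jaccard similarity over words. Illustrative only — real semantic
# comparison is a much harder problem.

def consistency(testimony: str, transcript: str) -> float:
    a = set(testimony.lower().split())
    b = set(transcript.lower().split())
    return len(a & b) / len(a | b)  # Jaccard similarity, 0..1

transcript = "subject kept both hands on the wheel and complied"
testimony = "the subject reached under the seat and refused to comply"

if consistency(testimony, transcript) < 0.5:
    print("FLAG: testimony may not match recorded account")
```

Even this toy shows why the idea is fraught: word overlap can't tell a paraphrase from a contradiction, which is exactly the distinction a courtroom would need.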
(Score: 3, Interesting) by JoeMerchant on Wednesday May 08 2024, @04:53PM
>the computer could interrupt saying the subject is not relaying an accurate account of what happened.
I doubt society is ready for this yet.
I do think that since the advent of things like photographic evidence, our courts should be emphasizing the distinction between personal recollection, with its problems, and mechanically recorded evidence, with its own.
The reality of "a jury of your peers" is still a popularity contest. It attempts to limit the aristocracy's power over trial results, but it's still a form of mob rule: the most popular opinions prevail, and unpopular minorities have basically no hope of an unbiased trial.
AI has a place in all this, but access to the raw data, and education as to how the raw data can and cannot be trusted and interpreted, is what really needs to prevail for "fair trials."