As nine members of its ethics board resigned, Axon, the company that developed the Taser, announced on Sunday that it was pausing plans to develop a stun-gun-equipped drone that it said could be used to prevent mass shootings.
After the mass shootings in Buffalo and Uvalde, Texas, last month, Rick Smith, the founder and chief executive of Axon, announced a proposal for a nonlethal Taser drone that schools and other venues could use to prevent mass shootings. The drones, Mr. Smith said, could “play the same role that sprinklers and other fire suppression tools do for firefighters: preventing a catastrophic event, or at least mitigating its worst effects.”
The announcement, on Thursday, came weeks after a two-thirds majority of Axon’s ethics board voted to recommend that the company not follow through with a pilot study that sought to vet the concept for Taser-equipped drones.
The ethics board quickly issued a public statement on Thursday, in which it said that it had not had time to review the proposal, and that Axon’s decision was “deeply regrettable.”
Three days later, on Sunday, nine of the ethics board’s 13 members informed Mr. Smith that they would resign. Mr. Smith said in a statement on Sunday that Axon would pause its plans for the drone project. It was unclear whether the decision to halt the project was made before or after the board members told Mr. Smith that they planned to resign.
“It is unfortunate that some members of Axon’s ethics advisory panel have chosen to withdraw from directly engaging on these issues before we heard or had a chance to address their technical questions,” Mr. Smith said. “We respect their choice and will continue to seek diverse perspectives to challenge our thinking and help guide other technology options that we should be considering.”
The nine board members who resigned said in a statement on Monday that “none of us expected the announcement.”
“We all feel the desperate need to do something to address our epidemic of mass shootings,” they said. “But Axon’s proposal to elevate a tech-and-policing response when there are far less harmful alternatives, is not the solution. Before Axon’s announcement, we pleaded with the company to pull back. But the company charged ahead in a way that struck many of us as trading on the tragedy of the Uvalde and Buffalo shootings.”
In Axon’s announcement of the concept, Mr. Smith said, “I know it sounds faintly ludicrous to some.” He offered three caveats: that nonlethal drones should not have the capacity to kill; that humans, not the drones, should control what the drone does; and that the drones would need “rigorous oversight.”
“If a shooter comes into a church, for instance, and a drone is deployed and puts the shooter down, we cannot simply cheer that success,” Mr. Smith said. “We have to examine the video closely and rigorously.”
The board members who resigned said in their statement that the ethics board had warned the company for years against the use of products that can surveil people in real time.
“This type of surveillance undoubtedly will harm communities of color and others who are overpoliced, and likely well beyond that,” they said. “The Taser-equipped drone also has no realistic chance of solving the mass shooting problem Axon now is prescribing it for, only distracting society from real solutions to a tragic problem.”
One of the board members who resigned, Barry Friedman, the director of the Policing Project at the New York University School of Law, said in an interview that he was pleased Axon halted its plans for the drone project, and that he hoped the company would fully abandon it.
“I think it’s very important that we find a way to constrain the adoption of technologies, which is happening often with very little concern for harm to privacy, harms to racial justice or concerns about how much data the government holds on all of us, and what’s accessible to the government,” he said.
One of the four board members who decided not to resign, Giles Herdale, said he hoped that, by staying on the board, he could “try and mitigate any harms caused by developments such as this.”
“What we are there to do is to try and put perspectives to give them pause to think,” said Mr. Herdale, an associate fellow at the Royal United Services Institute, a London think tank that specializes in security issues.
“Because the notion of arming drones, or any other autonomous robot, is such a far-reaching decision,” he said, “we’d want really careful, careful consideration and lots of guardrails around the deployment of those sorts of technologies.”