Columns » Technicalities

Local law enforcement will adopt recognition technology at some point, and there's nothing we can do about it

Even though there’s a public outcry against Amazon’s sale of its Rekognition application to law enforcement in Orlando, Florida, there’s no stopping the technology from becoming part of standard operating procedures for cops in the future.

Launched in 2016, Rekognition offers considerable opportunity to aid local law enforcement agencies. The visual analysis service provides not only face recognition; it can also detect objects, scenes, activities, the paths of moving people, the number of people in crowds and inappropriate content, and it can read text in images. The application analyzes data with a level of detail that was unavailable, or grossly inaccurate, just a couple of years ago, detecting variables such as age range, eye color, gender and even emotional mood. The service can even analyze live images.
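To make the capabilities above concrete, here is a minimal Python sketch of how a developer might pull the attributes the column mentions (age range, gender, emotional mood) out of a face-analysis result. The `sample_response` dictionary is a hand-written stand-in shaped like the documented output of Rekognition's DetectFaces operation, not real data; a live call would go through the AWS SDK (boto3) instead, and the helper name `summarize_faces` is this sketch's own invention.

```python
def summarize_faces(response):
    """Pull the column's headline attributes out of each detected face.

    `response` is expected to be shaped like a Rekognition DetectFaces
    result: a dict with a "FaceDetails" list of per-face attribute dicts.
    """
    summaries = []
    for face in response.get("FaceDetails", []):
        # Rekognition returns several candidate emotions, each with a
        # confidence score; take the highest-confidence one as the "mood".
        emotions = sorted(face.get("Emotions", []),
                          key=lambda e: e["Confidence"], reverse=True)
        summaries.append({
            "age_range": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
            "gender": face["Gender"]["Value"],
            "top_emotion": emotions[0]["Type"] if emotions else None,
        })
    return summaries


# Hand-written stand-in for a DetectFaces response (illustrative only).
sample_response = {
    "FaceDetails": [{
        "AgeRange": {"Low": 25, "High": 37},
        "Gender": {"Value": "Female", "Confidence": 99.1},
        "Emotions": [
            {"Type": "CALM", "Confidence": 88.0},
            {"Type": "HAPPY", "Confidence": 9.5},
        ],
    }]
}

print(summarize_faces(sample_response))
```

Even this toy extraction shows why civil-liberties groups are uneasy: the per-face attribute bundle is exactly the kind of structured record that makes tracking individuals across images straightforward.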

Really, Rekognition could be a wonder drug for law enforcement agencies, and it's available at a price that's sure to spread the technology rapidly. But even the most invaluable technology comes with a new set of problems that have to be addressed.

While facial recognition technology isn't new, Amazon is one of the most ubiquitous corporations in the world, and the idea of its tentacles stretching even further into our everyday lives does not sit well with some people.

Amazon has been under public scrutiny for marketing its Rekognition system to law enforcement agencies, and it is currently testing the software with the Orlando, Florida police department. The American Civil Liberties Union (ACLU) and other civil rights organizations say the system can track not only people committing crimes but also identify and track innocent people. In a public letter to Amazon CEO Jeff Bezos, the ACLU and dozens of other organizations noted that Rekognition's tracking of “people of interest” could target people who attend free speech rallies or protests and open them up to surveillance even though they have not committed a crime.

Even some Amazon shareholders have concerns about selling this technology to police departments. Nineteen groups of shareholders, including advocacy organizations such as the Social Equity Group, the Northwest Coalition for Responsible Investment and Sustainvest Asset Management, wrote a letter to Bezos stating they wanted to halt Rekognition sales to law enforcement, citing that it not only poses a privacy threat, but may have a negative financial impact on stockholders from public backlash. But the company hasn't backed down.

“We believe it is the wrong approach to impose a ban on promising new technologies because they might be used by bad actors for nefarious purposes in the future," Amazon's general manager for artificial intelligence, Matt Wood, says in a statement. "The world would be a very different place if we had restricted people from buying computers because it was possible to use that computer to do harm.”

It is true that Rekognition has the potential to be used in the wrong way. But, as Wood stated, potential harm hasn't been a good enough excuse to stop technology before. Cloning, video games, CRT monitors, television, mobile phones and even computers were all seen as the ruin of our society for the same reasons at one time or another.

Fact is, our personal information is everywhere in digital form already; all we can do is continue to mitigate the danger by staying diligent and educated about how it's being used. And as far as law enforcement adopting recognition software into its arsenal, that train has already left the station — it's only a matter of time before Rekognition and other facial recognition technologies become standard tools for law enforcement around the globe.
