Where Does eDiscovery Fit in the Facial Recognition Conversation?
For most of us, the concept of facial recognition – like so much technology of the last decade – began as a sci-fi detail we accepted on the big screen but didn’t give much thought to in our day-to-day lives. Then one day, our phones started tagging photos automatically, asking, almost sheepishly, “Is this you?” And just like that, the idea that an algorithm could learn to recognize our faces was real. But we’ve moved on from that innocuous beginning and are now treading further into the realm of another type of film (I’m thinking here of Brazil or Minority Report), where technology aids law enforcement in making our world a safer place but also introduces new privacy and ethics conundrums when that technology fails or is corrupted. And, as always, eDiscovery has a place where legal and tech (in this instance, law enforcement and facial recognition) collide.
Law Enforcement, Facial Recognition, and Privacy Concerns
In a Washington Post article in July 2019, it was revealed that agents with the Federal Bureau of Investigation and Immigration and Customs Enforcement were using facial recognition software to scan state driver’s license databases, analyzing millions of Americans’ photos without their knowledge or consent.
In the past, police have used fingerprints, DNA and other “biometric data” collected from criminal suspects (think of those large binders of mugshots you always see victims flipping through in cop shows), but the photos in DMV records are of a state’s residents, most of whom have never been charged with a crime.
According to the Government Accountability Office, the FBI has logged more than 390,000 facial-recognition searches of federal and local databases, including state DMV databases, since 2011. Neither Congress nor state legislatures have authorized the development of such a system, and now lawmakers on both sides of the political spectrum are concerned.
“Law enforcement’s access of state databases,” House Oversight Committee Chairman Elijah E. Cummings (D-Md.) said, is “often done in the shadows with no consent.” And Rep. Jim Jordan (Ohio), the House Oversight Committee’s ranking Republican, seemed particularly incensed during a hearing on the technology earlier this year.
“They’ve just given access to that to the FBI,” he said. “No individual signed off on that when they renewed their driver’s license, got their driver’s licenses. They didn’t sign any waiver saying, ‘Oh, it’s okay to turn my information, my photo, over to the FBI.’ No elected officials voted for that to happen.”
Off-The-Shelf Facial Recognition Tools for Law Enforcement
In Oregon, back in 2017, the Washington County Sheriff’s Office became the first law enforcement agency in the country known to use Amazon’s artificial-intelligence tool Rekognition, and almost overnight, the deputies of this small county in the suburbs outside of Portland had ramped up their investigative ability. With this off-the-shelf technology, they were able to scan for matches of a suspect’s face across more than 300,000 mug shots taken at the county jail since 2001. With that information, they can take a picture – perhaps captured by a security camera, social-media account, or cellphone – and link it to an identity.
But linking a photo to previous mug shots mirrors work detectives have always done; what was once a slow, manual process is now assisted by AI. Still, there are significant problems with the technology itself.
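To make the matching process described above concrete, here is a minimal, hypothetical sketch of how such a system typically works under the hood: a face image is converted into a numeric embedding vector, and the probe embedding is compared against a gallery of stored mugshot embeddings by similarity. The function names, the toy 3-dimensional vectors, and the threshold value are illustrative assumptions, not the workings of any actual law enforcement tool.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, gallery, threshold=0.9):
    """Return the gallery ID whose embedding is most similar to the
    probe, or None if no candidate clears the threshold."""
    best_id, best_score = None, threshold
    for mugshot_id, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_id, best_score = mugshot_id, score
    return best_id

# Toy 3-dimensional "embeddings" standing in for real face vectors,
# which in practice have hundreds of dimensions.
gallery = {
    "mugshot_001": [0.9, 0.1, 0.2],
    "mugshot_002": [0.1, 0.8, 0.5],
}
probe = [0.88, 0.12, 0.21]
print(best_match(probe, gallery))  # mugshot_001
```

The threshold is where the accuracy concerns discussed below enter: set it too low and the system produces false matches; set it too high and it misses true ones, and real systems show different error rates across demographic groups.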
Facial Recognition Bans Due to Concerns Ranging from Privacy to Racial Equity
Some places have already taken measures to ban face recognition software. In 2019, San Francisco became the first city to ban the technology for law enforcement and government agencies. Similar measures are under consideration in Oakland and Massachusetts. Lawmakers in California are also considering a statewide ban on facial recognition programs.
To add to that list, Axon, the largest manufacturer of police body cameras, has rejected selling facial recognition technology, at the recommendation of an independent ethics board it created last year after acquiring two artificial intelligence companies.
In a 42-page report, the ethics panel found that face recognition technology is not advanced enough for law enforcement to depend on, with concerns ranging from “privacy costs to racial equity.” In fact, the technology was found to be less accurate in identifying the faces of women than men, and of younger people compared to older ones. The same was true of people of color, who were harder to correctly identify than white people.
But the lack of regulation or precedent means that, while the technology is being banned in some places, it’s being pursued in others. For instance, Detroit reportedly signed a $1 million deal for software that let it continuously monitor “hundreds of private and public cameras set up around the city,” including gas stations, restaurants, churches and schools, according to the New York Times.
Facial Recognition and eDiscovery
So where does facial recognition fit in eDiscovery? As with any emerging technology, it’s worthwhile for those of us in legaltech to stay abreast of these changes. For one, any new technology is potential ESI that must be discoverable should a criminal or civil case arise in which that electronic data is evidence. Often, people in legal circles might argue, “That hasn’t happened yet, so we’ll worry about it later.” Then again, people a few decades ago might have found it unbelievable that data from a phone app (“phone app?” they might add with a puzzled look) meant for requesting rides from a stranger would be used in a nationally covered murder case.
But there is also the notion that facial recognition tools might be used to help attorneys and litigation support specialists do their work. In fact, the company Veritone recently announced an AI eDiscovery tool that makes unstructured data searchable by keywords, faces, and objects.
As technology continues to forge ahead, it’s important that the legal world engages in the conversations surrounding these technologies before it falls further behind in the game of catch-up many say it’s already playing.
Written by Jim Gill
Content Writer, Ipro