Akron Beacon Journal

Dave Yost responded appropriately to reports, starting in the Washington Post over the weekend, about the nationwide mining of Bureau of Motor Vehicle databases. In Ohio, as reported by the Columbus Dispatch, the BMV provided driver’s license photographs from 2011 to the Bureau of Criminal Investigation as part of developing a facial-recognition system to help identify criminal suspects. The state attorney general has called for a review of the process with the objective of achieving the “transparency and clarity” now missing.

As Yost rightly put it in a statement to the Dispatch this week, “Ohioans expect that law enforcement at both the federal and state levels use all available legal tools to protect them from threats ranging from terrorism to garden-variety crimes. No Ohioan expects that work to blossom into a surveillance state.”

The concern isn’t exaggerated. Neither Congress nor state lawmakers approved tapping into the driver’s license database. The public wasn’t directly informed. In 2016, Mike DeWine, then the state attorney general, allowed the FBI to use the facial-recognition system. The Drug Enforcement Administration and Immigration and Customs Enforcement also have access, the latter, according to the Post, using the scans to search for immigrants.

DeWine, now the governor, backs the Yost review, and adds there has been no abuse of the state system by federal or local authorities. His press secretary told the Dispatch that facial-recognition technology has been used for “limited and legitimate purposes,” as a tool in identifying criminal suspects and preventing terrorist attacks.

The system began operating here in 2013, its existence first reported by the Cincinnati Enquirer. DeWine put together an advisory committee to assess the program. It recommended improvements, including audits to identify abuses and narrowing access by law enforcement authorities.

That is the kind of care required in applying a technology that has much value for law enforcement and public safety yet could easily be misused. It is reasonable, for instance, to limit the technology to violent crimes or to ensure that no arrests are based solely on a facial-recognition match. It bears emphasizing that millions of Ohioans have landed in the system though they committed no crimes. As Dave Yost suggested, their privacy belongs in the conversation about the technology.

At the least, state officials must be forthcoming about the technology’s use. That is especially so in view of research raising concerns about the shortcomings of facial-recognition software. In January, a study by the Massachusetts Institute of Technology reported that a program from Amazon incorrectly identified darker-skinned women as men at a rate of 31 percent.

This “algorithmic bias” surfaced in another recent study, conducted by the Florida Institute of Technology and the University of Notre Dame, which found facial recognition delivering false matches at a higher rate for African Americans than for whites. The study notes the software can be adjusted to read the black population more accurately. Yet that change results in a higher rate of failed matches for whites. Apparently, no single software package achieves accurate outcomes for both.

A Notre Dame researcher told the New York Times the decisive factor may not be skin color. Other things, such as facial structure and hairstyles, may be part of the problem.

Again, Ohioans have good reason to want law enforcement authorities to use all available tools to protect public safety. At the same time, those tools must not only comply with the law and respect individual rights; they must also work effectively. Facial-recognition technology remains a work in progress, requiring oversight and public discussion, which Dave Yost has advanced with his review.