In a letter to Walter Copan, Under Secretary of Commerce for Standards and Technology and director of the National Institute of Standards and Technology (NIST), Rep. Emanuel Cleaver, D-Mo., called on NIST to create a framework for the development and use of facial recognition technologies.

As part of that request, he specifically asked NIST to “design and endorse industry standards and ethical best practices for independent testing of demographic-based error rates in facial recognition technology.”

In the Nov. 26 letter, Cleaver said, “I write today with optimism for the potential of technology to continue to improve society, but with the understanding that government must play its part in guiding the use of technology–including facial recognition technology–down the optimal path.”

Cleaver explained that his concern centers on “the propensity of the technology to misidentify individuals, particularly in regard to variances in skin-type and gender.” He further explained, “Some developers have historically sold facial recognition technology by focusing on overall accuracy rates that may obscure disproportionately high rates of misidentification of certain demographic subgroups.”

In his letter, Cleaver discussed a May 2016 White House report that found issues with data selection, accuracy, and completeness in facial recognition technology can lead to discriminatory results. However, he also cited another report finding that better training methodologies and approaches could improve the demographic accuracy of facial recognition algorithms.

In addition to developing a framework, Cleaver requested information on any datasets that NIST makes available to facial recognition technology developers, as well as the steps NIST takes to ensure those datasets reflect demographic diversity and operational conditions.

“While the exact definition of ‘algorithmic fairness’ is a complicated issue that is not easily resolved, further guidance for identifying and mitigating unintended, unjustified and/or inappropriate demographic sub-group bias is a critical step,” Cleaver concluded. “Standards informed by demographic concerns also benefit government and private sector customers who may not be conversant about an algorithm’s potential for bias; illegal discrimination resulting from such biases place company investors and the American public at significant risk.”

This is not Cleaver’s first foray into facial recognition technology. In August of this year, Cleaver asked the Department of Justice to investigate “whether the uses of facial recognition technology–as currently utilized–by law enforcement agencies are in violation of civil rights protections.”
