MITRE: White House biometrics definition requires rethink
MITRE’s Center for Data-Driven Policy recommended the White House redefine biometrics as it develops an Artificial Intelligence Bill of Rights, in a request-for-information response submitted last month.
Within its RFI, the Office of Science and Technology Policy lumped together biometrics for identification, technology for inferring emotion or intent, and medicine’s understanding of the term as any biologically derived data. MITRE would rather OSTP use the National Science and Technology Council’s internationally accepted definition of biometrics, which limits the term to matters of identity.
The U.S. lacks a comprehensive privacy law that would serve as the foundation for regulating AI, which has policy groups like the Open Technology Institute pressing the Biden administration for increased oversight and safeguards. OSTP wanted RFI respondents to examine biometrics through the lens of AI to inform the AI Bill of Rights the government will use to protect people from problematic technologies, but in doing so it conflated three distinct concepts, which MITRE holds will lead to confusion.
“They kind of grouped multiple, different technologies into a single grouping, and those technologies all have different backgrounds, different operational issues and different policy considerations,” Duane Blackburn, science and technology policy lead at the Center for Data-Driven Policy, told FedScoop. “Grouping them together like that is going to really complicate the policy analysis and potentially leads to making improper decisions.”
MITRE’s second recommendation is that OSTP make evidence- and science-based policy decisions, because misconceptions about identity biometrics abound. The first is that biometrics are not scientific in nature; Blackburn pointed to decades of biometrics research, international standards, accreditation programs for examiners and university degrees.
The second misconception concerns how face recognition technologies, specifically, are biased. Most people assume the bias is prejudicial, favoring or disfavoring certain ethnic groups, and while that may be true of some algorithms, the assumption overlooks technical and operational bias, Blackburn said.
When face recognition technologies were first being developed 20 years ago, image lighting, pose angle and pixel count greatly affected results, a phenomenon known as technical bias.
Operational bias, by contrast, affects how a system works in practice: a face recognition algorithm trained for longer on more data, for example, will perform more accurately than one that was not.
“There are not direct correlations between technical and operational biases and prejudicial bias, even though in a lot of policy analyses they’re treated as equivalent,” Blackburn said. “You can take a biometric algorithm with no differential performance technical bias and create systems with massive prejudicial bias.”
The opposite is also true, he added.
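To make that decoupling concrete, consider a minimal sketch in Python. Every group label, error rate and watchlist composition below is hypothetical, invented purely for illustration and not drawn from MITRE’s comments or any real system: a matcher with identical error rates for two groups (no technical bias) can still produce a lopsided burden of false matches if the watchlist it is deployed against is composed mostly of one group.

```python
# Illustrative only: all group names, rates and watchlist shares are invented
# to demonstrate the decoupling Blackburn describes, not taken from real data.

FALSE_MATCH_RATE = 0.001  # identical for both groups: no technical bias
WATCHLIST_SIZE = 1_000    # entries each probe image is compared against

# Deployment choice, not an algorithm property: 90% of watchlist
# entries come from group B.
watchlist_share = {"A": 0.10, "B": 0.90}

population = {"A": 5_000, "B": 5_000}  # equally sized groups of probe subjects

def expected_false_matches(group: str) -> float:
    """Expected false matches one member of `group` generates against
    watchlist entries from their own group (cross-group false matches
    are ignored here to keep the sketch minimal)."""
    return FALSE_MATCH_RATE * WATCHLIST_SIZE * watchlist_share[group]

for group, count in population.items():
    total = expected_false_matches(group) * count
    print(f"Group {group}: ~{total:.0f} expected false-match encounters "
          f"across {count} people")

# Group B absorbs roughly 9x the false-match burden of group A, even though
# the matcher's error rate is identical for both: prejudicial impact created
# entirely by the deployment decision, not the algorithm.
```

Under these made-up numbers, the disparity comes entirely from how the system is used, which is the point of Blackburn’s warning that policy analyses treating technical and prejudicial bias as equivalent will misdiagnose where the problem lies.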
Lastly, MITRE recommends OSTP ensure that any policy decisions around biometrics are focused and nuanced, given the many biometric modalities that exist: fingerprints, face recognition, iris recognition and some aspects of DNA.
“You can’t really come up with a singular policy that’s going to be proper for all three or four of those modalities,” Blackburn said.
Using biometrics to unlock a phone is “significantly different” from law enforcement using them to identify a criminal, and decisions will need to be made about what data sharing is allowable under the AI Bill of Rights, he added.
An OSTP task force released a report on scientific integrity in early January reinforcing the need for technical accuracy when making policy decisions. Challenges aside, Blackburn said he remains optimistic OSTP is up to the task of crafting an AI Bill of Rights.
“How can we set up the policy so that it’s accurate from a technical, scientific-integrity perspective, while also meeting the objectives of the public that they represent?” Blackburn said. “It’s not easy, it takes a lot of time and effort, but OSTP and the federal agencies working on these issues have a lot of experience doing that.”