By Michael Fisch
LAW BRIEFS
As a member of the Massachusetts Facial Recognition Commission, Suffolk Law Professor Maurice Dyson helps to evaluate thorny legal issues involving this fast-growing technology. But he also brought a harrowing family experience with facial recognition to the commission, which is recommending rules for how police should use the technology.
Professor Dyson’s personal experience has informed the commission’s development of a questionnaire for police precincts and other law enforcement officials regarding their use of facial recognition. One question asked whether officers were applying any evidentiary standard—for example, probable cause—before the technology’s findings were used.
Some commission members argued that police use the tool only for investigations, not for arrests or prosecutions, so its use does not require probable cause. Those members wanted the evidentiary standard question removed from the questionnaire.
Dyson sees the probable cause issue as much more complicated. Courts have only begun to grapple with the legal implications of a facial recognition match and the chain of events, including detention, that it can unleash, he says. “The new and unreliable technology sometimes leads to misidentified people being detained and threatened with the use of force.”
Dyson told fellow commission members about his own experience with facial misidentification. Dyson’s brother, driving with Dyson, their mother, and Dyson’s 7-year-old nephew, was suddenly stopped by police. The officers ordered everyone to get out and put their hands on the car.
A police officer pushed Dyson’s brother against the hood of the car and held a gun to the back of his head while Dyson and the rest of the family watched in horror. Police later said that Dyson’s brother had an erroneous facial recognition match with someone who robbed a store, and that their car was similar to the one the thieves drove.
Dyson described the experience, he says, to demonstrate that the technology is being used to detain people in ways that can easily escalate to a deadly encounter. “Given the significance of the harm, you want police officers and departments to think very carefully about using facial recognition data. The probable cause standard would help lead officers toward that kind of reflection,” he says.
“We do not know how or why the mistake was made with regard to facial recognition, and this is a big problem,” he says. “With little transparency into the investigative process with facial recognition, and given the degree of inaccuracy involved, when you have a serious allegation, the potential for a deadly encounter grows.”
No one should be subject to these kinds of detentions without probable cause, Dyson told the commissioners, who ultimately left the evidentiary standard question in place.
Image by Michael J. Clarke