Justice is Blind(ed): The Issue with Proprietary Algorithms in Criminal Investigations | MTTLR


After serving seven years in prison, Lydell Grant was released on bond in November 2019 as a result of exonerating DNA evidence. Grant was convicted of murder in 2012 primarily on the basis of eyewitness testimony, even though Houston police could not conclude that the DNA mixture found on the victim belonged to Grant. It was not until 2018 that law students at the Texas A&M University School of Law, in partnership with the Innocence Project of Texas, took up Grant’s case and shared the DNA with Cybergenetics, a Pennsylvania-based company that analyzes DNA evidence from crime scenes. Rather than having a human compare the crime scene DNA mixture with Grant’s DNA, Cybergenetics used its proprietary software TrueAllele and concluded that Grant’s DNA did not match. Cybergenetics then went further and ran a search using the FBI’s Combined DNA Index System (CODIS), which allowed authorities to find the suspect who later confessed to the murder for which Grant was convicted.

While the use of TrueAllele has led to exonerations of innocent defendants like Grant, the use of “black box” algorithms, whose source code and underlying methods remain the intellectual property of the companies that create them, has raised issues of accuracy and fairness. Even though this technology appears able to analyze DNA mixtures with greater precision than human lab technicians, the lack of transparency about how the software actually conducts its analysis makes it nearly impossible to verify that conclusion.

The Electronic Frontier Foundation, a California-based nonprofit that works to protect online privacy and speech against corporate and state abuse, challenged the criminal convictions of Billy Ray Johnson, Jr. on the grounds that the defendant was not provided the source code of the DNA software that led to his conviction, and thus could not present an adequate defense. The California Court of Appeal rejected the challenge, however, holding that the evidence of guilt, exclusive of the DNA software analysis, was sufficient to warrant the convictions. Despite this conclusion, the court in dicta explained that even if the software’s source code is not available for examination, “[i]ts results have been subjected to peer review analysis and its program validated in certain controlled studies.”

The use of algorithms in criminal cases is powerful but dangerous if left unchecked. Algorithms themselves are not the source of bias; the humans who create them are. When these engineers lack the training to recognize bias and are trapped in an “insular bubble” that perpetuates their own prejudices, the algorithms that decide who is a criminal are no fairer than the biased officials who pursue the convictions. An accurate algorithm does not necessarily guarantee due process.

This is not to say that advances in technology should not be applied to solve crimes whose investigations have otherwise stalled at the limits of human analysis. Rather, we must decide which value is more important when two valid rights are in tension. While source code and algorithms should receive the protection that intellectual property deserves, protection of property should not come at the expense of liberty. When people are incarcerated as a result of supposedly more precise technology, the deprivation of their freedom cannot be equated with the profit companies make from their proprietary software.


* Carlino Mark Natividad is an Executive Editor on the Michigan Technology Law Review.
