
Innocence Project Says DOJ is Turning Dangerously Away From Ensuring that Forensic Testimony is Guided by Science Not Law Enforcement & Prosecutors

Written by Celeste Fremon

Fifty-five-year-old Jimmy Genrich of Grand Junction, Colorado, has been in prison for nearly 25 years for a series of bombings he has long said he did not commit. His conviction for the bombings that terrified residents of Grand Junction was based primarily on something called explosives toolmark analysis, a pattern-matching process akin to the controversial art of bite mark analysis; that analysis provided the only physical evidence connecting Genrich to the crimes.

In a deeply researched longread for The Nation, Meehan Crist and Tim Requarth wrote about Genrich’s case, which has been taken up by the Innocence Project. In the course of their research, the reporters examined thousands of pages of trial transcripts and interviewed dozens of prosecutors, defense attorneys, and scientists, which led them to conclude that there was “a startling lack of scientific support for forensic pattern-matching techniques such as toolmark analysis,” that our legal system “has failed to separate nonsense from science even in capital cases,” and that there is a consensus among prosecutors “all the way up to the attorney general’s office” that “scientifically dubious forensic techniques” should be not only protected, but expanded.

(We wrote about Crist and Requarth’s excellent investigative work in the Feb. 6, 2018 issue of our weekly newsletter, the California Justice Report.)

The Nation’s conclusions were supported by a landmark report on forensic science released nearly a decade ago in February 2009 by the National Academy of Sciences (NAS), which concluded that, save for DNA, the rest of the most common forensic methods, including the analyses of ballistics, tool marks, shoe prints, bite marks, and more, had simply not been scientifically validated.

“Among existing forensic methods,” the report’s authors wrote, “only nuclear DNA analysis has been rigorously shown to have the capacity to consistently, and with a high degree of certainty, demonstrate a connection between an evidentiary sample and a specific individual or source.”

Six years later, in 2015, the gravity of the issue became alarmingly clear when the U.S. Justice Department and the FBI made the staggering admission that, for nearly two decades, nearly all the examiners in an elite FBI forensic unit “gave flawed testimony in almost every trial where those examiners appeared to present evidence against criminal defendants.”

In a 2015 Washington Post story on the announcement, Spencer S. Hsu reported that, of the 28 examiners in the FBI Laboratory’s microscopic hair comparison unit, “26 overstated forensic matches in ways that favored prosecutors in more than 95 percent of the 268 trials reviewed,” according to the National Association of Criminal Defense Lawyers and the Innocence Project, both of which assisted the government with “the country’s largest post-conviction review of questioned forensic evidence.”

The cases uncovered in the massive review included those of “32 defendants sentenced to death,” 14 of whom “had already been executed or died in prison.” And that was after the review of only the first 200 convictions.

In September 2015, President Obama responded to the 2009 report and the string of 2015 revelations by asking the President’s Council of Advisors on Science and Technology, or PCAST—a body that has been rechartered by every president since its lineage began in 1933 with President Franklin D. Roosevelt’s Science Advisory Board—to examine the state of forensic science and how scientific rigor might be better brought to bear on its practice. PCAST returned in 2016 with a 174-page report that, once again, pointed to the unreliability of techniques that link a person or an item to a crime scene through bite marks, shoe or tire treads, and the like, concluding that they fell “far short” of scientific standards and lacked “meaningful evidence” of their accuracy.

The PCAST report was especially critical of the use of bite marks as evidence, as Jordan Smith of the Intercept reported.

“PCAST finds that bitemark analysis does not meet the scientific standards for foundational validity, and is far from meeting such standards,” the report reads. “To the contrary, available scientific evidence strongly suggests that examiners cannot consistently agree on whether an injury is a human bitemark and cannot identify the source of [a] bitemark with reasonable accuracy.”

Less than a year later, however, the movement toward shining a brighter scientific light on the world of forensics skidded nearly to a halt when, a few months after his confirmation, Attorney General Jeff Sessions slammed the door on the DOJ’s partnership with independent scientists appointed by President Obama to raise forensic science standards.

Sessions also suspended an expanded review of FBI testimony concerning evidentiary techniques that researchers say remain unverified. Sessions said that a new DOJ strategy regarding forensics would be set by advisors who were strictly in-house.

On Wednesday of this week, that new strategy was partially unveiled by Deputy Attorney General Rod Rosenstein, who spoke at the annual meeting of the American Academy of Forensic Sciences. There he outlined the DOJ’s plans regarding forensic science, and his remarks did not seem to reassure anyone save for some of the nation’s prosecutors.

According to a statement released Thursday by the Innocence Project, Rosenstein’s remarks only served to “renew concerns that the DOJ is backtracking on progress to ensure that forensic disciplines are guided by the best science and that safeguards were enacted to insulate practitioners from law enforcement influence.”

Chris Fabricant, director of strategic litigation at the Innocence Project, went further.

“We’ve known since 2009 that there are problems with the scientific validity of forensic disciplines used to identify suspects with the exception of DNA evidence,” wrote Fabricant in reference to the 2009 NAS report.

“Yet after this administration shut down the National Commission of Forensic Science, the first inclusive and transparent effort to address these fundamental flaws in evidence that is used in countless prosecutions across the nation,” continued Fabricant, “there was no mention by Deputy Attorney General Rosenstein of how the Department of Justice plans to address this core validity problem.”

Given that the National Registry of Exonerations has estimated that faulty forensics was a factor in 24 percent of wrongful convictions documented from 1989 to the present, a failure to address the “core validity problem” of forensics in the courtroom is not—to drastically understate the matter—a good thing.


Postscript

Scott Henson, founder of the excellent Texas-based justice-reform site Grits for Breakfast, did an interview with Innocence Project co-founder Peter Neufeld on December 27, in which Neufeld talked about forensic-science issues, including the abolition of the national forensic science commission. It’s a smart, knowledgeable, and thoroughly hair-raising conversation. So if you’re interested in this crucial (and deeply alarming) topic, do yourself a favor and listen.
