
On February 4, the National Registry of Exonerations published its yearly report for 2013.  The Registry, a joint project of the University of Michigan Law School and the Center for Wrongful Convictions at Northwestern University Law School, collects information on exonerations that have occurred since 1989.  The headlines on the new report (from the New York Times to NBC News to the Huffington Post) nearly all focused on one fact: with 87 exonerations, 2013 had the highest yearly total since 1989.

This is a significant fact.  But two other things in the report got less notice and deserve more.

First, for most people, “exoneration” is synonymous with “DNA exoneration.”  This is how the world looks, whether on television (think of CSI and its many clones) or in any news source.  But this view doesn’t reflect the real world.  As the report points out, only about 21 percent of the exonerations in 2013 involved DNA (p. 6).  Despite the impression one gets from the media, this has always been the case; of all of the exonerations since 1989, 72 percent were not based on DNA.  And the gap seems to be widening: in both 2012 and 2013, non-DNA exonerations increased significantly, while DNA exonerations decreased (p. 12).

The other fact that many in the media did not notice: for the last two years, the percentage of exoneration cases resolved with the cooperation of the police or prosecutors has risen dramatically.  In 2012, almost half of all the cases featured cooperation of the police or prosecutors in re-examining cases, leading to exoneration; the average in all the years before (1989-2011) did not reach 30 percent.  The trend continued this year, with police or prosecutors cooperating in almost 40 percent of all exonerations.  (A few media organizations, such as Fox News, NPR, and the Christian Science Monitor, featured this fact in their headlines and/or stories.)

This is a very welcome and important development.  While some exonerations have always come about with law enforcement cooperation, this was not the trend.  Despite assurances from Scott Burns, executive director of the National District Attorneys Association, that “we always did that, we just didn’t” have a name for the process (see his quote here), the data from the last two years do show a greater willingness to re-examine old cases than in years past.  According to Samuel Gross of the University of Michigan Law School, who edits the Registry, “the sharp, cold shower that DNA gave to the criminal justice system has made us realize that we have to re-examine” closed cases, whether with DNA or not.  That idea appears to be sinking in on a much wider basis.  And that is all to the good.


Following up on my last post, in which I asked why there were still no national standards for forensic science five years after the National Academy of Sciences’ 2009 report Strengthening Forensic Science in the United States, even as scandal after scandal unfolds in U.S. crime labs all over the country, there may be light on the horizon.  On January 10, the U.S. Department of Justice (DOJ) and the National Institute of Standards and Technology (NIST) announced the formation of the National Commission on Forensic Science.

According to the announcement issued by DOJ and NIST:

Members of the commission will work to improve the practice of forensic science by developing guidance concerning the intersections between forensic science and the criminal justice system. The commission also will work to develop policy recommendations for the U.S. Attorney General, including uniform codes for professional responsibility and requirements for formal training and certification.

John P. Holdren, Assistant to the President for Science and Technology and Director of the White House Office of Science and Technology Policy, said that the Commission “will help ensure that the forensic sciences are supported by the most rigorous standards available—a foundational requirement in a nation built on the credo of ‘justice for all.’ ”

The formation of the Commission could be a significant milestone in the march toward the use of real science and defensible national standards in forensic labs.  But it may be limited in what it can achieve by its very creation and structure: it is not a body created by Congress with power to devise and implement standards or to regulate anything.  Rather, it is a federal advisory committee, formed under the Federal Advisory Committee Act of 1972.  (A quick primer on the Act is here.)  It investigates and debates designated topics, and then reports its recommendations to the relevant federal department(s) that formed it (in this case, the DOJ and NIST).  Those agencies could choose to embrace and follow, or to reject, some, all, or none of the Commission’s suggestions.

Still, this is a hopeful sign that we might be heading in the right direction.  At the very least, we will see a national conversation among the very large number of Commission members, who come from a variety of backgrounds in government, science, the legal system, and elsewhere.  See the list of more than thirty Commission members at the bottom of this announcement.

I hope readers will weigh in on the following question: realistically, what will come from the Commission?  Will the government adopt its recommendations?  Will the recommendations include national standards to regulate forensic testing, assure quality control, and the like?  In the end, will the work you foresee coming from the Commission improve the U.S.’s largely unregulated system?

For those in the Chicago area, I’ll be speaking about my book “Failed Evidence: Why Law Enforcement Resists Science” on Wednesday, October 16, at 5 p.m. at Gage Gallery, 18 S. Michigan Avenue.  The event is free and open to the public.  The event is sponsored by Roosevelt University’s Joseph Loundy Human Rights Project, and is part of their annual speaker series.  The link to the event is here.

The next public event for “Failed Evidence” will be in Pittsburgh on November 6, at noon at the Harvard Yale Princeton Club, 619 William Penn Place.  The talk will be sponsored by the Allegheny County Bar Association and the Pitt Law Alumni Association.  More details to follow.

On August 12, a U.S. federal judge found that the New York Police Department  had systematically violated the U.S. Constitution in the way it performed stops and frisks.  Judge Shira Scheindlin’s 198-page opinion in the case, Floyd v. New York, is here.

I’ve discussed the case in earlier posts (here and here); now that the opinion is out, it deserves a careful explanation.  Today, I’m going to talk about what the judge actually said in her opinion; there’s already a lot of misinformation out there.  Tomorrow, I’ll discuss the remedies the judge has required: the actions the NYPD will have to take to bring itself into compliance with the law.

First, the decision does not “outlaw” stop and frisk; it does not stop the NYPD from using this long-established tactic.  The judge said, correctly, that stop and frisk is a legal and constitutional tactic that police may use; the U.S. Supreme Court said so in 1968, in Terry v. Ohio.  But, the judge said, the police must obey some basic constitutional rules when they do so.   That, she said, was the problem: the NYPD was using stop and frisk unconstitutionally, in violation of the Fourth Amendment right against unreasonable searches and seizures, by stopping people without the required reasonable suspicion: a very small amount of fact-based evidence pertaining to the individual person, but something the police lacked in many tens of thousands of these encounters.  In that respect, the judge ruled, the NYPD “has a policy or custom of violating the Constitution by making unlawful stops and conducting unlawful frisks” (p. 3).

Second, the judge made clear that she was not ruling on how effective stop and frisk may or may not have been in fighting crime, either standing alone or in comparison with other police tactics.  (See, e.g., p. 2.)  Rather, she said, she was ruling on whether the way stop and frisk had been used squared with the Constitution.

Third, the judge found that the way that stop and frisk was practiced by the NYPD was racially skewed against Black and Latino New Yorkers, in violation of the Fourteenth Amendment’s Equal Protection Clause.  This finding has grabbed the most headlines, because the judge called it “a form of racial profiling.”  This led to outraged reaction by NYPD Commissioner Ray Kelly (transcript here), among others.  But the judge’s reasoning on this point, found in summary form on pp. 10 and 11 of the opinion, is not actually the stuff of controversy or bombshells; it comes directly from the testimony she heard:

…[T]he evidence at trial revealed that the NYPD has an unwritten policy of targeting “the right people” for stops. In practice, the policy encourages the targeting of young black and Hispanic men based on their prevalence in local crime complaints…While a person’s race may be important if it fits the description of a particular crime suspect, it is impermissible to subject all members of a racially defined group to heightened police enforcement because some members of that group are criminals. The Equal Protection Clause does not permit race-based suspicion.

In conclusion, the judge said, the NYPD violated the Fourth and Fourteenth Amendments to the Constitution not by doing stops and frisks, but by doing them in an unconstitutional manner over a period of years, despite being on notice that there were constitutional problems, and these practices “were sufficiently widespread as to have the force of law.”

That’s the legal and constitutional basis for the judge’s decision that the time has come to reform the use of stop and frisk in New York.  In my next post, I will discuss how the judge will require that reform to take place.


The U.S. Department of Justice (DOJ), in partnership with the Innocence Project and the National Association of Criminal Defense Lawyers (NACDL), will review 2,000 cases in which microscopic hair analysis of crime scene evidence was conducted by the FBI Laboratory.  The review, prompted by the DNA-based exoneration of several men convicted on the basis of hair microscopy, will focus on “specific cases in which FBI Laboratory reports and testimony included statements that were scientifically invalid.”  The Innocence Project’s announcement of the review is here; a representative news article is here.

In a move that shows just how seriously the DOJ is taking this review, it has done something unheard of:

Because of the importance attached to these cases, the DOJ has agreed, for the first time in its history, not to raise procedural objections, such as statute of limitations and procedural default claims, in response to the petitions of criminal defendants seeking to have their convictions overturned because of faulty FBI microscopic hair comparison laboratory reports and/or testimony.

Translation: DOJ is not going to fight this review in any of these cases; they’re going to be part of it.

It’s hard to describe the magnitude of the shift in outlook this represents.  Usually, as readers of Failed Evidence know, law enforcement (and I include DOJ in that phrase) resists science-based review and testing; that’s the thrust of the book.  I am happy to say that this is refreshingly different.  According to Peter Neufeld, Co-Director of the Innocence Project, “[t]he government’s willingness to admit error and accept its duty to correct those errors in an extraordinarily large number of cases is truly unprecedented.  It signals a new era in this country that values science and recognizes that truth and justice should triumph over procedural obstacles.”

Of course, this review will not affect cases in which hair analysis was handled by state crime labs.  But here’s hoping the states will take this as an example, as the Grits for Breakfast blog argues ought to be done in Texas.

For a sense of the damage that sloppy hair analysis and testimony about it has done in prior cases, listen to this NPR story and interview about the case of Dennis Fritz, in Ada, Oklahoma.  John Grisham’s nonfiction book “The Innocent Man” is an excellent read about the case.

Maybe this is the beginning of a trend.  Hats off to DOJ, the Innocence Project, and NACDL.

When a former high-ranking Justice Department official speaks of a “revolution” in criminal justice, with the whole field turning toward science, could it mean less failed evidence in the future?  What does it mean for those concerned with faulty forensic science?

Laurie Robinson served as Assistant Attorney General in both the Clinton and Obama Justice Departments, where she oversaw the Office of Justice Programs (OJP), the research, statistics, and criminal justice assistance arm of the Department.  That made her remarks to the Delaware Center for Justice the other day worth noticing.  According to the Wilmington News Journal, “[w]e’re seeing something akin to a revolution in criminal justice in this country,” said Robinson, now on the faculty of George Mason University.  “We’re at an important crossroads, one where ideology has taken a back seat, and science and pragmatism have come to the fore.”

Robinson’s web page at George Mason says her tenure at OJP  “was marked by a focus on science and evidence-based programming.”  She was in Delaware to discuss the state’s re-entry programs  and other initiatives to reduce recidivism, and few would disagree that those important programs need scientific and statistical support.   But I wonder whether Robinson would be as optimistic about science’s role in forensic methods, which have played a role in about half of all wrongful convictions across the U.S.  Surely, forensic science needs “science and evidence-based” support and examination — badly.

It has now been more than four years since the release of the landmark 2009 National Academy of Sciences report Strengthening Forensic Science in the United States: A Path Forward, which found that, except for DNA and chemical analysis, most of what we think of as forensic science isn’t science at all.  In that time, little seems to have changed; scandals in crime labs continue to pile up in jurisdictions across the country (see my posts about lab scandals just this past year in Massachusetts and Minnesota, for example).  In a particularly compelling piece of writing in the Huffington Post, Radley Balko discusses the long-running crime lab scandal in Mississippi, and puts it in context: Mississippi’s scandal “is just the latest in a long, sad line of such stories” that Balko has already chronicled.

What’s to be done?  As a start, I argued in chapter 7 of Failed Evidence that all federal money that goes to law enforcement should carry with it a requirement of compliance with best practices in police investigation and forensic science.  Not all law enforcement agencies run their own forensic labs (and as the NAS report said in Chapter 6, “Improving Methods, Practice, and Performance in Forensic Science,” labs should be independent of law enforcement).  But for those that do, compliance with standards that would avoid systematic error, human bias, fraud, and improper scientific testing and testimony should be mandatory.

That would start a revolution right there.  Because what law enforcement agency could afford to just turn down federal funding, in budgetary times like these?


Of all of the methods studied in the 2009 National Academy of Sciences’ report on forensic science, forensic odontology — the analysis of bite marks to identify the perpetrators of crimes — has come under some of the harshest criticism.  And now an effort is underway to keep it out of courtrooms for good.

This article by the Associated Press examined “decades of court records, archives, news reports and filings by the Innocence Project in order to compile the most comprehensive count to date of those exonerated after being convicted or charged based on bite mark evidence.”   Typically, the evidence consisted of testimony by forensic dentists identifying marks on the bodies of victims (usually in cases of murder or sexual assault) as coming from the defendant’s teeth.  According to the AP, since 2000, DNA identification has overturned one after another of these cases, throwing the entire “discipline” into question, and rendering it nearly obsolete.

Critics make two main claims against forensic odontology.  First, there is no scientific proof that a bite mark can be matched to any particular set of teeth.  There are no data, no experimental evidence — nothing — on which to base the idea that forensic odontology can make reliable identifications.  Second, human skin — almost always the site of the bite mark in question — does not reliably “record” bite marks: the skin changes shape, consistency, color, and even size after the mark is made, even after death, making bite marks, and the method itself, inherently unstable.

Yet the dentists, who belong to the American Board of Forensic Odontology, insist that their methods of identifying the source of bite marks are reliable; it’s just that some forensic dentists are, well, not very good, or are biased.  “The problem lies in the analyst or the bias,” said Dr. Frank Wright, a forensic dentist in Cincinnati. “So if the analyst is … not properly trained or introduces bias into their exam, sure, it’s going to be polluted, just like any other scientific investigation. It doesn’t mean bite mark evidence is bad.”  It’s the familiar refrain: it’s just a few bad apples, not the whole barrel.  But according to the AP:

Only about 100 forensic dentists are certified by the odontology board, and just a fraction are actively analyzing and comparing bite marks. Certification requires no proficiency tests. The board requires a dentist to have been the lead investigator and to have testified in one current bite mark case and to analyze six past cases on file — a system criticized by defense attorneys because it requires testimony before certification…The consequences for being wrong are almost nonexistent. Many lawsuits against forensic dentists employed by counties and medical examiner’s offices have been thrown out because as government officials, they’re largely immune from liability.  Only one member of the American Board of Forensic Odontology has ever been suspended, none has ever been decertified, and some dentists still on the board have been involved in some of the most high-profile and egregious exonerations on record.

It’s worth noting that there’s nothing wrong with other uses of forensic dentistry, such as identifying human remains from teeth by matching them to existing dental records; that type of forensic work is generally rock solid and remains unchallenged by critics.  But on identification of perpetrators through bite mark analysis, the question is different: when a forensic “science” has as dismal a record as forensic odontology does, and no scientific proof of its validity exists, can anything justify allowing the use of bite mark evidence to convict a person of a crime?

The National Registry of Exonerations, a joint project of the law schools at University of Michigan and Northwestern, reported last week that in 2012, law enforcement cooperated in some way in a higher percentage of cases than in the past.  Does this mean less resistance of science by law enforcement?

The answer is that we can’t tell from this data.  But the report is worth looking at nonetheless.

Here is what the report says about law enforcement cooperation (I have removed the bullets, spacing, etc.):

In  2012 there was a dramatic increase in the number and the proportion of exonerations that prosecutors or police participated in obtaining.  Of the 63 exonerations in 2012, prosecutors or police initiated or cooperated in 34, or 54%. Over the past 24 years, prosecutors and police have cooperated in 30% of the exonerations we know about (317/1050). Last year for the first time they cooperated in a majority of exonerations, and the number of such cases is a large increase from the previous high (22 of 57 in 2008, or 39%).
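The Registry’s percentages can be checked directly from the counts quoted above; here is a minimal sketch in Python (the figures come from the excerpt itself, not from the Registry’s underlying data):

```python
# Counts quoted from the National Registry of Exonerations report.
figures = [
    ("2012",      34, 63),    # exonerations with police/prosecutor cooperation in 2012
    ("1989-2012", 317, 1050), # cooperation across all known exonerations, 1989-2012
    ("2008",      22, 57),    # the previous high-water mark
]

for label, cooperated, total in figures:
    print(f"{label}: {cooperated}/{total} = {cooperated / total:.0%}")
# → 2012: 34/63 = 54%
# → 1989-2012: 317/1050 = 30%
# → 2008: 22/57 = 39%
```

All three fractions reproduce the report’s stated percentages (54%, 30%, and 39%).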

This is all to the good.  But there are some aspects of the findings that counsel caution.  First, the author(s) of the report freely admit they don’t know why  this is happening.  It could have many causes.

This increase [in cooperation] may be due to a confluence of related factors: changes in state laws that facilitate post-conviction DNA testing, the emergence of Conviction Integrity Units in several large prosecutorial offices, and, perhaps, a change in how law enforcement officers view the possibility of false convictions at trial.

And just to be clear, it’s good that they admit that it’s not clear what the cause or causes could be.  Too often, those working with statistics take an opposite tack.

As far as resistance to science goes, however, the report may indicate that the cases where science and forensics matter most still do not get cooperation from law enforcement.  The first clue is that 57% of the exonerations in 2012 were homicide cases, and another 24% were sexual assaults — 81% in all.  These are the types of cases in which resistance to science on eyewitness identification, interrogation, and forensics can matter the most.  These cases also have the highest public profile.  And there, perhaps, is the rub: “Official cooperation is least common among exonerations for highly aggravated and publicized crimes – murders with death sentences and mass child sex abuse prosecutions – and most common among exonerations for robberies and drug crimes.”

The best way to answer whether this increased cooperation represents any lessening of the resistance to science would be to look at the individual exoneration cases for 2012: do they feature law enforcement cooperation on these science-based issues, or is it something else — for example, information on witness interviews illegally withheld from the defendant in a previous trial?

Perhaps we will see that in the next report from the Registry.

When I’m in Cincinnati for talks on Failed Evidence tonight, April 4, at 7:00 pm at the Clifton Cultural Arts Center and tomorrow, April 5, at noon at the University of Cincinnati College of Law, one topic sure to come up is an article from the April 3 New York Times, “Advances in Science of Fire Free a Convict After 42 Years.”  Louis Taylor was serving 28 life sentences for the deaths caused by a fire in a Tucson, Arizona hotel in December of 1970.  Taylor, then just 16 years old, was convicted of arson based on faulty forensic science.

Mr. Taylor has been released from prison.  He is now 58 years old.

The story highlights the state of arson investigation, past and present.

A few years ago, the National Academy of Sciences turned its attention to the misuse of science in courtrooms, saying that pseudoscientific theories had been used to convict people of crimes they may not have committed. By then, a small group of fire engineers had already begun to discredit many of the assumptions employed in fire investigations, like the practice of using the amount of heat radiated by a fire to assess if an accelerant had been used.

Unlike DNA evidence, which can exonerate one person and sometimes incriminate another, the evidence collected in some arson investigations does not yield precise results. Often much of the evidence has been lost or destroyed. In the case of the hotel fire here, all that is left are photographs, reports and chemical analysis, all of them assembled to prove arson.

As a result, “we can’t definitely say what really caused the fire,” said John J. Lentini, a veteran fire investigator who wrote a report on Mr. Taylor’s case. “But what we can do is discredit the evidence” used to support the charge.

The case recalls the trial and execution of Cameron Todd Willingham in Texas, put to death in 2004 for the deaths of his children in a fire.  Experts have called the arson evidence in that case terribly flawed — just as in Mr. Taylor’s case.

The science surrounding fire investigation is light years ahead of where it was even a decade ago.  It’s time that all of the old cases in which verdicts depended on outmoded and discredited methods of arson investigation be re-examined, and if necessary overturned.

On Thursday April 4, and Friday April 5, I’ll be in Cincinnati for two discussions of Failed Evidence: Why Law Enforcement Resists Science (2012).  Both are free and open to the public.

On April 4, I’ll be discussing the book at 7:00 p.m. at the Clifton Cultural Arts Center, 3711 Clifton Avenue, Cincinnati OH 43220.  The event is sponsored by the ACLU of Ohio.

On April 5, I’ll present a talk at the University of Cincinnati College of Law at noon.  The address is 2540 Clifton Ave, Cincinnati, OH 45221.  The event is in Room 114, and is sponsored by the Lois and Richard Rosenthal Institute for Justice/Ohio Innocence Project.  It has been approved for CLE credit for attorneys.