Posts Tagged ‘Wrongful convictions’

Those of you who follow this blog have read (e.g., here and here) about conviction integrity units (CIUs): small groups of attorneys in a district attorney’s office who have the mission of investigating claims of wrongful conviction in past cases tried by that same office.  These units, just like homicide units, major crimes units, or others in the DA’s office, are dedicated to one type of work: conviction integrity.

The first conviction integrity unit in the country was established by Dallas DA Craig Watkins, in order to have a regular way to investigate the claims of wrongful convictions that his office already faced, and others that might arise in the future.

I support CIUs.  They assure that the DA’s office has a built-in way to address any substantive claim of wrongful conviction.  They can work in partnership with local innocence projects, which can serve as screeners for claims of innocence, in order to point CIUs to cases with real, tenable claims.  (This was the arrangement between the Dallas DA’s CIU and the Texas Innocence Project when I researched and wrote my book Failed Evidence.)  CIUs are far from a perfect solution; they are, after all, part of the DA’s office that may have made the alleged mistakes being investigated, and so they lack independence.  But without a better alternative — for example, a state-created agency like North Carolina’s Innocence Inquiry Commission — CIUs can do the job, and can be created immediately, on the orders of the DA.

CIUs have begun to spread to DA’s offices across the country.  And with that visibility comes some serious thinking about how best to assure conviction integrity.  A conference will take place this Friday, April 4, and Saturday, April 5, called “A Systems Approach to Conviction Integrity,” sponsored by the Quattrone Center at the University of Pennsylvania Law School in Philadelphia.  The event is free, and will be live streamed on the internet for those (like me) who cannot get to Philadelphia to attend.

Take a look at the description of the conference and the agenda.  It’s designed to help people involved in the criminal justice system learn to use quality control systems from experts in laboratory science, aviation and aeronautics, medicine, public health, transportation and other fields who have created mechanisms and institutional cultures designed to reduce and ferret out errors.  Here’s a sample of the conference statement, just to give everyone a sense:

The problem of quality control pervades many of the systems in our society.  Panelists, each expert in quality control and systems error reduction in a complex, high-risk field, will explore efforts to address quality control in a range of other important areas, such as healthcare, aviation, laboratories, etc., and how maintaining quality in the criminal justice system may be similar to and different from quality control in these other areas.

If you are interested in the problem of reducing wrongful convictions going forward — that is, not just correcting the errors of the past, but avoiding them in the future — I urge you to attend or watch via the web stream.  The conference will be a milestone along the road to a better, more accurate criminal justice system, with a ton of information we can all use.

 

On February 4, the National Registry of Exonerations published its yearly report for 2013.  The Registry, a joint project of the University of Michigan Law School and the Center for Wrongful Convictions at Northwestern University Law School, collects information on exonerations that have occurred since 1989.  The headlines on the new report (from the New York Times to NBC News to the Huffington Post) nearly all focused on one fact: 2013 saw 87 exonerations, the highest yearly total since 1989.

This is a significant fact.  But two other things in the report got less notice and deserve more.

First, for most people, “exoneration” is synonymous with “DNA exoneration.”  This is how the world looks, whether on television (think of CSI and its many clones) or in any news source.  But this view doesn’t reflect the real world.  As the report points out, only about 21 percent of the exonerations in 2013 involved DNA (p. 6).  Despite the impression one gets from the media, this has always been the case; of all of the exonerations since 1989, 72 percent were not based on DNA.  And that difference seems to be increasing.  In both 2012 and 2013, non-DNA exonerations increased significantly, while DNA exonerations decreased (p. 12).

The other fact that many in the media did not notice: for the last two years, the percentage of all exoneration cases resolved with the cooperation of the police or prosecutors has risen dramatically.  In 2012, almost half of all the cases featured cooperation of the police or prosecutors in re-examining cases, leading to exoneration; the average percentage in all the years before (1989-2011) did not reach 30 percent.  The trend continued this year, with police or prosecutors cooperating in almost 40 percent of all exonerations.  (A few media organizations, such as Fox News, NPR, and the Christian Science Monitor, featured this fact in their headlines and/or stories.)

This is a very welcome and important development.  While some exonerations have always come about with law enforcement cooperation, this was not the trend.  Despite assurances from Scott Burns, executive director of the National District Attorneys Association, that “we always did that, we just didn’t” have a name for the process (see his quote here), the data on the last two years do show a greater willingness to re-examine old cases than in years past.  According to Samuel Gross of the University of Michigan Law School, who edits the Registry, “the sharp, cold shower that DNA gave to the criminal justice system has made us realize that we have to re-examine” closed cases, whether with DNA or not.  That idea appears to be sinking in on a much wider basis.  And that is all to the good.

 

The International Association of Chiefs of Police (IACP) is one of the leading organizations for law enforcement professionals in the U.S. and around the world.  I regularly turn to their model policy and training documents when working on those issues for police agencies.  So it’s a big deal to see their new report, prepared in conjunction with their partner, the U.S. Department of Justice Office of Justice Programs, announcing a new effort in which the IACP will play a leading role in fixing the problems in police investigation that cause wrongful convictions.

The report, titled “National Summit on Wrongful Convictions: Building a Systemic Approach to Prevent Wrongful Convictions,” takes a full view of the issues that must be addressed to avoid convicting the wrong people, and announces a series of recommendations designed to bring the goal within reach.  It is based on work at a summit of people from IACP, DOJ, and a host of experts.  In a preliminary statement in the report, the President of the IACP and the Assistant Attorney General for the Office of Justice Programs outlined how the report came to be and what it does.

This event gathered 75 subject matter experts from all key disciplines to address and examine the causes of and solutions to wrongful convictions across the entire spectrum of the justice system. Summit participants worked diligently during this one-day intensive event to craft 30 focused policy recommendations that guide the way to our collective mission to continually improve the criminal justice system. The summit focused on four critical areas: (1) making rightful arrests, (2) correcting wrongful arrests, (3) leveraging technology and forensic science, and (4) re-examining closed cases. The 30 resulting recommendations directly address these areas and lay a critical foundation for required changes in investigative protocols, policies, training, supervision, and assessment.

The report makes thirty recommendations on a number of topics: eyewitness identifications, false confessions, preventing investigative bias, improving DNA testing procedures, CODIS, correcting wrongful arrests, leveraging technology and forensic science, and re-examining closed cases with an openness to new information.

The report is absolutely essential reading for anyone interested in wrongful convictions and what can be done to correct them.  Readers of my book Failed Evidence will also recognize that the emergence of this consensus at the top of the law enforcement profession is exactly what I have called for: “Police and Prosecutors Must Lead the Effort” (pp. 158-159).

The use of DNA identification as a forensic tool, beginning in 1989, changed the way that we think about guilt, innocence, and traditional police investigation.  It isn’t just the 311 wrongful convictions that DNA identification has confirmed; it’s the far more numerous cases in which DNA has determined guilt — sometimes in cases years or decades old.

Now DNA identification is about to change: it will become even more powerful than it is now.  A new way of processing and interpreting DNA has arrived that will make our current DNA techniques look weak by comparison.

On Friday, Nov. 8, I attended an incredibly interesting talk by Dr. Ria David.  Dr. David is one of the co-founders of Cybergenetics, a company based in Pittsburgh.  Cybergenetics has perfected computer-based techniques and technologies that will change the way that DNA is analyzed.  With Cybergenetics’ TrueAllele® system, the analysis relies on the power of computers instead of interpretation done by humans.  The talk was sponsored by the Center for Women’s Entrepreneurship (CWE) at Chatham University.  (Disclosure: my wife runs CWE; my wife and I know Dr. David and her Cybergenetics co-founder, Dr. Mark Perlin, but neither my wife nor I have any personal or financial ties of any kind to Cybergenetics.)

Most of us know that a DNA sample allows forensic scientists to say things like “the odds that this sample came from anyone other than the defendant are fifty million to one.”  Pretty powerful stuff — until you learn that Cybergenetics’ systems will allow prosecutors to offer juries odds of not tens of millions to one, but trillions or even quadrillions to one.  In addition, Cybergenetics will allow analysts to pull apart mixtures of DNA from different people, which is common at crime scenes, and which current DNA technology often can’t handle.  Readers of my book Failed Evidence: Why Law Enforcement Resists Science can get a little more information in Chapter 7, pp. 186-190; you can get the book here.
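To give a rough sense of where match odds like “fifty million to one” come from, here is a minimal sketch of the classic “product rule” calculation used for single-source DNA profiles: multiply the genotype frequency at each tested locus, assuming the loci are independent and in Hardy-Weinberg equilibrium.  The allele frequencies and function names below are invented for illustration only; TrueAllele’s actual probabilistic genotyping, especially for mixed samples, is far more sophisticated than this.

```python
# Illustrative sketch only: a simplified random-match-probability calculation
# under the "product rule."  Assumes independent loci and Hardy-Weinberg
# equilibrium; the allele frequencies are hypothetical, not real population data.

def genotype_frequency(p, q):
    """Expected frequency of a genotype whose two alleles have frequencies p and q."""
    if p == q:
        return p * p          # homozygote: p^2
    return 2 * p * q          # heterozygote: 2pq

def match_odds(loci):
    """Multiply per-locus genotype frequencies and return the '1 in N' odds figure."""
    rmp = 1.0
    for p, q in loci:
        rmp *= genotype_frequency(p, q)
    return 1.0 / rmp

# Hypothetical 10-locus profile, each locus heterozygous with allele
# frequencies of 0.1 and 0.2 (genotype frequency 0.04 per locus).
profile = [(0.1, 0.2)] * 10
odds = match_odds(profile)
print(f"Odds of a coincidental match: about 1 in {odds:.3g}")
# With these made-up numbers, the odds come out in the tens of trillions;
# each additional locus multiplies the figure further, which is why more
# loci (and better mixture interpretation) yield such enormous numbers.
```

The point of the sketch is only the arithmetic: because the per-locus frequencies multiply, adding loci drives the combined odds from millions toward trillions and beyond, which is what makes more powerful interpretation methods so valuable.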

Cybergenetics’ DNA system has found ready acceptance in the United Kingdom, but the process has been slower in the U.S. There has been considerable resistance — something readers of Failed Evidence are quite familiar with — particularly at the FBI, which governs current DNA protocols and use.

There is much more to how Cybergenetics’ TrueAllele system works, and what it can do; I’d urge readers to take a good look at the Cybergenetics web site, which gives details on what they do, and the many criminal cases and mass disaster identification cases (including the identification of remains at the World Trade Center site).  Once law enforcement sees what this new method of using DNA can do, and once resistance to change is overcome, DNA will be able to identify many more guilty criminals, as well as exonerate many more of the wrongfully convicted, and it will do so with more certainty than we ever thought possible.

 

For those in the Chicago area, I’ll be speaking about my book “Failed Evidence: Why Law Enforcement Resists Science” on Wednesday, October 16, at 5 p.m. at Gage Gallery, 18 S. Michigan Avenue.  The event is free and open to the public.  The event is sponsored by Roosevelt University’s Joseph Loundy Human Rights Project, and is part of their annual speaker series.  The link to the event is here.

The next public event for “Failed Evidence” will be in Pittsburgh on November 6, at noon at the Harvard Yale Princeton Club, 619 William Penn Place.  The talk will be sponsored by the Allegheny County Bar Association and the Pitt Law Alumni Association.  More details to follow.

Of the 311 cases of wrongful convictions documented by The Innocence Project, about 25 percent include a false confession or false statement of guilt.  Yet false confessions remain the least understood type of justice system error.  Most people still ask, “why would anyone confess to a serious crime he did not commit, without physical abuse, a mental handicap or lack of sobriety?  I know I would never do that.”   Twenty years ago, I would have said the same thing.

Well, if you want to know how false confessions happen — how an innocent person could confess, even supplying details of how the crime was committed that only the perpetrator would know — and if you want to know how this could happen with a fine police detective operating according to the rules — you must listen to the latest episode of the radio show This American Life, called “Confessions.”  Here is the link to the show.  The story (one of several on the theme of confessions) runs approximately 28 minutes.

A very brief summary, without giving anything away: A Washington, D.C. detective investigating a murder participates in the interrogation of the main suspect.  The woman denies any involvement at first, but after seventeen hours of questioning, she finally admits to participating in the crime, and supplies many incriminating details.  After the suspect is charged but before her case goes to trial, follow-up investigation by police causes the case to fall apart, and a judge orders her released from jail after nine months.  The case is never solved.  Some years later, the same detective is assigned to a cold case unit, and he begins to look into the case again by watching a videotape of the interrogation.  What he sees reveals what went wrong, and it lays out an incredible lesson in exactly how false confessions come to be.  And we learn that the videotape of the full interrogation was actually made just by chance; in the usual course of things, there would have been no recording, and none of this would have been discovered.

For anyone interested in police interrogation, for anyone still asking how an innocent person could ever confess, I cannot recommend this program more highly.  And it’s yet another endorsement of the idea that we must record interrogations if we are ever to solve this problem.

The U.S. Department of Justice (DOJ), in partnership with the Innocence Project and the National Association of Criminal Defense Lawyers (NACDL), will review 2,000 cases in which microscopic hair analysis of crime scene evidence was conducted by the FBI Laboratory.  The review, prompted by the DNA-based exoneration of several men convicted on the basis of hair microscopy, will focus on “specific cases in which FBI Laboratory reports and testimony included statements that were scientifically invalid.”  The Innocence Project’s announcement of the review is here; a representative news article is here.

In a move that shows just how seriously the DOJ is taking this review, it has done something unheard of:

Because of the importance attached to these cases, the DOJ has agreed, for the first time in its history, not to raise procedural objections, such as statute of limitations and procedural default claims, in response to the petitions of criminal defendants seeking to have their convictions overturned because of faulty FBI microscopic hair comparison laboratory reports and/or testimony.

Translation: DOJ is not going to fight this review in any of these cases; they’re going to be part of it.

It’s hard to describe the magnitude of the shift in outlook this represents.  Usually, as readers of Failed Evidence know, law enforcement (and I include DOJ in that phrase) resists science-based review and testing; that’s the thrust of the book.  I am happy to say that this is refreshingly different.  According to Peter Neufeld, Co-Director of the Innocence Project, “[t]he government’s willingness to admit error and accept its duty to correct those errors in an extraordinarily large number of cases is truly unprecedented.  It signals a new era in this country that values science and recognizes that truth and justice should triumph over procedural obstacles.”

Of course, this review will not affect cases in which hair analysis was handled by state crime labs.  But here’s hoping they will take this as an example, as the Grits for Breakfast blog argues ought to be done in Texas.

For a sense of the damage that sloppy hair analysis and testimony about it has done in prior cases, listen to this NPR story and interview about the case of Dennis Fritz, in Ada, Oklahoma.  John Grisham’s nonfiction book “The Innocent Man” is an excellent read about the case.

Maybe this is the beginning of a trend.  Hats off to DOJ, the Innocence Project, and NACDL.

I’ve written before about Conviction Integrity Units (CIUs) in prosecutors’ offices (take a look here and here).  CIUs are groups of lawyers within prosecutors’ offices — just like a major crimes unit or a narcotics unit, though probably much smaller than either of these — with the job of investigating questionable past convictions from that same office.  CIUs do this when presented with evidence that raises real doubts about the guilt of a convicted defendant in one of the office’s past cases.

The first CIU was established by Dallas DA Craig Watkins, who had then just been elected against a backdrop of more than 20 exonerations of people wrongfully convicted under the past leadership of his agency.  (The latest story about Watkins and the exoneration of wrongfully convicted people in Dallas — examining a fascinating twist on exonerations — is here.)  Just a couple of years later, Patricia Lykos, then the newly elected DA in Houston, established a CIU in her office.  From Texas, the idea has begun to spread.

Both the Dallas and Houston CIUs have one thing in common: they were launched by new DAs to investigate cases that originated under former administrations.  No doubt this is easier than investigating mistakes that have happened under one’s own watch.

That’s what makes this story out of the Brooklyn DA’s office so interesting.  Charles Hynes, the elected DA of Brooklyn, established a CIU that will be looking at cases in which prosecutors obtained convictions during his own six terms in office.  For example, at the end of March, David Ranta, 58, was released after spending 23 years in prison for the killing of a rabbi in Williamsburg, Brooklyn.  The original conviction came under Hynes’ leadership; when Ranta was released, Hynes gave credit for the release to his CIU.

CIUs accomplish something very fundamental: they make the task of uncovering mistakes in the justice system into a routine operation.  On the other hand, as readers of this blog have correctly pointed out, CIUs are a lot like internal affairs divisions in police departments: the DAs are investigating themselves.  This is easier to credit when the head of the office is not the same person who was at the helm when the mistakes were made.  It is also why CIUs are still rare when the head of the office has been serving a long time — long enough to be the one responsible for the mistakes under investigation.

When I’m in Cincinnati for talks on Failed Evidence tonight, April 4, at 7:00 pm at the Clifton Cultural Arts Center and tomorrow, April 5, at noon at the University of Cincinnati College of Law, one topic sure to come up is an article from the April 3 New York Times, “Advances in Science of Fire Free a Convict After 42 Years.”  Louis Taylor was serving 28 life sentences for the deaths caused by a fire in a Tucson, Arizona hotel in December of 1970.  Taylor, then just 16 years old, was convicted of arson based on faulty forensic science.

Mr. Taylor has been released from prison.  He is now 58 years old.

The story highlights the state of arson investigation, past and present.

A few years ago, the National Academy of Sciences turned its attention to the misuse of science in courtrooms, saying that pseudoscientific theories had been used to convict people of crimes they may not have committed. By then, a small group of fire engineers had already begun to discredit many of the assumptions employed in fire investigations, like the practice of using the amount of heat radiated by a fire to assess if an accelerant had been used.

Unlike DNA evidence, which can exonerate one person and sometimes incriminate another, the evidence collected in some arson investigations does not yield precise results. Often much of the evidence has been lost or destroyed. In the case of the hotel fire here, all that is left are photographs, reports and chemical analysis, all of them assembled to prove arson.

As a result, “we can’t definitely say what really caused the fire,” said John J. Lentini, a veteran fire investigator who wrote a report on Mr. Taylor’s case. “But what we can do is discredit the evidence” used to support the charge.

The case recalls the story of the trial and execution of Cameron Todd Willingham in Texas, executed in 2004 for the deaths of his children in a fire.  Experts have called the arson analysis in that case terribly flawed — just as in Mr. Taylor’s case.

The science surrounding arson investigation is light years ahead of where it used to be, even a decade ago.  It’s time for all of the old cases in which verdicts depended on outmoded and discredited methods of arson investigation to be re-examined and, if necessary, overturned.

On Thursday April 4, and Friday April 5, I’ll be in Cincinnati for two discussions of Failed Evidence: Why Law Enforcement Resists Science (2012).  Both are free and open to the public.

On April 4, I’ll be discussing the book at 7:00 p.m. at the Clifton Cultural Arts Center, 3711 Clifton Avenue, Cincinnati OH 43220.  The event is sponsored by the ACLU of Ohio.

On April 5, I’ll present a talk at the University of Cincinnati College of Law at noon.  The address is 2540 Clifton Ave, Cincinnati, OH 45221.  The event is in Room 114, and is sponsored by the Lois and Richard Rosenthal Institute for Justice/Ohio Innocence Project.  It has been approved for CLE credit for attorneys.