
Following up on my last post, in which I asked why there were still no national standards for forensic science five years after the National Academy of Sciences' 2009 report Strengthening Forensic Science in the United States, and with scandal after scandal in crime labs all over the country, there may be light on the horizon. On January 10, the U.S. Department of Justice (DOJ) and the National Institute of Standards and Technology (NIST) announced the formation of the National Commission on Forensic Science.

According to the announcement issued by DOJ and NIST:

Members of the commission will work to improve the practice of forensic science by developing guidance concerning the intersections between forensic science and the criminal justice system. The commission also will work to develop policy recommendations for the U.S. Attorney General, including uniform codes for professional responsibility and requirements for formal training and certification.

John P. Holdren, Assistant to the President for Science and Technology and Director of the White House Office of Science and Technology Policy, said that the Commission “will help ensure that the forensic sciences are supported by the most rigorous standards available—a foundational requirement in a nation built on the credo of ‘justice for all.’ ”

The formation of the Commission could be a significant milestone in the march toward the use of real science and defensible national standards in forensic labs. But its very creation and structure may limit what it can achieve: it is not a body created by Congress with the power to devise and implement standards or to regulate anything. Rather, it is a federal advisory committee, formed under the Federal Advisory Committee Act of 1972. (A quick primer on the Act is here.) It investigates and debates designated topics, and then reports its recommendations to the federal department(s) that formed it (in this case, the DOJ and NIST). Those agencies could choose to embrace and follow the Commission's suggestions, or to reject some or all of them.

Still, this is a hopeful sign that we might be heading in the right direction. At the very least, we will see a national conversation among the Commission's many members, who come from a variety of backgrounds in government, science, the legal system, and elsewhere. See the list of more than thirty Commission members at the bottom of this announcement.

I hope readers will weigh in on the following question: realistically, what will come from the Commission? Will the government adopt its recommendations? Will the recommendations include national standards to regulate forensic testing, assure quality control, and the like? In the end, will the work that you foresee coming from the Commission improve the U.S.'s largely unregulated system?

Here we are, more than a month after chemist Annie Dookhan, formerly of the Massachusetts State Drug Laboratory, pleaded guilty to producing fraudulent forensic testing results and went to prison. The scandal, potentially involving tens of thousands of cases, has already resulted in the release of hundreds of convicted persons. All of this has reportedly cost the state of Massachusetts more than $8 million, and the state has budgeted almost $9 million more for the continuing damage. Readers have seen coverage of the Massachusetts scandal, and several others, here and here and here.

But Dookhan and the Massachusetts lab are far from alone. According to a report by National Public Radio, there have been twelve major crime lab scandals in the U.S. in just the last two years. With all of this damage – to individual cases and defendants, to state and local budgets, and to the public trust – occurring all over the country, how have policy makers at the national level responded?

Well, they haven’t.  At least not yet.

Almost five years after the release of the National Academy of Sciences' report "Strengthening Forensic Science in the United States: A Path Forward," which recommended (among other things) the establishment of national standards for forensic labs, the creation of a National Institute of Forensic Science, and the independence of every crime lab from police agencies and prosecution offices, none of this has happened. Nor is accreditation of laboratories required.

According to National Public Radio, Senators John Cornyn of Texas and Patrick Leahy of Vermont "are working to introduce legislation this year" that could address some of these problems. But with the NAS report nearly five years behind us and the parade of crime lab scandals continuing without letup, why has it taken this long to reach even this very preliminary point?

Readers, would mandatory national standards help?  Are they appropriate?  What about requiring accreditation?

If you are from outside the U.S., does your country set mandatory national standards for crime labs?  Is accreditation required?

The time has long since passed for us to do something about this set of problems in the U.S.  We just can’t afford the damage to the credibility of our criminal justice system and the costs of  reviewing cases and releasing convicted prisoners — some of whom may very well be guilty, but whose cases are tainted.

The use of DNA identification as a forensic tool, beginning in 1989, changed the way that we think about guilt, innocence, and traditional police investigation.  It isn’t just the 311 wrongful convictions that DNA identification has confirmed; it’s the far more numerous cases in which DNA has determined guilt — sometimes in cases years or decades old.

Now DNA identification is about to change: it will become even more powerful than it is now.  A new way of processing and interpreting DNA has arrived that will make our current DNA techniques look weak by comparison.

On Friday, Nov. 8, I attended an incredibly interesting talk by Dr. Ria David. Dr. David is one of the co-founders of Cybergenetics, a company based in Pittsburgh. Cybergenetics has perfected computer-based techniques and technologies that will change the way DNA is analyzed. With Cybergenetics' TrueAllele® system, the analysis relies on the power of computers instead of on interpretation done by humans. The talk was sponsored by the Center for Women's Entrepreneurship (CWE) at Chatham University. (Disclosure: my wife runs CWE; my wife and I know Dr. David and her Cybergenetics co-founder, Dr. Mark Perlin, but neither my wife nor I have any personal or financial ties of any kind to Cybergenetics.)

Most of us know that a DNA sample allows forensic scientists to say things like "the odds that this sample came from anyone other than the defendant are fifty million to one." Pretty powerful stuff — until you learn that Cybergenetics' system will allow prosecutors to offer juries odds not of tens of millions to one, but of trillions or even quadrillions to one. In addition, it will allow analysts to pull apart mixtures of DNA from different people, which are common at crime scenes and which current DNA techniques often can't handle. Readers of my book Failed Evidence: Why Law Enforcement Resists Science can get a little more information in Chapter 7, pp. 186-190; you can get the book here.
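To get an intuitive feel for where numbers like "fifty million to one" or "quadrillions to one" come from, here is a minimal sketch of how per-locus likelihood ratios multiply across a DNA profile. To be clear, this is my own toy illustration with made-up allele frequencies, not TrueAllele's method; Cybergenetics' probabilistic genotyping is proprietary and handles mixed and degraded samples in a far more sophisticated way. The sketch only shows the basic arithmetic behind the headline odds.

```python
# Toy illustration only (not TrueAllele): how per-locus likelihood ratios
# multiply into the enormous overall odds quoted above.
# All allele frequencies below are made up for demonstration.

def single_source_lr(p, q):
    """Likelihood ratio at one locus for a suspect whose genotype matches
    a single-source evidence profile exactly.
    Numerator:   P(evidence | suspect is the source) = 1 (assuming no error).
    Denominator: P(evidence | unrelated random person is the source),
                 i.e., the genotype frequency under Hardy-Weinberg."""
    genotype_freq = p * p if p == q else 2 * p * q  # homozygote vs heterozygote
    return 1.0 / genotype_freq

# Hypothetical allele-frequency pairs for 13 STR loci (illustrative only).
loci = [(0.11, 0.07), (0.22, 0.09), (0.13, 0.13), (0.08, 0.19),
        (0.15, 0.06), (0.21, 0.12), (0.10, 0.10), (0.09, 0.14),
        (0.18, 0.07), (0.12, 0.11), (0.16, 0.05), (0.20, 0.08),
        (0.07, 0.09)]

combined_lr = 1.0
for p, q in loci:
    combined_lr *= single_source_lr(p, q)

print(f"Combined likelihood ratio across {len(loci)} loci: {combined_lr:.2e}")
# Independent loci multiply, which is how a clean single-source profile can
# reach astronomically large odds. Probabilistic genotyping systems extend
# the same likelihood-ratio logic to mixtures by weighing every genotype
# combination that could explain the mixed sample.
```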

Cybergenetics’ DNA system has found ready acceptance in the United Kingdom, but the process has been slower in the U.S. There has been considerable resistance — something readers of Failed Evidence are quite familiar with — particularly at the FBI, which governs current DNA protocols and use.

There is much more to how Cybergenetics' TrueAllele system works and what it can do; I'd urge readers to take a good look at the Cybergenetics web site, which gives details on what the company does and on the many criminal cases and mass disaster identifications (including the identification of remains at the World Trade Center site) it has worked on. Once law enforcement sees what this new method of using DNA can do, and once resistance to change is overcome, DNA will be able to identify many more guilty criminals, as well as exonerate many more of the wrongfully convicted, and it will do so with more certainty than we ever thought possible.

 

In today’s world, we think of DNA matching as the gold standard of identification: a scientifically precise method of matching human tissue or fluids left at a crime scene to a particular individual.  But in an op-ed article last week, High-Tech, High-Risk Forensics, published in the New York Times, Cal Hastings professor of law Osagie Obasogie reminds us that even DNA identification isn’t foolproof.  It’s an excellent point and one worth remembering.  But then he stumbles over the details of DNA identification, labeling it high risk, because many people might have the same DNA profiles.  Risk is always present, but is the risk of getting DNA identifications high because many people might have the same DNA profile?  No.

Professor Obasogie tells us the story of a man arrested for a murder near San Jose, California, when his DNA was found on the victim's body. Only after the man was arrested and jailed did his alibi surface: at the time of the murder, he was a patient at a hospital, suffering from severe intoxication, and there were voluminous records to prove it. He was freed after five months, and prosecutors now think the most likely explanation is that the paramedics who transported the man to the hospital were the same ones called to the crime scene later that night. The DNA was likely transferred to the victim from the paramedics' clothing or equipment. Professor Obasogie decries "the certainty with which prosecutors charged Mr. Anderson with murder," because it "highlights the very real injustices that can occur when we place too much faith in DNA forensic technologies."

This is hard to argue with; for those who can remember that far back, it recalls lawyer Barry Scheck cross-examining Dennis Fung in the O.J. Simpson murder trial over the sloppy collection of DNA and other evidence. Scheck and other lawyers argued that the DNA that seemed to implicate Simpson just couldn't be trusted: "garbage in, garbage out." Professor Obasogie is not saying that the paramedics did anything wrong or were sloppy in any conventional sense; rather, he is arguing (correctly) that contamination can happen even when we're not aware of the possibility. Thus caution is always advisable, even with DNA.

But then Professor Obasogie begins to argue that there are deeper problems with DNA identification than just contamination.

Consider the frequent claim that it is highly unlikely, if not impossible, for two DNA profiles to match by coincidence. A 2005 audit of Arizona’s DNA database showed that, out of some 65,000 profiles, nearly 150 pairs matched at a level typically considered high enough to identify and prosecute suspects. Yet these profiles were clearly from different people.

Professor David H. Kaye of Penn State Dickinson School of Law has pointed out the problem with this argument on his blog Forensic Science, Statistics & the Law.  In a blog post on July 26, Professor Kaye explains why Professor Obasogie is wrong on this point.  In his words:

The 150 or so matches were, in fact, mismatches. That is, they were partial matches that actually excluded every “matching” pair. Only if an analyst improperly ignored the nonmatching parts of the profiles or if these did not appear in a crime-scene sample could they be reported to match.

There is much more to Professor Kaye's thorough and lucid explanation. If you are interested in a real understanding of how DNA identification actually works, I strongly recommend Professor Kaye's post.
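There is also a simpler, complementary point worth keeping in mind (a back-of-the-envelope sketch of my own, not drawn from the op-ed or from Professor Kaye's post): a database trawl compares every profile against every other, and 65,000 profiles generate roughly two billion pairwise comparisons. At that scale, even a very small per-pair chance of a coincidental partial match will produce matches by the dozens. The per-pair probability used below is purely illustrative.

```python
# Back-of-the-envelope sketch (my own illustration): with n profiles in a
# database, an all-pairs trawl makes n*(n-1)/2 comparisons, so even a tiny
# per-pair chance of a coincidental partial match yields many expected hits.

n = 65_000                   # approximate size of the Arizona offender database
pairs = n * (n - 1) // 2     # every profile compared against every other

# Hypothetical per-pair probability of a coincidental partial (e.g. 9-of-13
# locus) match between unrelated people -- an illustrative order of magnitude,
# not a measured value.
p_partial = 7.5e-8

print(f"Pairwise comparisons: {pairs:,}")                        # ~2.1 billion
print(f"Expected coincidental partial matches: {pairs * p_partial:,.0f}")
```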

The U.S. Department of Justice (DOJ), in partnership with the Innocence Project and the National Association of Criminal Defense Lawyers (NACDL), will review 2,000 cases in which microscopic hair analysis of crime scene evidence was conducted by the FBI Laboratory.  The review, prompted by the DNA-based exoneration of several men convicted on the basis of hair microscopy, will focus on “specific cases in which FBI Laboratory reports and testimony included statements that were scientifically invalid.”  The Innocence Project’s announcement of the review is here; a representative news article is here.

In a move that shows just how seriously the DOJ is taking this review, it has done something unheard of:

Because of the importance attached to these cases, the DOJ has agreed, for the first time in its history, not to raise procedural objections, such as statute of limitations and procedural default claims, in response to the petitions of criminal defendants seeking to have their convictions overturned because of faulty FBI microscopic hair comparison laboratory reports and/or testimony.

Translation: DOJ is not going to fight this review in any of these cases; they’re going to be part of it.

It’s hard to describe the magnitude of the shift in outlook this represents.  Usually, as readers of Failed Evidence know, law enforcement (and I include DOJ in that phrase) resists science-based review and testing; that’s the thrust of the book.  I am happy to say that this is refreshingly different.  According to Peter Neufeld, Co-Director of the Innocence Project, “[t]he government’s willingness to admit error and accept its duty to correct those errors in an extraordinarily large number of cases is truly unprecedented.  It signals a new era in this country that values science and recognizes that truth and justice should triumph over procedural obstacles.”

Of course, this review will not affect cases in which hair analysis was handled by state crime labs. But here's hoping the states will take this review as an example, as the Grits for Breakfast blog argues ought to be done in Texas.

For a sense of the damage that sloppy hair analysis and testimony about it has done in prior cases, listen to this NPR story and interview about the case of Dennis Fritz, in Ada, Oklahoma.  John Grisham’s nonfiction book “The Innocent Man” is an excellent read about the case.

Maybe this is the beginning of a trend.  Hats off to DOJ, the Innocence Project, and NACDL.

When I’m in Cincinnati for talks on Failed Evidence tonight, April 4, at 7:00 pm at the Clifton Cultural Arts Center and tomorrow, April 5, at noon at the University of Cincinnati School of Law, one topic sure to come up is an article from the April 3 New York Times, “Advances in Science of Fire Free a Convict After 42 Years. ”   Louis Taylor was serving 28 life sentences for the deaths caused by a fire in a Tucson, Arizona hotel in December of 1970.  Taylor, then just 16 years old, was convicted of arson based on faulty forensic science.

Mr. Taylor has been released from prison. He is now 58 years old.

The story highlights the state of arson investigation, past and present.

A few years ago, the National Academy of Sciences turned its attention to the misuse of science in courtrooms, saying that pseudoscientific theories had been used to convict people of crimes they may not have committed. By then, a small group of fire engineers had already begun to discredit many of the assumptions employed in fire investigations, like the practice of using the amount of heat radiated by a fire to assess if an accelerant had been used.

Unlike DNA evidence, which can exonerate one person and sometimes incriminate another, the evidence collected in some arson investigations does not yield precise results. Often much of the evidence has been lost or destroyed. In the case of the hotel fire here, all that is left are photographs, reports and chemical analysis, all of them assembled to prove arson.

As a result, “we can’t definitely say what really caused the fire,” said John J. Lentini, a veteran fire investigator who wrote a report on Mr. Taylor’s case. “But what we can do is discredit the evidence” used to support the charge.

The case recalls that of Cameron Todd Willingham, executed in Texas in 2004 for the deaths of his children in a fire. Experts have called the arson evidence in that case terribly flawed — just as in Mr. Taylor's case.

The science of fire investigation is light years ahead of where it was even a decade ago. It's time that all of the old cases in which verdicts depended on outmoded and discredited methods of arson investigation be re-examined, and if necessary overturned.

A story on National Public Radio highlights one of the central themes of Failed Evidence: how does the criminal justice system react to advances in science that throw past convictions into doubt? The answer will not surprise readers of this blog or of the book: it resists.

The story concerns the case of William Richards, convicted in 1997 of murdering his wife. The conviction came in a fourth trial, after two hung juries and a mistrial failed to produce a verdict. In the fourth trial, the prosecution introduced new evidence: testimony by a forensic dentist, who said that marks seen in a photograph of the victim's body were human bite marks. The marks, he said, were unusual enough that they were likely to have been made by the defendant's distinctive teeth. Ten years later, another forensic dentist corrected a distortion in the photo of the marks using photo-editing software. Now the first forensic dentist says the marks weren't from human teeth at all, and that he should not have testified as he did. Yet the defendant remains in prison, serving 25 years to life.

There’s a lot that is familiar here — too much.

First, the idea that bite mark identification should ever play a role — let alone the key role — in convicting someone and sending them to prison is just intolerable. I have posted about the weakness of bite mark analysis before (here), and Richards' case demonstrates the point all over again. The forensic dentist who put Richards in prison testified that the bite mark (which turned out not to be a bite mark at all) was so distinctive that, by his estimate, only one or two people out of a hundred could have made it. The idea that such an estimate — not a data-based study, but his personal guess — of such a low probability could ever be considered for admission in a court should make everyone shiver.

Second, the story gives us the reaction of Jan Scully, past president of the National District Attorneys Association and the elected District Attorney of Sacramento County, California.  Scully says there is something more important than the fact that the central evidence in the case has been fatally undermined.  According to the NPR story:

“We need to have finality of verdicts,” she says. “There is always a new opinion or there might be a refinement in our forensic science areas. So, just because something new occurs doesn’t mean that the original conviction somehow was not valid.”

In other words, it’s the same old story from the NDAA: there is no significance to the demonstrated falsity of the “science” that was used to put a man in prison.  It is more important for the verdict to remain undisturbed.

It’s hard to imagine a story that captures the ideas in Failed Evidence more strongly.  Go to the story, and check it out.

In yesterday’s post, I discussed Maryland v. King.  Those arguments,  heard at the Court on February 26, considered whether a state should be permitted to take a DNA sample from every person arrested (not convicted — arrested) for a felony.  I asked in my post that we put questions of  individual privacy aside, and instead ask whether such wide sampling would be a good idea from a crime-solving point of view.  (Some experts do not think so, as discussed in the post.)

Today, let’s put the question of privacy back into the equation, because that appears to be what the Justices will do.

In his recap of the Feb. 26 argument, SCOTUSblog's Lyle Denniston tells us that the key points were posed by two of the Court's conservative justices. According to Denniston, Justice Samuel Alito clearly favored the idea that law enforcement should be able to take these samples. DNA sampling "is the 21st century fingerprint," Alito said at least twice. In his view, there is no constitutional difference (in terms of the degree of intrusion on individual privacy) between taking a fingerprint and taking a DNA sample.

The other pole of the argument was taken up by conservative icon Justice Antonin Scalia. When the lawyer for the state of Maryland cited a long list of cases solved through DNA testing to support the law, Justice Scalia reacted forcefully. According to the National Law Journal: "Well, that's really good!" Scalia exploded. "I'll bet if you conducted a lot of unreasonable searches and seizures, you'd get more convictions, too. That proves absolutely nothing." In other words, the question isn't whether the state's action solves cases; some methods of solving cases are simply not allowed under the Constitution, even if they could be proven to work better than others. The question is whether the Constitution — in this case, the Fourth Amendment's prohibition of unreasonable searches — allows the state to do what it wants to do.

During Tuesday’s argument, Justice Alito commented that King could be “the most important criminal procedure case this Court has had in decades.”  That will depend on how the Court decides the case, which it will do sometime before the end of June.  But one thing we do know:  the debate between law enforcement’s desire to use all the tools it can to fight crime and the Constitution’s protections of the individual against state intrusion will go on.

The current issue of the American Criminal Law Review has a review essay of Failed Evidence: Why Law Enforcement Resists Science (2012).   According to the review, the book “engages…broadly with forensics” to explore “why law enforcement and prosecutors have shown such marked reluctance to incorporate a modern understanding of the scientific method.”  The review concludes that Failed Evidence “provides a thoughtful analysis of the scientific bases underlying forensics, current evidentiary and investigatory problems, and possible solutions. [The] suggestions are particularly well thought-out because they consider the problems faced by law enforcement when implementing ideal solutions in the real world.”

You can read the full review here.

Today the U.S.  Supreme Court hears arguments in Maryland v. King, the Court’s latest foray into  DNA testing.  Most reports have focused on the clash between law enforcement’s desire to test every arrested person in order to try to solve old cases, and those who advocate for a strict interpretation of the Fourth Amendment’s protection of privacy.  For example, a report by the excellent Nina Totenberg of National Public Radio discussed the positions of police, who believe taking a sample from every person arrested is a minimal intrusion that can have a big payoff, and the arguments of defense attorneys and civil libertarians, who feel that allowing testing of all arrestees would surrender the basic principles of the Fourth Amendment’s protection against unlawful searches and seizures.

But there’s a question that may be far more important: from a crime fighting point of view,   is it a good idea for police to take and process a DNA sample for everyone who gets arrested?   Of course there will be some cases — such as the King case itself — in which the time-of-arrest sample leads to an arrest for a different, serious crime.  But taking samples from every arrestee may actually hurt our efforts to use DNA most effectively to make ourselves safe.  According to an article in Slate by Brandon Garrett and Erin Murphy, real public safety gains from DNA lie not with taking samples from every jaywalker and burglar and hoping for a hit in a cold case, but instead in taking many more samples from crime scenes.  In other words, we get more hits when we process samples from active crime scenes and match them against our already-large DNA database, instead of fishing for leads among the whole population of more than 12 million people arrested every year.  And all of those additional samples from arrestees crowd out and slow down the processing of samples from real crime scenes and victims, creating backlogs.  In other words, bigger DNA databases is not the answer to crime.

[B]igger is only better if DNA databases grow in the right way: by entering more samples from crime scenes, not samples from arrestees. DNA databases already include 10 million-plus known offender profiles. But a database with every offender in the nation cannot solve a crime if no physical evidence was collected or tested.  And police collect far too few such samples….The police solve more crimes not by taking DNA from suspects who have never been convicted, but by collecting more evidence at crime scenes.  Even worse, taking DNA from a lot of arrestees slows the testing in active criminal investigations….Backlogs created by arrestee DNA sampling means that rape kits and samples from convicted offenders sit in storage or go untested.

The bottom line: even if the Supreme Court says we can take a sample from every person arrested, that doesn't mean we should.