Posts Tagged ‘forensic match’

Here we are, more than a month after chemist Annie Dookhan, formerly of the Massachusetts State Drug Laboratory, entered a guilty plea to producing fraudulent forensic testing results and went to prison.  The scandal, potentially involving tens of thousands of cases, has resulted in the release of hundreds of convicted persons. All of this has reportedly cost the state of Massachusetts more than $8 million, and the state has budgeted almost $9 million more for the continuing damage.  Readers have seen coverage of the Massachusetts scandal, and several others, here and here and here.

But Dookhan and the Massachusetts Drug Laboratory are far from alone.  According to a report by National Public Radio, there have been twelve major crime lab scandals in the U.S. in just the last two years.  With all of this damage — to individual cases and defendants, to state and local budgets, and to the public trust — occurring all over the country, how have policy makers at the national level responded?

Well, they haven’t.  At least not yet.

Almost five years after the release of the National Academy of Sciences’ report “Strengthening Forensic Science in the United States: A Path Forward,” which recommended (among other things) the establishment of national industry standards for forensic labs, the creation of a National Institute of Forensic Science, and the independence of every crime lab from police agencies and prosecution offices, none of this has happened.  Nor is accreditation of laboratories required.

According to National Public Radio, Senators John Cornyn of Texas and Patrick Leahy of Vermont “are working to introduce legislation this year” which could address some of these problems.  But with the parade of crime lab scandals continuing without letup, why has it taken nearly five years after the NAS report to reach even this very preliminary point?

Readers, would mandatory national standards help?  Are they appropriate?  What about requiring accreditation?

If you are from outside the U.S., does your country set mandatory national standards for crime labs?  Is accreditation required?

The time has long since passed for us to do something about this set of problems in the U.S.  We just can’t afford the damage to the credibility of our criminal justice system and the costs of  reviewing cases and releasing convicted prisoners — some of whom may very well be guilty, but whose cases are tainted.

The use of DNA identification as a forensic tool, beginning in 1989, changed the way that we think about guilt, innocence, and traditional police investigation.  It isn’t just the 311 wrongful convictions that DNA identification has confirmed; it’s the far more numerous cases in which DNA has determined guilt — sometimes in cases years or decades old.

Now DNA identification is about to change: it will become even more powerful than it is now.  A new way of processing and interpreting DNA has arrived that will make our current DNA techniques look weak by comparison.

On Friday, Nov. 8, I attended an incredibly interesting talk by Dr. Ria David.  Dr. David is one of the co-founders of Cybergenetics, a company based in Pittsburgh.  Cybergenetics has perfected computer-based techniques and technologies that will change the way that DNA is analyzed.  With Cybergenetics’ TrueAllele® system, the analysis relies on the power of computers instead of interpretation done by humans.  The talk was sponsored by the Center for Women’s Entrepreneurship (CWE) at Chatham University.  (Disclosure: my wife runs CWE; my wife and I know Dr. David and her Cybergenetics co-founder, Dr. Mark Perlin, but neither my wife nor I have any personal or financial ties of any kind to Cybergenetics.)

Most of us know that a DNA sample allows forensic scientists to say things like “the odds that this sample came from anyone other than the defendant are fifty million to one.”  Pretty powerful stuff — until you learn that Cybergenetics’ systems will allow prosecutors to offer juries odds of not tens of millions to one, but trillions or even quadrillions to one.  In addition, Cybergenetics will allow analysts to pull apart mixtures of DNA from different people, which are common at crime scenes and which current DNA technology often can’t handle.  Readers of my book Failed Evidence: Why Law Enforcement Resists Science can get a little more information in Chapter 7, pp. 186-190; you can get the book here.
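Where do figures like “fifty million to one” come from?  For readers who want a feel for the arithmetic, here is a minimal sketch of the conventional “product rule” calculation behind a random match probability, using invented per-locus frequencies purely for illustration.  This is not how TrueAllele works — probabilistic-genotyping systems use far more sophisticated, likelihood-ratio-based models, especially for mixtures — but it does show why stacking up independent loci drives the numbers into the billions and beyond.

```python
# Illustrative only: the conventional "product rule" random match probability.
# The per-locus genotype frequencies below are invented for this example; they
# are not real population data, and this is NOT the TrueAllele method.  The
# point is simply that multiplying frequencies across many loci produces
# astronomical odds.

locus_genotype_freqs = [0.08, 0.11, 0.05, 0.09, 0.07, 0.10, 0.06, 0.12, 0.05, 0.08]

rmp = 1.0
for freq in locus_genotype_freqs:
    rmp *= freq  # treat loci as independent and multiply

print(f"Random match probability: {rmp:.2e}")
print(f"Roughly 1 in {1 / rmp:,.0f}")   # about 1 in 125 billion with these made-up numbers
```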

Cybergenetics’ DNA system has found ready acceptance in the United Kingdom, but the process has been slower in the U.S. There has been considerable resistance — something readers of Failed Evidence are quite familiar with — particularly at the FBI, which governs current DNA protocols and use.

There is much more to how the Cybergenetics TrueAllele system works, and what it can do; I’d urge readers to take a good look at Cybergenetics’ web site, which gives details on what the company does and on the many criminal cases and mass disaster identifications it has worked on (including the identification of remains at the World Trade Center site).  Once law enforcement sees what this new method of using DNA can do, and once resistance to change is overcome, DNA will be able to identify many more guilty criminals, as well as exonerate many more of the wrongfully convicted, and it will do so with more certainty than we ever thought possible.

 

In today’s world, we think of DNA matching as the gold standard of identification: a scientifically precise method of matching human tissue or fluids left at a crime scene to a particular individual.  But in an op-ed article last week in the New York Times, “High-Tech, High-Risk Forensics,” UC Hastings professor of law Osagie Obasogie reminds us that even DNA identification isn’t foolproof.  It’s an excellent point and one worth remembering.  But then he stumbles over the details of DNA identification, labeling it high risk because many people might have the same DNA profile.  Risk is always present, but is DNA identification high risk because many people might share the same profile?  No.

Professor Obasogie tells us the story of a man, Mr. Anderson, arrested for a murder near San Jose, California, after his DNA was found on the victim’s body.  Only after the man was arrested and jailed did his alibi surface: at the time of the murder, he was a patient at a hospital, suffering from severe intoxication, and there were voluminous records to prove it.  He was freed after five months, and prosecutors now think the most likely explanation is that the paramedics who transported him to the hospital were the same ones called to the crime scene later that night.  The DNA was likely transferred to the victim from the paramedics’ clothing or equipment.

Professor Obasogie decries “the certainty with which prosecutors charged Mr. Anderson with murder,” because it “highlights the very real injustices that can occur when we place too much faith in DNA forensic technologies.”  This is hard to argue with; for those who can remember this far back, it recalls lawyer Barry Scheck cross-examining Dennis Fung in the O.J. Simpson murder trial over the sloppy collection of DNA and other evidence.  Scheck and other lawyers argued that the DNA that seemed to implicate Simpson just couldn’t be trusted: “garbage in, garbage out.”  Professor Obasogie is not saying that the paramedics did anything wrong or were sloppy in any conventional sense; rather, he is arguing (correctly) that contamination can happen even when we’re not aware of the possibility.  Thus caution is always advisable, even with DNA.

But then Professor Obasogie begins to argue that there are deeper problems with DNA identification than just contamination.

Consider the frequent claim that it is highly unlikely, if not impossible, for two DNA profiles to match by coincidence. A 2005 audit of Arizona’s DNA database showed that, out of some 65,000 profiles, nearly 150 pairs matched at a level typically considered high enough to identify and prosecute suspects. Yet these profiles were clearly from different people.

Professor David H. Kaye of Penn State Dickinson School of Law has pointed out the problem with this argument on his blog Forensic Science, Statistics & the Law.  In a blog post on July 26, Professor Kaye explains why Professor Obasogie is wrong on this point.  In his words:

The 150 or so matches were, in fact, mismatches. That is, they were partial matches that actually excluded every “matching” pair. Only if an analyst improperly ignored the nonmatching parts of the profiles or if these did not appear in a crime-scene sample could they be reported to match.

There is much more to Professor Kaye’s thorough and lucid explanation.  If you are interested in a real understanding of how DNA identification actually works, I strongly recommend Professor Kaye’s post.
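One piece of arithmetic, offered here as a supplement and not as part of Professor Kaye’s argument, helps keep the Arizona numbers in perspective: an all-against-all search of roughly 65,000 profiles involves more than two billion pairwise comparisons, so even very rare partial overlaps will surface somewhere.  A minimal back-of-the-envelope sketch:

```python
# Back-of-the-envelope arithmetic for the 2005 Arizona audit figures quoted above.
# An all-against-all search of n profiles compares n * (n - 1) / 2 pairs.

n_profiles = 65_000
n_pairs = n_profiles * (n_profiles - 1) // 2

print(f"Profiles:             {n_profiles:,}")
print(f"Pairwise comparisons: {n_pairs:,}")           # about 2.1 billion
print(f"~150 matching pairs is roughly 1 per {n_pairs // 150:,} comparisons")
```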

The U.S. Department of Justice (DOJ), in partnership with the Innocence Project and the National Association of Criminal Defense Lawyers (NACDL), will review 2,000 cases in which microscopic hair analysis of crime scene evidence was conducted by the FBI Laboratory.  The review, prompted by the DNA-based exoneration of several men convicted on the basis of hair microscopy, will focus on “specific cases in which FBI Laboratory reports and testimony included statements that were scientifically invalid.”  The Innocence Project’s announcement of the review is here; a representative news article is here.

In a move that shows just how seriously the DOJ is taking this review, it has done something unheard of:

Because of the importance attached to these cases, the DOJ has agreed, for the first time in its history, not to raise procedural objections, such as statute of limitations and procedural default claims, in response to the petitions of criminal defendants seeking to have their convictions overturned because of faulty FBI microscopic hair comparison laboratory reports and/or testimony.

Translation: DOJ is not going to fight this review in any of these cases; they’re going to be part of it.

It’s hard to describe the magnitude of the shift in outlook this represents.  Usually, as readers of Failed Evidence know, law enforcement (and I include DOJ in that phrase) resists science-based review and testing; that’s the thrust of the book.  I am happy to say that this is refreshingly different.  According to Peter Neufeld, Co-Director of the Innocence Project, “[t]he government’s willingness to admit error and accept its duty to correct those errors in an extraordinarily large number of cases is truly unprecedented.  It signals a new era in this country that values science and recognizes that truth and justice should triumph over procedural obstacles.”

Of course, this review will not affect cases in which hair analysis was handled by state crime labs.  But here’s hoping the states will take this as an example, as the Grits for Breakfast blog argues ought to be done in Texas.

For a sense of the damage that sloppy hair analysis and testimony about it has done in prior cases, listen to this NPR story and interview about the case of Dennis Fritz, in Ada, Oklahoma.  John Grisham’s nonfiction book “The Innocent Man” is an excellent read about the case.

Maybe this is the beginning of a trend.  Hats off to DOJ, the Innocence Project, and NACDL.

The current issue of the American Criminal Law Review has a review essay of Failed Evidence: Why Law Enforcement Resists Science (2012).   According to the review, the book “engages…broadly with forensics” to explore “why law enforcement and prosecutors have shown such marked reluctance to incorporate a modern understanding of the scientific method.”  The review concludes that Failed Evidence “provides a thoughtful analysis of the scientific bases underlying forensics, current evidentiary and investigatory problems, and possible solutions. [The] suggestions are particularly well thought-out because they consider the problems faced by law enforcement when implementing ideal solutions in the real world.”

You can read the full review here.

Today the U.S.  Supreme Court hears arguments in Maryland v. King, the Court’s latest foray into  DNA testing.  Most reports have focused on the clash between law enforcement’s desire to test every arrested person in order to try to solve old cases, and those who advocate for a strict interpretation of the Fourth Amendment’s protection of privacy.  For example, a report by the excellent Nina Totenberg of National Public Radio discussed the positions of police, who believe taking a sample from every person arrested is a minimal intrusion that can have a big payoff, and the arguments of defense attorneys and civil libertarians, who feel that allowing testing of all arrestees would surrender the basic principles of the Fourth Amendment’s protection against unlawful searches and seizures.

But there’s a question that may be far more important: from a crime fighting point of view, is it a good idea for police to take and process a DNA sample for everyone who gets arrested?  Of course there will be some cases — such as the King case itself — in which the time-of-arrest sample leads to an arrest for a different, serious crime.  But taking samples from every arrestee may actually hurt our efforts to use DNA most effectively to make ourselves safe.  According to an article in Slate by Brandon Garrett and Erin Murphy, the real public safety gains from DNA lie not in taking samples from every jaywalker and burglar and hoping for a hit in a cold case, but in taking many more samples from crime scenes.  We get more hits when we process samples from active crime scenes and match them against our already-large DNA database than when we fish for leads among the more than 12 million people arrested every year.  And all of those additional samples from arrestees crowd out and slow down the processing of samples from real crime scenes and victims, creating backlogs.  In other words, bigger DNA databases are not the answer to crime.

[B]igger is only better if DNA databases grow in the right way: by entering more samples from crime scenes, not samples from arrestees. DNA databases already include 10 million-plus known offender profiles. But a database with every offender in the nation cannot solve a crime if no physical evidence was collected or tested.  And police collect far too few such samples….The police solve more crimes not by taking DNA from suspects who have never been convicted, but by collecting more evidence at crime scenes.  Even worse, taking DNA from a lot of arrestees slows the testing in active criminal investigations….Backlogs created by arrestee DNA sampling means that rape kits and samples from convicted offenders sit in storage or go untested.

The bottom line: even if the Supreme Court says we can take a sample from every person arrested, that doesn’t mean we should.

 

Good news: Failed Evidence: Why Law Enforcement Resists Science is the Feb. 4 selection by delanceyplace.com, a service that highlights and quotes new works for a large community of readers.  Delanceyplace.com provides daily subscribers with “an excerpt or quote we view as interesting or noteworthy, offered with commentary to provide context. There is no theme, except that most excerpts will come from a non-fiction work, primarily historical in focus, and will occasionally be controversial. Finally, we hope that the selections will resonate beyond the subject of the book from which they were excerpted.”  Other recent selections have included Jared Diamond, The World Until Yesterday: What Can We Learn from Traditional Societies; Gordon Wood, Empire of Liberty: A History of the Early Republic, 1789-1815; and Ray Kurzweil and Terry Grossman, Transcend: Nine Steps to Living Well Forever.

 

Among failed forensic methods, virtually no type of evidence has a worse record than bite mark analysis.

In my book Failed Evidence (NYU Press, Sept. 2012), I say that courts must simply stop accepting bite mark analysis.  It has no scientific basis whatever, and it has been the source of numerous wrongful convictions.  Now, a practitioner of this, um, method has spoken out, and lo and behold, I find myself agreeing with him.  That’s because he now says there is nothing to it.  Of course, this comes after years of testifying to the bite-based guilt of defendants in “dozens” of cases.

With thanks to the Innocence Blog, I see that Dr. Michael West of Hattiesburg, Mississippi, whose bite mark work has sent defendants to prison and even death row, no longer thinks bite mark analysis has any validity.  A Mississippi newspaper reported that in a 2011 deposition, West said, “I no longer believe in bite-mark analysis.”  In fact, he went further: “I don’t think it should be used in court. I think you should use DNA. Throw bite marks out.”

Forensic odontology — the study and analysis of teeth for legal purposes — can be a useful tool.  For example, it is often used, quite successfully, to identify human remains when nothing else can.  But analyzing bite marks to identify the perpetrators of crimes has never had any scientific validity, and it has been proven incorrect in case after case in which so-called experts like West used bite marks to send innocent men and women to jail.

It’s worth noting that, even as West now disavows bite mark analysis, he somehow maintains that his work was correct.  In two different cases of raped and murdered children, West’s bite mark testimony put two defendants behind bars, one on death row for 15 years, the other in prison for 18 years.  When DNA proved that another man had committed both crimes, West refused to back down.  “I never accused them of killing or raping anybody – just biting them while they were alive.”

Along with the reversal by the remarkable Dr. West, there’s news on the Wrongful Convictions Blog that Douglas Prade, a former police captain from Akron, Ohio, convicted of killing his ex-wife in  a 1998 case largely on “bite mark” evidence, has been exonerated based on DNA.

There is only one real question about bite mark evidence: when will courts ban it?  When is enough damage enough?

 

In the July 23 issue of The New Yorker, “Words on Trial” introduces us to forensic linguistics.  Linguists, it seems, can now harness their knowledge to solve crimes.  The article describes a number of cases in which linguistic patterns reveal who committed, or in some cases who did not commit, crimes that police could not solve.

It is an intriguing idea: linguists analyze various sorts of patterns and usages in writing and speaking.  With writings, they detect patterns that can go a long way toward identifying the author of a ransom note or a threat.

For me, what was so striking was how the same old forensic science questions from fingerprint analysis, tool marks, and hair and fiber comparisons emerge in this very new forensic discipline.  Is this a field whose power rests on human interpretation, in which the experience and knowledge of the analyst rule, as is true with fingerprint analysis?  If so, does it already suffer from the same deficiencies?  Or should forensic linguists strive for a data-based standard, as with DNA?  Both approaches are on display in the article, though the former is featured as the probable future of the field.

After looking hard at the 2009 NAS report, we should be cautious about introducing another forensic methodology that cannot quantify its error rate, cannot state the probability that the answer it gives is correct, and therefore cannot truly state whether or not a particular piece of writing is in fact associated with a defendant.  One forensic linguist says that he does not testify that he knows who wrote the items he analyzes; he will only say that the language in a given item “is consistent with” the language in the defendant’s other writings.  The caution is absolutely appropriate, but I wonder whether a jury will overlook this nuance, and simply hear “it matches.”

Every new tool that promises to solve crimes deserves serious attention; some new methods will make the world safer from predators.  But let’s not take these new tools where we have gone blindly before — into the land of failed forensics.  Instead, courts should accept them only when they can actually claim to use rigorous methods that support their claims.  Anything else will just generate a new wave of wrongful convictions.

DNA evidence has become a forensic tool of precision and power in the criminal justice system.  Nothing can touch its scientific pedigree, the strength of its ability to make an identification, or its rigorous protocols for collection, analysis, and presentation of probability-based evidence.

But something new is on law enforcement’s radar: using DNA in a way that brings us back to the bad old days of using the blunt instrument of serology to “match” a perpetrator to the crime.

In the July 5 Chicago Tribune, reporter Steve Mills’ article, “Weak DNA Evidence Could Undermine Justice, Experts Say,” tells the story of Cleveland Barrett, who was arrested for a predatory criminal sexual assault on a 9-year-old girl.  Part of the evidence against Barrett: according to prosecutors, DNA showed Barrett committed the crime.

But this wasn’t the kind of millions- or billions-to-one DNA identification we’ve grown accustomed to seeing, because prosecutors were using tiny amounts of DNA in an unusual way.  According to Mills’ story, the DNA used in Barrett’s case

was different. It was not from semen, as is often the case in rapes; instead it came from male cells found on the girl’s lips. What’s more, the uniqueness of the genetic link between the DNA and Barrett was not of the 1-in-several-billion sort that crime lab analysts often testify to in trials with DNA evidence. Instead…the genetic profile from the cells matched 1 in 4 African-American males, 1 in 8 Hispanic males and 1 in 9 Caucasian males.  Fact is, the DNA profile from the cells on the victim’s lips could have matched hundreds of thousands of men in the Chicago region.

And yet, the prosecutor’s closing argument used the word that juries know signifies an almost irrefutable, scientifically solid identification: “match.”
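To see just how far that word stretches the evidence, here is a minimal sketch of the arithmetic, using the frequencies quoted above and a hypothetical round number for the male population of a large metropolitan area (the population figure is my assumption, not a figure from the article):

```python
# Illustrative arithmetic only.  The 1-in-4 / 1-in-8 / 1-in-9 frequencies come
# from the quoted Tribune passage; the male-population figure is a hypothetical
# round number, not actual Chicago-area demographics.

hypothetical_male_population = 3_000_000   # assumed, for illustration only
rarest_quoted_frequency = 1 / 9            # the least common of the quoted figures

expected_sharers = hypothetical_male_population * rarest_quoted_frequency
print(f"Even at 1 in 9, about {expected_sharers:,.0f} of "
      f"{hypothetical_male_population:,} men would share this profile")
# -> roughly 333,000 men: "hundreds of thousands," just as the article says
```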

The use of weak DNA evidence like this is “preposterous” and has “almost no value at all as evidence,” according to Edward Blake, a leading DNA authority quoted by Mills.

Just what the criminal justice system needs: a way to make some of the most powerful scientific evidence ever devised into misleading junk science that claims to “match” a defendant to a crime.  Law enforcement resists existing science that can make its basic procedures like eyewitness identification, interrogation, and forensics better; yet it seems determined to push one of its best tools away from scientific precision, and toward inaccuracy.