Posts Tagged ‘fingerprints’

Good news: Failed Evidence: Why Law Enforcement Resists Science is the Feb. 4 selection by delanceyplace.com, a service that highlights and quotes new works for a large community of readers.  Delanceyplace.com provides daily subscribers with “an excerpt or quote we view as interesting or noteworthy, offered with commentary to provide context. There is no theme, except that most excerpts will come from a non-fiction work, primarily historical in focus, and will occasionally be controversial. Finally, we hope that the selections will resonate beyond the subject of the book from which they were excerpted.”  Other recent selections have included Jared Diamond, The World Until Yesterday: What Can We Learn from Traditional Societies; Gordon Wood, Empire of Liberty: A History of the Early Republic, 1789-1815; and Ray Kurzweil and Terry Grossman, Transcend: Nine Steps to Living Well Forever.


Last night, PBS broadcast “Forensics on Trial” on Nova, the network’s terrific science show.  My take: the show got most of the issues regarding forensic science right, but not all.  And I think it might leave some viewers with a misleading impression.

Here’s a quick list of some of the things the program got right:

* The Brandon Mayfield fingerprint fiasco set in motion deep scrutiny of forensic science, culminating in the National Academy of Sciences’ 2009 report, “Strengthening Forensic Science in the United States: A Path Forward.”

* According to that report, aside from DNA, most forensic science isn’t science at all but a craft based on human interpretation, subject to all of the cognitive flaws one would expect.

* Most of forensic science operates without standards, accountability, or basic certification.

But I had some reservations:

* The show conveyed a sense that the lack of scientific rigor in forensic methods would be solved by shiny new high-tech gadgetry.  Too little attention was paid to the more basic flaws we should address: the lack of real data on which to base judgments about the sources of the evidence we find, and the absence of standard laboratory practices, such as blind testing, to protect against cognitive biases.

* The show failed to probe the weaknesses of some of the lamest forensic disciplines, such as tire and shoe print comparison and hair and fiber matching.  Some of this was mentioned, but only in passing.

* The segment on bite mark identification was particularly striking.  It did a good job of exposing how this “discipline” put an innocent man in jail for fifteen years.  But it made the problem seem like shoddy use of a legitimate method, when the issue is more fundamental: bite marks on skin are not consistent, and they distort as the underlying tissue swells, heals, and moves.

* I recall nothing about fraudulent forensics — so-called “dry labbing” that is now rocking the Jamaica Plain lab in Massachusetts, and that has shown up again and again in other places.  (Paging Fred Zain…paging Joyce Gilchrist…)

A worthy effort?  Yes, no question.  But I was hoping for better.

Reaction, readers?

I was a guest on WYPR Public Radio’s “Midday” program today, discussing Failed Evidence.  Today was the monthly “Midday on Science” show, and host Dan Rodricks and regular science contributor John Monahan asked great questions on everything from DNA to more traditional forensic sciences to eyewitness identification and false confessions.  Listeners asked terrific questions too.

You can hear the whole show by clicking here and then selecting the audio button.

Failed Evidence: Why Law Enforcement Resists Science will be the subject of two public forums this week, one in Washington, D.C., and the other in Baltimore.  Both events are free and open to the public.

On Wednesday, October 3, I’ll discuss the book at noon at American University’s Washington College of Law, 4801 Massachusetts Avenue N.W. (6th floor).  My talk will be followed by a panel discussion featuring former Assistant U.S. Attorney Dan Zachem and Professor Paul Butler of Georgetown University.  Full details are here.

On Thursday, October 4, I’ll lead a discussion of the book at 5:30 p.m. at the University of Baltimore School of Law, 1401 Charles Street.  The panel to follow will include Gregg Bernstein, the elected State’s Attorney for Baltimore.  Full details are here.

Today, September 20, at 5:30 p.m., I’ll discuss Failed Evidence: Why Law Enforcement Resists Science (NYU Press) at an author’s talk and panel discussion at John Jay College of Criminal Justice in New York City.  Full details are here.  The talk is free and open to the public.  It will be followed by a panel discussion featuring four members of John Jay’s faculty: Margaret Bull Kovera (Psychology), Nicholas D. K. Petraco (Forensics and Statistics), Lawrence Kobilinsky (Forensics and DNA), and Eugene O’Donnell (Police Science and Criminal Justice Administration).  The panel will be moderated by Zachary Carter, former U.S. Attorney for the Eastern District of New York.

In the July 23 issue of The New Yorker, “Words on Trial” introduces us to forensic linguistics.  Linguists, it seems, can now harness their knowledge to solve crimes.  The article describes a number of cases in which linguistic patterns expose who committed, or in some cases who did not commit, crimes that police could not solve.

It is an intriguing idea: linguists analyze various sorts of patterns and usages in writing and speech.  With written documents, they can detect patterns that go a long way toward identifying the author of a ransom note or a threat.

For me, what was so striking was how the same old forensic science questions from fingerprint analysis, tool marks, and hair and fiber comparisons emerge in this very new forensic discipline.  Is this a field whose power rests on human interpretation, in which the experience and knowledge of the analyst rule, as is true with fingerprint analysis?  If so, does it already suffer from the same deficiencies?  Or should forensic linguists strive for a data-based standard, as DNA analysis has?  Both approaches are on display in the article, though the former is featured as the probable future of the field.
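To make the data-based approach concrete, here is a minimal, purely illustrative sketch of one simple stylometric technique: profiling two writing samples by the relative frequencies of common function words and scoring the profiles with cosine similarity.  Nothing in it comes from the article; the word list, the sample texts, and the numbers are my own assumptions.

```python
# Toy stylometric comparison (illustrative only): profile each text by the
# relative frequency of common "function words," then compare the profiles
# with cosine similarity.  Real forensic work would require large corpora,
# validated error rates, and blind testing protocols.
import math
from collections import Counter

FUNCTION_WORDS = ["the", "a", "an", "of", "to", "in", "and", "but", "that", "is"]

def profile(text):
    """Return the relative frequency of each function word in `text`."""
    words = text.lower().split()
    counts = Counter(words)
    total = max(len(words), 1)
    return [counts[w] / total for w in FUNCTION_WORDS]

def cosine_similarity(u, v):
    """Cosine of the angle between two frequency vectors (1.0 = same direction)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

questioned = "the note says that the money is to be left in the park"
known_sample = "it is clear that the money is in the park and not in a bag"

score = cosine_similarity(profile(questioned), profile(known_sample))
print(f"profile similarity: {score:.3f}")  # higher = more similar usage patterns
```

Even this crude version exposes the core question: a similarity score is evidence only if we know how often unrelated writers would score just as high.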

After looking hard at the 2009 NAS report, we should be cautious about introducing another forensic methodology that cannot quantify its error rate, cannot state the probability that the answer it gives is correct, and therefore cannot truly state whether or not a particular piece of writing is in fact associated with a defendant.  One forensic linguist says that he does not testify that he knows who wrote the items he analyzes; he will only say that the language in a given item “is consistent with” the language in the defendant’s other writings.  That caution is absolutely appropriate, but I wonder whether a jury will overlook the nuance and simply hear “it matches.”
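A back-of-the-envelope calculation, with numbers I have made up purely for illustration, shows why that nuance matters: even a method with a seemingly low false-match rate produces mostly coincidental “matches” when the pool of plausible authors is large.

```python
# Hypothetical figures, for illustration only: suppose a linguistic comparison
# wrongly calls a non-author "consistent" 1% of the time, always flags the
# true author, and 10,000 people could plausibly have written the document.
false_match_rate = 0.01
pool_size = 10_000

coincidental_matches = false_match_rate * (pool_size - 1)  # ~100 innocent "matches"
true_matches = 1.0                                         # the actual author

# Probability that a person flagged as "consistent" really is the author:
p_author = true_matches / (true_matches + coincidental_matches)
print(f"{p_author:.1%}")  # roughly 1% -- a long way from "it matches"
```

Without a validated error rate, no one can even begin this arithmetic, which is exactly the gap the NAS report identified.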

Every new tool that promises to solve crimes deserves serious attention; some new methods will make the world safer from predators.  But let’s not take these new tools where we have gone blindly before: into the land of failed forensics.  Instead, courts should accept new tools only when rigorous methods actually support their claims.  Anything else will just generate a new wave of wrongful convictions.

The Los Angeles Times reports that fingerprint analysis, long a troubled part of the LAPD’s operations, has such a large backlog that work on property crimes will now be rationed.

In “LAPD to Ration Fingerprint Analysis to Deal With Backlog,” we learn that the fingerprint operation is so overwhelmed with requests to collect and analyze prints that each of the LAPD’s 21 stations and divisions will get to make only ten requests per month for priority analysis in property cases.  Others will be put on a waiting list.  Violent crimes, like homicides and sexual assaults, will not be affected by the rationing plan, but LAPD officials conceded that even for violent crimes, “it typically takes six to eight weeks to produce fingerprint results.”  And this is a very serious problem.  According to Michael Brausam, a detective in the LAPD’s Central Division:

The longer you leave these criminals out on the street, they’re likely going to be committing more crimes. And, if you do get a match on prints months later, it can be much harder to prove your case.

I discuss fingerprint analysis extensively in Failed Evidence.  It used to be thought of as the gold standard of forensics, before DNA came along to reveal the flaws in its methods.  Then came the 2009 National Academy of Sciences report “Strengthening Forensic Science in the United States: A Path Forward,” which showed that fingerprint analysis was not the science we had thought it was.

And now: rationing.  The situation in LA calls to mind an old Catskills joke.  Two ladies sit in the resort’s dining area.  One complains to the other: “The food is terrible in this place!”  The other replies, “Yes, and they serve such small portions.”  It looks like tightening municipal budgets have pushed us to the “small portions” phase.  So even if fingerprint analysis improves (and there are signs that it may, which I will discuss in a subsequent post), we won’t have enough of it to go around.