In the July 23 issue of The New Yorker, “Words on Trial” introduces us to forensic linguistics. Linguists, it seems, can now harness their knowledge to solve crimes. The article describes a number of cases in which linguistic patterns expose who committed, or in some cases who did not commit, crimes that police could not solve.

It is an intriguing idea: linguists analyze various sorts of patterns and usages in writing and speaking.  With writings, they detect patterns that can go a long way toward identifying the author of a ransom note or a threat.

For me, what was so striking was how the same old forensic science questions from fingerprint analysis, tool marks, and hair and fiber comparisons emerge in this very new forensic discipline. Is this a field whose power rests on human interpretation, in which the experience and knowledge of the analyst rule, as is true with fingerprint analysis? If so, does it already suffer from the same deficiencies? Or should forensic linguists strive for a data-based standard, such as DNA? Both approaches are on display in the article, though the former is featured as the probable future of the field.

After looking hard at the 2009 NAS report, we should be cautious about introducing another forensic methodology that cannot quantify its error rate, cannot state the probability that the answer it gives is correct, and therefore cannot truly state whether or not a particular piece of writing is in fact associated with a defendant.  One forensic linguist says that he does not testify that he knows who wrote the items he analyzes; he will only say that the language in a given item “is consistent with” the language in the defendant’s other writings.  The caution is absolutely appropriate, but I wonder whether a jury will overlook this nuance, and simply hear “it matches.”

Every new tool that promises to solve crimes deserves serious attention; some new methods will make the world safer from predators. But let’s not take these new tools where we have gone blindly before — into the land of failed forensics. Instead, courts should accept them only when they can demonstrate rigorous methods that support their claims. Anything else will just generate a new wave of wrongful convictions.

  1. […] Law professor David A Harris posted a great commentary on his blog Failed Evidence. […]

    • Many, many thanks for the comment. I think the tension between the work you do and that of the others mentioned in the NYer article is a very important issue that we need to pay attention to now, on the front end, instead of later, when we’re dealing with wrongful convictions.

  2. Hey, thanks. It’s a powerful article, but the “scientist as magician” aspect, as well as the way that the scientists speaking in it assert the primacy of human interpretation over data analysis, looks so much like what is wrong with other forensic disciplines. I couldn’t resist pointing this out.

  3. Frank says:

    Wow, well said.
