Reducing Human Error in Fingerprint Analysis

Posted on June 8, 2012


Image: Amazon

The human fingerprint has been used by law enforcement for over a hundred years. Until the development of DNA as a forensic tool, the fingerprint was the single most important type of crime-scene evidence. As a unique identifier, the fingerprint captured not only our forensic attention, but our collective armchair detective imaginations.

Pudd’nhead Wilson, after all, managed to indict slavery and racism with little more than a pattern of swirls and whorls of fatty oils.

Yet even though fingerprint analysis generally does not require tremendous scientific sophistication, fingerprints are not as easily read, analyzed, and presented in court as you might think. The fallibility of fingerprints is rarely recognized, especially in a culture that worships shows like CSI, Dexter, and Forensic Files.

In forensic science circles, however, the human errors in latent fingerprint analysis are increasingly under the magnifying glass. The National Institute of Justice (NIJ) and the National Institute of Standards and Technology (NIST) recently produced a report designed to address these issues. They assembled an expert working group to examine the role of human error and identify ways to reduce such errors.

Their key recommendations include what should be, in retrospect, obvious:

  1. Employ a system to identify and track errors and their causes.
  2. Establish policies and procedures for case review and conflict resolution, corrective action, and preventive measures.

They also produced this flow chart, which, if nothing else, demonstrates the complexity of the decision-making process for latent fingerprint examiners.

Image: NIJ Public Domain

While DNA is clearly the future of forensic science, fingerprints will continue to take center stage in crime scene investigations. Hopefully this report will help improve the science and reduce the fallibility of the process.