Posted 10/12/08


Commonly accepted techniques may lack scientific value

    For Police Issues by Julius (Jay) Wachtel.  On February 17, 2004 Texas inmate Cameron Todd Willingham was strapped to a gurney and given a lethal injection.  He had been convicted of arson and murder in a 1991 house fire that killed his three daughters.  Evidence against him included the statement of a jailhouse informer who said that Willingham confessed and scientific testimony by the State Fire Marshal’s office that the fire was deliberately set.

     Willingham protested to the very end that he was innocent.  Now it looks like he might have been right.  In August 2008 the Texas Forensic Science Commission agreed to review a 2006 report by five nationally recognized fire experts who refuted the “arson indicators” cited by Texas authorities at Willingham’s trial and said the fire was accidental.  One of these indicators, crazed glass, was once thought to be evidence of a superhot fire fed by accelerants.  It’s now known to be caused by spraying water on hot glass.  According to the experts, another indicator, burn patterns in the floor thought to suggest accelerants, was meaningless in a fire that burned as hot as the one that destroyed Willingham’s home.  And so forth.


     In addition, the Commission will be considering the wrongful conviction of Ernest Ray Willis, who spent 17 years on death row for an arson/murder much like the Willingham case.  While preparing to retry Willis (his case had been overturned on technical grounds) the prosecutor concluded that the State Fire Marshal’s “scientific” testimony was mistaken and that the fire was accidental.  Willis was released.

     On March 11, 2004 terrorist bombings in Madrid train stations killed 191 and injured two-thousand more.  During their investigation Spanish police recovered fingerprints from inside a bag of unexploded detonators and furnished images of the prints to the FBI.

     FBI fingerprint examiners digitized the images and ran them through the national database.  They soon identified the prints as belonging to Brandon Mayfield, a Portland attorney who was Muslim and once represented a suspected terrorist in a civil case.  Confident in their conclusions, the FBI ignored Spanish investigators who insisted that the prints didn’t match and that the bombers were Moroccan terrorists with no known connection to Al Qaeda or the U.S.  On May 6, 2004 the FBI arrested Mayfield as a material witness in the bombings and searched his residence.  Within days Spain positively identified the man who left the prints as a known Algerian terrorist.  Two weeks after arresting Mayfield the FBI let him go.  He got $2 million in taxpayer cash for his troubles.

     How could this happen?  Found or “latent” fingerprints are nothing like the complete, neatly inked fingerprints taken from job applicants and persons arrested for crimes.  Instead, they’re often fragmentary, smudged, distorted and overlapping, which can make it difficult for examiners to identify the “minutiae” -- the islands, dots, bifurcations and ridge endings on which comparisons rely.  (Even a “good quality” latent reveals far less detail than the same finger fully inked.)

     Every State and the FBI have large repositories of digitized fingerprint cards.  The FBI holds prints for nearly one-hundred million persons, split about evenly between arrestees and applicants.  Running recovered prints through these databases yields cards with the closest matches.  It’s up to local examiners to order those of interest and microscopically compare them to the latent to see if there’s a fit.  Generally at least seven minutiae must match, while a single inconsistency disqualifies.  Extrinsic factors such as investigators’ suspicions must never intrude on an examiner’s judgment; if they do, as apparently happened in the Mayfield case, the examiner (in the FBI’s case, several examiners and their boss) might mistakenly “find” matching minutiae in the latent that simply aren’t there.
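The decision rule examiners apply -- enough matching minutiae, and zero inconsistencies -- can be sketched in a few lines.  This is a deliberately simplified illustration; real comparisons weigh the type, position and orientation of each minutia, not a bare tally:

```python
# Toy sketch of the fingerprint comparison rule: at least seven matching
# minutiae, and any single inconsistency disqualifies. Real AFIS matching
# is far more involved; this only illustrates the logic described above.

MIN_MATCHING_MINUTIAE = 7  # common rule-of-thumb threshold

def compare_prints(matching_minutiae: int, inconsistencies: int) -> str:
    """Return an examiner-style conclusion for a latent-to-card comparison."""
    if inconsistencies > 0:
        return "exclusion"          # one inconsistency rules the card out
    if matching_minutiae >= MIN_MATCHING_MINUTIAE:
        return "identification"     # enough minutiae, no inconsistencies
    return "inconclusive"           # too few minutiae to call either way

print(compare_prints(9, 0))   # identification
print(compare_prints(12, 1))  # exclusion: an inconsistency trumps matches
print(compare_prints(5, 0))   # inconclusive
```

Note that the rule is asymmetric: matches accumulate toward an identification, but a single disagreement ends the analysis outright.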

     Firing a weapon leaves markings on bullets and cartridge casings that are supposedly unique to that particular gun. If cartridge casings or bullets found at a crime scene or extracted from a body have a sufficient number of identical markings and no inconsistencies examiners will testify that they were also fired by that gun.

     That’s the belief.  However, a recent report by the National Academies concludes that while “one can find similar marks on bullets and cartridge cases from the same gun,” the assumption that only that gun could have produced those markings “has not yet been fully demonstrated.”

     Even if we believe that ballistics evidence is reliable, humans aren’t.  In an ideal comparison -- a recovered bullet in excellent condition set beside a bullet test-fired through the same gun -- the striations left by the barrel line up perfectly.  In the real world, though, bullets are often deformed and fragmented, making comparison difficult.  Detroit PD’s lab was recently shut down after State Police auditors found three “false positives,” cases where examiners mistakenly reported a match that didn’t exist.

     Why did the State come in?  After a recent murder conviction a retired State firearms examiner conclusively demonstrated that shell casings found at the crime scene came from at least two weapons, not one as the police lab claimed.  The judge dismissed the case, which will be retried.

     Everyone’s heard of Phil Spector, the celebrity murder defendant whose first trial ended in a hung jury (his retrial will begin any day).  There’s no disputing that the victim, Lana Clarkson, died from a bullet discharged while a gun barrel was in her mouth.  Spector claims that he was six feet away when the gun went off.  His claim was propped up by blood spatter expert Stuart James, who said that droplets could travel six feet.  But Sheriff’s criminalist Dr. Lynne Herold, who admitted she had taken one of James’ courses, said no -- their range was at most three feet.  That little duel is likely to replay itself.  Meanwhile, what are we to think of blood spatter evidence?  Is it meaningful or not?


     Maybe CSI isn’t all that it’s cracked up to be.  Physical evidence has to be collected, bagged, tagged and interpreted by fallible humans who can slip at any stage of the process, damaging the goods, making them out to be what they’re not, or inferring that they mean something they don’t.  It’s happened with arson, fingerprints, ballistics and blood spatter.  Last week we mentioned that goofs leading to wrongful convictions have even happened with DNA, which is particularly scary given its aura of infallibility.

     According to a recent article in the New York Post, the National Academy of Sciences is expected to shake things up this December with a report that will question the value and accuracy of accepted forensic techniques.

     Not to worry, Joe Friday.  Looks like shoe leather will be in style a while longer.

UPDATES

1/14/21  DOJ issued a formal rebuttal to criticisms in the President’s Council of Advisors on Science and Technology’s 2016 report that challenged the validity of comparison techniques such as those used by firearms and toolmarks examiners to match firearms to recovered bullets and shell casings. According to DOJ, the report’s conclusions are inaccurate but are being used by courts to exclude worthwhile testimony.

11/5/19  A New York Times investigation revealed that alcohol breath tests are under challenge across the U.S. Misconduct by operators, faulty technology and ill-maintained, uncalibrated devices are blamed for chronic inaccuracies. In New Jersey, 13,000 drivers were convicted with machines “that hadn’t been properly set up.” Tens of thousands of like cases are pending elsewhere.



Guilty Until Proven Innocent     Better Late Than Never (I)   (II)     State of the Art...Not!

Taking the Bite Out of Bite Marks     One Size Doesn’t Fit All     More Labs Under the Gun


Arson investigation issues

Posted 10/5/08


Can police crime laboratories be trusted?

    “Of the 33 adjudicated cases from the Wayne County Prosecutor’s Office that were reanalyzed, 3 exhibited Class I inconsistencies.  In total, this equates to approximately 10% of the completed firearms cases having significant errors.  On average, the DPD firearms unit analyzes 1,800 cases per year.  If this 10% error rate holds, the negative impact on the judicial system would be substantial, with a strong likelihood of wrongful convictions and a valid concern about numerous appeals.”

     For Police Issues by Julius (Jay) Wachtel.  These words aren’t from do-gooders wringing their hands about possible miscarriages of justice.  They’re from an official September 2008 report by the Michigan State Police setting out the preliminary findings of an audit of the Detroit crime lab’s firearms unit.


     Firearms examiners often test-fire recovered guns hoping to link them to crimes.  Firing a weapon leaves markings on bullets and cartridge casings that are supposedly unique to that specific gun.  If a sufficient number of identical marks are present in the same locations on cartridge casings or bullets recovered at a crime scene (or extracted from a body) it’s evidence that they were fired by the same gun.  Naturally, great care must be taken to ensure that there are enough points in common.  It’s also critical that there are no dissimilarities; just as in fingerprinting, a single inconsistency rules out a gun as the source of a particular bullet or cartridge casing.

     Michigan State Police auditors reviewed 200 firearms cases.  Nineteen had “either Class I or Class II inconsistencies”.  A Class I error means that an examiner erroneously declared a match.  Such “false positives” can obviously lead to a wrongful conviction.  In Class II errors, “false negatives,” a match was overlooked, possibly letting a guilty person go free.
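The projection in the report quoted at the top of this post is simple arithmetic, and worth reproducing:

```python
# Figures from the Michigan State Police audit quoted above: of 33
# reanalyzed Wayne County cases, 3 showed Class I (false positive) errors,
# and the DPD firearms unit handles roughly 1,800 cases per year.

reanalyzed = 33
class_i_errors = 3
annual_cases = 1800

error_rate = class_i_errors / reanalyzed
print(f"Class I error rate: {error_rate:.1%}")  # about 9.1%, i.e. roughly 10%
print(f"Projected Class I errors per year: {error_rate * annual_cases:.0f}")
```

At roughly 10%, that works out to something on the order of 160-plus potentially erroneous matches a year from a single unit -- which is why the auditors warned of "a strong likelihood of wrongful convictions."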

     Detroit PD responded by shutting down the entire crime lab -- not just the firearms unit -- and turning over all forensic analysis to the State.  It then set out on the unenviable task of reviewing past cases involving testimony by firearms examiners.  There was little choice, as defense attorneys immediately announced they would begin questioning everything the lab had ever done.  Meanwhile displaced lab employees mounted a protest, claiming that problems in the firearms unit did not affect the good work the lab was doing elsewhere, from fingerprint comparison to DNA.  Their view was undercut by the words of their own superior, who in a September 2008 memo reported that the lab’s overworked and underpaid staff was running four-thousand chemistry and biology cases behind.  (She retired when the lab closed.)

     Alas, Detroit isn’t unique.  In November 2002 Houston shut down the DNA section of the police crime lab after an investigation by a TV station revealed a history of shoddy work.  A subsequent audit of the lab’s DNA work disclosed “a wide range of serious problems ranging from poor documentation to serious analytical and interpretive errors that resulted in highly questionable results being reported by the Lab.”  Issues were also reported in firearms, trace evidence and drug analysis.

     What’s worse, at least two cases of wrongful conviction have been attributed to Houston DNA errors.  Josiah Sutton served four and one-half years of a 25-year term after the lab incorrectly determined that his DNA was present in a sperm sample.  George Rodriguez served seventeen years of a 60-year term; his conviction was due in part to bad witness ID, in part to a mistaken failure to exclude him as a DNA donor, and in part to an incorrect conclusion that a hair found on the victim was likely his.

     That’s not all.  In January 2008, one and one-half years after the Houston lab’s DNA section reopened, its new supervisor was allowed to resign for helping staff members cheat on proficiency exams.  (Amazingly, she was then hired to run the State lab’s DNA section.)

     It happens to the best of labs.  In May 2005 a grievous analytical error at the Virginia State crime lab, reportedly one of the nation’s finest, prompted Governor Mark Warner to order the re-examination of 150 DNA cases.  The impetus was a 1985 case in which a prisoner on death row, Earl Washington, came within nine days of being executed before a team of pro-bono defense lawyers finally got him a stay.  In 1993, with the threat of execution again looming, a DNA test (which the State partly botched) got Washington’s sentence commuted to life imprisonment.  It would take another seven years and a correctly performed DNA procedure to conclusively clear Washington and identify the guilty party.  By the time he was finally released in 2000 the innocent man had spent seventeen years behind bars.

     Mismanagement and lax quality control have vexed crime laboratories for decades.  O.J. might never have been in a position to pull his shenanigans in Vegas were it not for a lab goof.  (His acquittal in the 1994 murder was in large part due to evidence of widespread contamination at the LAPD crime lab.)  But trying to keep labs on the straight and narrow with after-the-fact controls such as accreditation visits is a loser’s game.  As long as facilities are tidy, paperwork is in order, equipment is in proper repair, manuals are up to date and everyone on staff is certified, a “pass” is virtually guaranteed.


     Everyone wants to solve crime through science and technology.  But as auditors in Houston pointed out, running a good lab is an expensive proposition.  When resources are limited -- and when aren’t they? -- it’s easy to wind up with a production-oriented pressure cooker that encourages shortcuts and sloppy work. Throw in a dash of unskilled examiners and a pinch of poor oversight and it’s a recipe for disaster.

     Next week we’ll look at issues in forensic techniques, from fingerprinting to ballistics. Stay tuned!

UPDATES

4/7/22  Defense lawyers have long questioned the conclusions of Orange County crime lab DNA expert Mary Hong, whom they claim tilted her conclusions to favor prosecutors. In 2016 she testified that a 2008 murder victim’s DNA was on the defendant’s wrist. But jurors hung. Jailed since 2012, the accused was recently released on a manslaughter plea when Hong conceded during retrial that she couldn’t match the DNA. All her cases - she worked at the lab for 30 years - may need to be reviewed.

2/17/22  San Francisco D.A. Chesa Boudin slammed SFPD for using DNA profiles of bodily fluids voluntarily contributed by victims of sexual assault to help identify the perpetrators of ordinary crimes. Doing so, which he called unethical and, search-and-seizure wise, illegal, reportedly led police to arrest a sex-crime victim for felony theft.



Technology’s Great - Until It’s Not     Accidentally on Purpose     Better Late Than Never (I)

Wrongful and Indefensible     False Confessions Don’t Just Happen     More Labs Under the Gun

Posted 9/28/08


Is brain scanning the new polygraph?

     For Police Issues by Julius (Jay) Wachtel.  Hey, Dick Tracy: don’t knock yourself out pounding the pavement!  There’s a far easier way to solve a whodunit.  Have a suspect put on a helmet full of electrodes.  Then show him a series of photos, including some neutral pictures and some of the crime scene.  Looking at the photos will stimulate brain activity, sending electrical signals through the helmet to an EEG machine.  You’ll wind up with an electroencephalogram, a chart that identifies the precise regions of the brain that the images stimulated.

     Now look closely: if “experiential” areas of the brain “light up” for the crime scene photos, but not for the others, you’ve got your man.  Hook him, book him and reward yourself with a trip to Winchell’s!  If not, move on to the next chump.


     According to an emerging technology known as BEOS, for “Brain Electrical Oscillations Signature,” there are places in the brain that store memories of events that one actively experienced, not just passively observed.  Proponents claim that’s what makes it possible to distinguish between a killer and someone who merely discovered a body.  Peddled in the U.S. by companies including No Lie MRI and Cephos for use in everything from commercial disputes to intelligence, the technology supposedly far surpasses polygraphy in accuracy.  In fact, it was recently used by prosecutors as evidence in a murder case in Mumbai, India.  To clear herself, a woman charged with poisoning her husband volunteered for a BEOS test.  It wasn’t a wise choice -- the test said she did it.  Oopsie!

     No Lie and Cephos aren’t alone.  A competing technology known as Brain Fingerprinting also gauges the brain’s electrical reaction to visual and aural stimuli, but in a fundamentally different way.  Developed by neuroscientist Larry Farwell, it relies on a well-established neurological phenomenon, the so-called “P300 wave,” an involuntary electrical impulse that our brains generate whenever we recognize (have an existing memory of) something, be it an object or a piece of information.

     For example, tell a suspect that he’s about to see a picture of the murder weapon, but don’t say what it is.  Strap on the helmet (on him, not you) and run a series of slides, say, a gun, a knife, a baseball bat, and what was actually used, Auntie’s embroidery needle.  If he emits a P300 wave when the needle comes up, and only when it comes up, have a scrumptious jelly-filled gut buster on us!  If not, move on.  To his credit, Farwell readily admits that the process has limitations; it won’t work, for example, if word of the needle got out to the public, since everyone would then react to its image.  But he claims that when investigators come up with something only the real perp knows, the technology is virtually foolproof.
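The “oddball” logic behind the needle test can be sketched with made-up numbers.  The amplitudes below are hypothetical stand-ins for averaged EEG readings; real P300 analysis involves filtering, epoch averaging and statistical testing, none of which this toy attempts:

```python
import statistics

# Toy sketch of the P300 "oddball" logic described above, using invented
# average amplitudes (microvolts) for the ~300 ms post-stimulus window.

responses = {                    # mean response per slide -- hypothetical values
    "gun": 1.2,
    "knife": 0.9,
    "baseball bat": 1.1,
    "embroidery needle": 6.8,    # the actual murder weapon
}

# Baseline: average response to the irrelevant (non-probe) items.
baseline = statistics.mean(v for k, v in responses.items()
                           if k != "embroidery needle")

# Flag an item only if its response clearly stands out from the baseline.
recognized = [k for k, v in responses.items() if v > baseline * 3]
print(recognized)  # ['embroidery needle']
```

The point of the sketch is Farwell’s caveat: if the public knew about the needle, every item would provoke a comparable response, the baseline would rise, and nothing would stand out.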

     Alas, neither BEOS nor Brain Fingerprinting has made it into the judicial mainstream.  (Brain Fingerprinting claims otherwise, but the episodes cited on its website hardly set a precedent.)  According to the landmark Frye decision, before expert scientific testimony can come into court its validity must be widely acknowledged.  But the kingdom of the nerds remains highly skeptical.  As J. Peter Rosenfeld, a pioneer of using brain waves in lie detection, points out, there’s a lack of peer review and replication, the sine qua non of scientific acceptance.  Other neuroscientists feel likewise.  “Well, the experts all agree,” says Michael Gazzaniga, director of a UCSB mind-research center, referring to BEOS.  “This work is shaky at best.”

     Unlike the polygraph, which records physiological changes supposedly brought on by the stress of lying, neither BEOS nor Brain Fingerprinting directly measures deception.  They’re also far more passive, as no interaction is required between tester and subject.  Keeping the two apart prevents contaminating the results, but it also means that EEG technicians won’t get what polygraphers really aim for.  It’s the lie detector’s dirty little secret that its real worth isn’t in the squiggles it produces -- the National Academy of Sciences considers those close to worthless -- but in the incriminating statements, admissions and full-blown, tearful confessions that scared, stressed-out subjects occasionally make while in the chair.

     But it’s not just about ends -- means are also important.  The privacy and liberty implications of brain-wave technology are (pardon the pun) mind-boggling.  Just to mention one issue, polygraph subjects are free to clarify and challenge each question before answering.  In contrast, EEG screening is purely passive, allowing sneaky administrators to venture into areas far afield of their manifest purpose without the test subject realizing or having a realistic opportunity to refuse.


     What’s more, we might not even know that we’re being checked out.  Technology now in development allows the remote detection of “anxious” people.  FAST, an acronym for “Future Attribute Screening Technology” (how’s that for an Orwellian nightmare?), uses cameras and sensors to screen passers-by for hostile thoughts and intentions, assessing characteristics such as facial expressions and pulse rate.  Imagine the false positives that a gaggle of ACLU lawyers would produce!

     Well, we’ve got a label for these precious new techniques:  Mindboarding.  Feel free to use it, but be sure to say that you saw it first on Police Issues!



Polygraph: Science or Sorcery?

Posted 5/11/08


DNA random match probabilities may be overstated

     For Police Issues by Julius (Jay) Wachtel.  Looking for a growth industry?  Think genetics.  With more than one million profiles, California’s DNA databank is the third largest in the world, trailing only those of the FBI and Great Britain.  At its 1990 debut the Golden State’s database only kept track of sex offenders, but it has since expanded to include everyone convicted of a felony.  What’s more, starting next year DNA specimens will be collected from every adult arrested for a felony, a move that should increase the databank’s size by 390,000 profiles each year.


    When DNA got its start there were no databanks, so police had to have someone in mind to make crime scene DNA useful.  Now it’s possible to run unknown DNA through massive databanks like California’s hoping for a “cold hit.”  A recent example is the case of John Puckett, a previously convicted rapist who is appealing his conviction for a thirty-year-old rape/murder.  An expert testified that there was only one chance in 1.1 million that the match between Puckett’s DNA and the crime scene sample could have happened at random.  With a probability of error that low, prosecutors suggested there was only one explanation: both samples came from the same source.  Not unexpectedly, jurors agreed, sending the 70-year-old to prison.

     Since the human genome is exceedingly large, DNA is only typed at thirteen known places (“loci”) in the strand.  Each location has two chemical sequences (“alleles”), one inherited from each parent.  Scientists have determined how often specific loci/allele combinations occur in different populations, such as Caucasian males.  Single combinations are commonplace and can be present in one out of every three or four persons.  Multiple loci/allele combinations occur less frequently.  For example, the probability of randomly selecting a DNA profile with four specific loci/allele combinations might be 14 in 100,000.
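Under the usual assumption that the loci are inherited independently, the per-locus frequencies simply multiply (the so-called product rule).  A sketch with hypothetical frequencies:

```python
# Random match probability under the product rule: per-locus genotype
# frequencies are multiplied, assuming the loci are independent.
# The four frequencies below are hypothetical, for illustration only.

locus_frequencies = [0.30, 0.25, 0.12, 0.05]

rmp = 1.0
for f in locus_frequencies:
    rmp *= f

print(f"Random match probability: {rmp:.5f}")  # 0.00045, i.e. 45 in 100,000
print(f"About 1 in {1/rmp:,.0f}")
```

This is why adding loci shrinks the probability so dramatically: each additional combination multiplies in another small fraction.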

     Just as with fingerprints, a single dissimilarity between DNA profiles means that they’re not from the same person.  If no differences are observed, a sufficient number of identical loci/allele combinations must be present to suggest that they have a common origin.  How many is enough?  There’s no set answer.  Five and six loci/allele combinations can yield probabilities of a random match in the one-in-a-million range, while seven or more can generate probabilities in the hundreds-of-millions, billions, trillions or even quadrillions.

     When a suspect is independently developed a subsequent DNA match obviously carries enormous weight. Still, the DNA match alone is not a probability of guilt -- it’s an estimate of the likelihood that DNA drawn at random will match the profile of crime scene DNA.  (Probability of guilt requires that all other pertinent factors be considered. This requires use of Bayes’ theorem.)   Random match probabilities also assume that we only draw once from a population.  But that’s not what happens in cold hits. No one knows whether the match that cooked Puckett’s goose came on the computer’s first draw or last (at the time California had 338,000 DNA profiles online.)  Had the expert witness followed the recommendation of the National Academy of Sciences he would have multiplied the random match probability of one in 1.1 million by the number of draws (338,000), yielding a true random match probability -- in effect, the chance of mistakenly identifying an innocent person -- of one in three.
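The adjustment the National Academy recommended is a straightforward multiplication: the single-draw random match probability times the number of profiles searched.  Using the figures from the Puckett case:

```python
# Database-search adjustment described above: multiply the single-draw
# random match probability by the number of profiles searched.
# Figures are those cited in the Puckett case.

rmp = 1 / 1_100_000        # expert's random match probability
database_size = 338_000    # California profiles online at the time

adjusted = rmp * database_size
print(f"Adjusted probability: {adjusted:.2f}")  # 0.31 -- about one in three
```

The intuition: a one-in-1.1-million coincidence is rare on a single draw, but give the computer 338,000 chances and a coincidental hit somewhere in the databank becomes quite plausible.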

     Interestingly, the expert told a reporter that he didn’t mention the adjustment, which he agreed was a superior approach, because the judge wouldn’t allow it.  After the trial jurors said that the probability of one in 1.1 million was a key factor in deciding to convict. Asked if correcting it might have affected the verdict a juror said, “of course it would have changed things. It would have changed a lot of things.”

     Bigger DNA databases will yield more matches.  While that seems beneficial, more profiles mean more draws, so the probability that matches may be caused by chance will increase.  Of course, random match probabilities with denominators that approach or exceed the population of the U.S. or the planet will remain noteworthy.  In any event, understating the probability that a match might point to the wrong person is no solution.  At least one expert has already warned that an invaluable tool for freeing the innocent -- DNA -- could inadvertently become an instrument of wrongful conviction.


     Only days ago Puckett’s appeal was argued before the California Supreme Court.  Its decision is expected soon. In the meantime keep away from the lottery.  The probability of hitting it is so low that if you do, it could be evidence that you fixed it!



Is Your Uncle a Serial Killer?     DNA: Proceed With Caution

Posted 12/7/07


Lab goofs and dueling “experts” give forensics a black eye

     For Police Issues by Julius (Jay) Wachtel.  New York State’s Inspector General recently recommended that criminal charges be considered against the retired director of the New York Police Department’s crime lab and three former analysts for botching thousands of drug tests in 2002.  Investigators claim that analysts took shortcuts when analyzing large seizures, falsely certifying that every container of suspected drugs was tested, and that managers who suspected something was amiss turned a blind eye. The lapse caused NYPD to start re-examining 3,000 individual drug tests last March. However, by that time more than 700 had been destroyed, bringing every conviction based on those tests into question.


     Problems at crime labs are nothing new.  In June 2007 an investigative panel cast doubt on thousands of convictions in Houston, calling its police lab deficient “across the board,” with serious errors in ballistics, drugs, DNA and serology.  The damage was not merely hypothetical, with mistakes responsible for at least three wrongful convictions: Ronald Taylor, who served 14 years because the lab failed to find the real perpetrator’s DNA on a bedsheet, and George Rodriguez and Josiah Sutton, who served 17 and 4 ½ years respectively due to faulty serology.  Nearly two-hundred other cases are under review.

     In May 2005 Virginia’s Governor ordered a review of 150 cases processed through the State’s crime lab after two botched DNA tests nearly led to the execution of Earl Washington, Jr., who served 18 years after being wrongfully convicted of rape. Washington was only nine days away from lethal injection when discrepancies in the case prompted the prior Governor to commute his sentence to life imprisonment.  A properly conducted DNA test later proved that the perpetrator was an already-convicted serial rapist.  Auditors attributed the Virginia lab’s sloppy work to pressures to increase productivity.  A Federal civil jury awarded Washington $2.25 million in compensation.

     Two months after terrorists bombed commuter trains in Madrid, leaving 191 dead and some 2,000 injured, FBI agents arrested Portland attorney Brandon Mayfield as a material witness.  FBI fingerprint examiners said they matched Mayfield’s fingerprints to latent prints found by Spanish police on a bag of unexploded detonators.  Confident that they had the right man (Mayfield is Muslim and represented a suspected terrorist in a civil action), the Feds refused to believe Spanish experts who insisted that the prints were not Mayfield’s.  A chastened FBI eventually apologized when Spanish investigators positively identified the fingerprints as belonging to an Algerian suspect.

     It’s not just lab goofs that give forensics a black eye.  In the recent Phil Spector trial renowned experts argued about, well, everything -- from the cause of the injury to the victim’s tongue, to how far blood spatter can travel, to whether the victim could have coughed after being shot. Spector’s trial is remarkably similar to the 2004 murder trial of Idaho resident Craig Perry, who insisted that the uncle he was accused of shooting committed suicide.  Thanks to blood spatter expert Stuart James, the same witness who raised enough doubt to hang Spector’s jury, Perry won an acquittal. (Demonstrating the whimsical, musical-chairs aspect of forensic “science,” another of Spector’s experts, Dr. Vincent Di Maio, testified against Perry.  Back then Di Maio was still Chief Medical Examiner for San Antonio and working for prosecutors.)


     A litany of lab disasters, dueling experts, wrongful convictions and bizarre acquittals (O.J. and Robert Blake come to mind) has done little to reassure a skeptical public about the merits of physical evidence.  Police, prosecutors, courts and juries must be confident in the accuracy of laboratories and the trustworthiness of government witnesses.  That’s hard to do when labs and experts are captive parts of the law enforcement establishment.  Regaining confidence in forensics calls for a national system of independent, government-funded laboratories, much like the National Institutes of Health, that are operated and controlled by top-notch scientists.  Anything less is not good enough.



People do Forensics     Guilty Until Proven Innocent     Better Late Than Never (I)   (II)


Junk science links     US DOJ/OIG report on Brandon Mayfield

Posted 11/27/06


Pop psychology can lead investigators astray

     For Police Issues by Julius (Jay) Wachtel.  “That really ordinary guy living next door could be a serial killer.”  Dave Shiflett of Bloomberg News says that’s the lesson we can draw from “Inside the Mind of BTK,” John Douglas’s new book about the infamous serial killer Dennis Rader, who tortured and murdered ten Wichita-area residents between 1974 and 1991.

     But, wait!  John Douglas is the most famous FBI profiler ever, an author of several true-crime best sellers and the model for Jodie Foster’s superior in “Silence of the Lambs”. If a sick puppy like BTK can seem so “ordinary”, how could he be identified through profiling?


     That, according to a lengthy exposé in The New Yorker (“Dangerous Minds,” 11/12/07), is the problem. John Douglas and his FBI colleagues told Wichita police that BTK was an American male with a decent IQ, that he drove a decent car, liked to masturbate, was selfish in bed, a loner (but could get along socially), uncomfortable with women (but could have women as friends), maybe married, maybe not (but if married, his wife was younger or older), and so forth.  Thankfully, officers eventually managed to solve the case sans profile.  Rader was nothing like the FBI suggested. He was married, with children, active in his church and a pillar of the community.

     Profiling is one of several psychological techniques, along with investigative hypnosis and the recovery of repressed memories, that gained popularity during the freewheeling 1980s.  Although the latter methods have been thrashed for over-promising, under-performing and generally leading investigators astray, profiling lives on, its findings so elastic that they can seldom be disproven.

     It’s when profilers get specific that the nonsense becomes obvious. On the morning of January 21, 1998, Stephanie Crowe, 12, was stabbed to death in her Escondido (Calif.) home while the family slept. Detectives soon zeroed in on her reticent 14-year-old brother, Michael.  After relentless interrogation, he confessed and implicated two friends.  Both got raked over the coals; one confessed while the other didn’t.  Police arrested all three.  They and prosecutors remained confident in the case even after the coerced statements were suppressed. After all, didn’t the FBI profile conclude that the murder was planned?  Didn’t profilers say that the killer had “familiarity, comfort and knowledge” of the residence and the victim’s bedroom?

     Months later, while the boys awaited trial, a violent, mentally ill transient whom detectives originally discounted as a suspect (in part because of the FBI profile) was arrested when DNA testing revealed that those spots on his clothes were the victim’s blood.  Charges against the boys were dropped and the man was convicted and imprisoned.

   On November 5, 2003 Gary Ridgway, the “Green River Killer,” pled guilty to murdering 48 women in King County between 1982 and 1998.  The investigation dragged on for twenty years and produced several FBI profiles, the first prepared by -- you guessed it -- the celebrated John Douglas.  Their conclusions: the killer was likely an unemployed transient who had left the area and was either dead or in prison.

     Fortunately, the cops had Ridgway in mind all along.  Deputies knew that the married truck painter, a local resident, had a reputation for picking up prostitutes and playing rough. In 2001 new DNA techniques matched Ridgway to four of the victims. He got life without parole.

     During the 1996 Atlanta Olympics a bomb exploded in a city park, leaving two dead and more than one hundred injured. FBI agents immediately focused on Richard Jewell, the security guard who found the device before it detonated and sounded the alarm, undoubtedly saving many. But the FBI didn’t see him as a hero.  Convinced that the chubby bachelor who lived with his mother fit the profile of a lone bomber, the Feds searched his home and conducted an exhaustive, highly public investigation.  Jewell was cleared after two months.  But the stain on his reputation never disappeared.

     In 2003 police finally caught up with the man responsible.  Eric Rudolph had used identical devices to bomb the park and a string of abortion clinics. He confessed and got life without parole.

     It’s the patina of science that makes profiling so disturbing, lending confidence to conclusions with no more factual basis than the prognostications of a horoscope.  Although recent studies seriously challenge the technique’s reliability, the FBI’s thirty-odd profilers remain on the job, reportedly fielding more than one thousand requests from local police each year.


     More than twenty years after its inception, profiling chugs on, the embarrassing detritus of a decade when overburdened police and prosecutors were seduced by the promises of pop psychology. Let’s hope it doesn’t take us another twenty to rediscover that it’s shoe leather, not magic, that solves crime.


7/24/17  Mark Safarik, the former FBI profiler whose testimony was critical in the murder conviction of Raymond Lee Jennings, now agrees that he was wrong. It’s a bit late, as the wrongfully convicted man’s exoneration didn’t come until after he had spent eleven years behind bars.


Posted 11/20/07


Its usefulness is mostly as a prop

     For Police Issues by Julius (Jay) Wachtel.  Exposing a stunning breach of national security, Nada Prouty, 37, a former FBI and CIA agent, pled guilty this month in D.C. Federal court to naturalization fraud, illegal computer access and conspiracy. Admitted in 1989 on a student visa, the Lebanese immigrant staged a sham marriage and gained permanent residency. In 1997, now-citizen Prouty was hired by the FBI and allegedly began passing top-secret information about Hezbollah to accomplices.  A few years later the super-achiever landed in the CIA, an even better perch from which to compromise American secrets.


     So what’s the rub?  Prouty sailed through FBI and CIA pre-employment polygraph exams, supposedly the toughest in the universe.  In all likelihood she would still be a mole except that her name came up during an investigation of her brother-in-law, Talal Chahine, who allegedly channeled millions of dollars to Lebanese militants.

     The history of lie detection is replete with disasters. None seems worse than the case of Aldrich Ames, a CIA agent who got rich by exposing his colleagues to the USSR (Ames’ treachery led to the execution of several Soviet citizens who were spying for the U.S.)  While pocketing bundles of cash Ames passed two routine CIA polygraphs, and when caught bragged that he had never employed countermeasures.

     Ames wasn’t lying.  In an exhaustive 2001 report, the National Academy of Sciences concluded that the polygraph is worthless for screening job applicants and employees.  It held out a bit more hope when polygraphs are used for investigating specific, known events (i.e., crimes), but cautioned that research that supports this more limited application lacks scientific validity and probably overstates the technique’s accuracy.

     That’s a warning to take to heart.  Between 1982 and 1998 forty-two women, mostly prostitutes, were murdered in King County, Washington.  Most of their bodies were found in or near the Green River.  Suspicion soon fell on Gary Ridgway, a truck painter whom prostitutes accused of rough treatment.  Ridgway took and passed a police polygraph.  In 2001, improved DNA techniques proved that he was indeed the killer.  Ridgway was arrested and plea-bargained to life without parole.

     Polygraphs are frequently used to narrow the field of suspects.  They are routinely administered to the parents and caregivers of missing and abducted children.  Results are not reassuring.  In the 1997 disappearance of Sabrina Aisenberg, local police, who suspected the parents, called polygraph results “inconclusive,” while an ex-FBI polygrapher hired by the defense insisted that the results cleared them. A similar controversy dogged the investigation of the 1996 murder of JonBenet Ramsey, where police rejected the findings of a renowned polygrapher who insisted that the victim’s mother and father were being truthful.  (The Ramseys refused to be tested by the FBI because its profilers told police that the murder was probably an inside job.)

     Leery of being led down the wrong path, many savvy investigators shun the polygraph as a “truth machine” but use it as a prop when physical evidence or witnesses are lacking.  Refusing to take a polygraph can land one on the short list of suspects. Even better, a few guilty persons get so intimidated by the black box that they shrivel up and confess even before the test begins.  It’s a form of legalized coercion that leaves no bruises and may be impossible to challenge in court.


     It’s no surprise that shortcuts to finding the truth are hugely popular.  As long as we’re willing to dig in our pockets there will always be someone happy to supply all the elixirs we want.  We will soon be reporting on other questionable techniques, including cognitive interviews, profiling, investigative hypnosis and the recovery of repressed memories.  Stay tuned!
