Scientific Testimony: An Online Journal

From Thompson, W.C., Taroni, F. & Aitken, C.G., How the probability of a false positive affects the value of DNA evidence. Extracted, with permission, from the Journal of Forensic Sciences, Vol. 48, No. 1 (Jan. 2003). Copyright ASTM International, 100 Barr Harbor Drive, PO Box C700, West Conshohocken, PA 19428-2959.

Errors Happen

When DNA evidence was first introduced, a number of experts testified that false positives are impossible in DNA testing (6, 8). This claim is now broadly recognized as wrong in principle (1, 9-12), and it has repeatedly been proven wrong in practice (3, 13, 14). Yet it has frequently been repeated, without skepticism, in appellate court opinions (6, 8).

Why did experts offer this questionable testimony? One commentator has suggested that avid proponents of DNA evidence sought to allay judicial concerns about the potential for error by engaging in “a sinister semantic game” (8): they were able to deny that a DNA test could produce an error only by excluding from consideration human error in administering or interpreting the test. Sinister or not, it is misleading to exclude human error from consideration when humans are necessarily involved in the administration and interpretation of DNA tests. For those who must evaluate DNA evidence, it makes little difference what causes a false match; what matters is how often false matches might be expected (9, 15).
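
The point can be made concrete with a small numerical sketch. Under a standard likelihood-ratio analysis, if a match is virtually certain to be reported when the suspect truly is the source, the value of a reported match is governed by the chance of a coincidental match (the random match probability, RMP) plus the chance of a false positive (FPP). The figures below are hypothetical, chosen only for illustration:

    # Illustrative sketch only: how a nonzero false positive probability (FPP)
    # caps the likelihood ratio (LR) of a reported DNA match. Assumes a match
    # is always reported when the suspect really is the source (true positive
    # rate = 1); the numbers are hypothetical, not taken from any case.

    def likelihood_ratio(rmp, fpp):
        """LR = P(reported match | source) / P(reported match | not source)."""
        # Denominator: a coincidental match, or a false positive when there
        # is no coincidental match.
        return 1.0 / (rmp + fpp * (1.0 - rmp))

    rmp = 1e-9  # hypothetical random match probability: 1 in a billion
    for fpp in (0.0, 1e-4, 1e-3, 1e-2):
        print(f"FPP = {fpp:g}: LR is about {likelihood_ratio(rmp, fpp):,.0f}")

    # Once FPP is much larger than RMP, the LR is roughly 1/FPP: the reported
    # one-in-a-billion RMP no longer drives the value of the evidence.

On this analysis, however small the random match probability, the probative value of a reported match can be no greater than roughly the reciprocal of the false positive probability.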

False positives have occurred in proficiency tests (2, 3, 11, 13, 16) and in actual cases (14, 17). For example, the Philadelphia City Crime Laboratory recently admitted that it had accidentally switched the reference samples of the defendant and victim in a rape case. The error led the laboratory to issue a report that mistakenly stated that the defendant was a potential contributor of what the analysts took to be “seminal stains” on the victim’s clothing (18). The report also stated that the defendant’s profile was “included” in a mixed sample taken from vaginal swabs. After the sample switch came to light, the laboratory reassessed the evidence and concluded that the “seminal stains” were actually bloodstains that matched the victim’s DNA profile and that the defendant was excluded as a potential contributor to the vaginal sample (19).

In 1995, Cellmark Diagnostics admitted that a similar sample-switch error had caused it to report, incorrectly, that a rape defendant’s DNA profile matched DNA found in vaginal aspirate from a rape victim. After the error came to light during the defendant’s trial, Cellmark issued a revised report that stated that the vaginal sample matched the victim’s own DNA profile and that the defendant was excluded as a potential donor (20).

False positives can also arise from misinterpretation of test results. One such error led to the false conviction of Timothy Durham (14, 17). In 1993 a Tulsa, Oklahoma, jury convicted Durham of the rape of an 11-year-old girl; he was sentenced to 3,000 years in prison. The prosecution presented three pieces of evidence against him: the young victim’s eyewitness identification, testimony that Durham’s hair was microscopically similar to hair found at the crime scene, and a DNA test (DQ-alpha) that reportedly showed that Durham’s genotype matched that of the semen donor. Durham presented eleven witnesses who placed him in another state at the time of the crime, but the jury rejected his alibi defense. Fortunately for Durham, post-conviction DNA testing showed that he did not share the DQ-alpha genotype found in the semen; he was also excluded at several other genetic loci in multiple tests. The initial DNA test result that helped convict him was thus proven to have been a false positive. The error arose from misinterpretation: the laboratory had failed to completely separate male from female DNA during differential extraction of the semen stain. The victim’s alleles, combined with those of the true rapist, produced an apparent genotype that matched Durham’s. The laboratory mistook this mixed profile for a single-source result, and thereby falsely incriminated an innocent man. Durham was released from prison in 1997 (14).
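
The mechanism can be shown schematically. In the sketch below, the allele labels are hypothetical stand-ins rather than the actual DQ-alpha types from the case; the point is simply that when two contributors’ alleles pool into a single apparent profile, that profile can “include” the genotype of an innocent third party:

    # Schematic sketch of a mixture-misinterpretation error, using hypothetical
    # allele labels (not the actual DQ-alpha types from the Durham case). When
    # differential extraction fails, the victim's and true rapist's alleles
    # pool into one apparent profile.

    victim      = {"1.1", "2"}   # hypothetical genotype
    true_rapist = {"1.2", "4"}   # hypothetical genotype
    defendant   = {"1.1", "4"}   # hypothetical genotype of an innocent man

    mixture = victim | true_rapist  # alleles actually detected in the stain
    print("Alleles detected:", sorted(mixture))

    # Read as a single-source profile, the stain "matches" anyone whose
    # genotype is a subset of the detected alleles -- including the defendant.
    print("Defendant included?", defendant <= mixture)      # True
    print("Defendant the donor?", defendant == true_rapist) # False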

Although experience has shown that false positives can occur, the rate at which they occur is difficult to estimate from existing data. Most laboratories participate in periodic proficiency tests, which can cast some light on the potential for error. European forensic laboratories have carried out collaborative exercises involving analysis of stains from known sources (21-26); however, this work is designed more to test the uniformity of DNA test results among laboratories using the same protocol than to determine the rate of errors. In the United States, TWGDAM guidelines call for each analyst to take two proficiency tests each year (27), and proficiency testing is a requirement for laboratory accreditation under the program administered by ASCLD-LAB (28). However, these tests generally are not well designed for estimating the rate of false positives: they typically are not blind (i.e., the analysts know they are being tested), they involve limited numbers of samples, and the samples may be easier to analyze than those encountered in routine casework.
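
There is also a purely statistical reason that limited proficiency data cannot establish a low false positive rate. As a hypothetical sketch, suppose a laboratory commits zero errors on n independent proficiency samples: a standard 95% upper confidence bound on its error rate is 1 - 0.05^(1/n), roughly 3/n (the “rule of three”), and that bound remains sizable for realistic values of n:

    # Minimal statistical sketch (assumed test counts, not data from any
    # actual program): with zero false positives observed in n independent
    # proficiency samples, an exact 95% upper confidence bound on the false
    # positive rate is 1 - 0.05**(1/n), approximately 3/n.

    def upper_bound_95(n_samples):
        """95% upper bound on the error rate when zero errors are observed."""
        return 1.0 - 0.05 ** (1.0 / n_samples)

    for n in (10, 50, 100, 1000):
        print(f"{n:5d} error-free samples: rate could still be about 1 in "
              f"{1 / upper_bound_95(n):,.0f}")

    # Even 1,000 error-free samples cannot rule out a false positive rate on
    # the order of 1 in 300 -- and non-blind tests on unusually clean samples
    # may understate the casework rate further.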

In 1992 a report of the National Research Council called for more extensive proficiency testing, declaring that “laboratory error rates must be continually estimated in blind proficiency testing and must be disclosed to juries” (1). The NRC called for external, blind proficiency tests “that are truly representative of case materials (with respect to sample quality, accompanying description, etc.)”. Thereafter, the federal DNA Identification Act of 1994 required the director of the National Institute of Justice (NIJ) to report to Congress on the feasibility of establishing an external blind proficiency testing program for DNA laboratories. But the move toward external blind proficiency testing lost momentum when the NIJ director raised a number of practical concerns, and it was dealt another blow by the 1996 report of the National Research Council, which downplayed the need for such testing. The 1996 report suggested that the problem of laboratory error be addressed through a variety of means, and concluded that the best safeguard against error is to allow retesting of samples (28).

References

1. National Research Council. DNA Technology in Forensic Science. Washington, DC: National Academy Press, 1992.

2. Thompson WC, Ford S. The meaning of a match: sources of ambiguity in the interpretation of DNA prints. In: Farley J, Harrington J, editors. Forensic DNA Technology. New York: CRC Press, 1991.

3. Thompson WC. Subjective interpretation, laboratory error and the value of forensic DNA evidence: three case studies. Genetica 1995;96:153-68.

4. Taroni F, Aitken CGG. Forensic science at trial. Jurimetrics 1997;37:327-37.

5. Kaye DH, Sensabaugh GF. Reference guide on DNA evidence. In: Cecil J, editor. Reference Manual on Scientific Evidence. Washington, DC: Federal Judicial Center, 2000;2:485-576.

6. Thompson WC. Forensic DNA evidence. In: Black B, Lee P, editors. Expert Evidence: A Practitioner's Guide to Law, Science and the FJC Manual. St. Paul, MN: West Group, 1997;195-266.

7. Jerome Smith v. State. Southern Reporter, Alabama Court of Criminal Appeals, 1995;677:1240-48.

8. Koehler JJ. Error and exaggeration in the presentation of DNA evidence. Jurimetrics 1993;34:21-39.

9. Kaye D. DNA evidence: probability, population genetics, and the courts. Harv J Law Technol 1993;7:101-72.

10. Jonakait RN. Stories, forensic science and improved verdicts. Cardozo L Rev 1991;13:343-52.

11. Koehler JJ. DNA matches and statistics: important questions, surprising answers. Judicature 1993;76:222-9.

12. Thompson WC. Comment. In: Roeder K, DNA fingerprinting: a review of the controversy. Stat Sci 1994;9:263-6.

13. Koehler JJ. The random match probability in DNA evidence: irrelevant and prejudicial? Jurimetrics 1995;35:201-19.

14. Thompson WC. Accepting lower standards: the National Research Council's second report on forensic DNA evidence. Jurimetrics 1997;37(4):405-24.

15. Mueller L. The use of DNA typing in forensic science. Acct in Res 1993;3:1-13.

16. Roeder K. DNA fingerprinting: a review of the controversy. Stat Sci 1994;9:222-47.

17. Scheck B, Neufeld P, Dwyer J. Actual Innocence. New York: Doubleday, 2000.

18. Brenner L, Pfleeger B. Investigation of the sexual assault of Danah H. Philadelphia (PA): Philadelphia Police Department DNA Identification Laboratory; 1999 Sep 24. Lab No. 97-70826.

19. Brenner L, Pfleeger B. Amended report: investigation of the sexual assault of Danah H. Philadelphia (PA): Philadelphia Police Department DNA Identification Laboratory; 2000 Feb 7. Lab No. 97-70826.

20. Cotton RW, Word C. Amended report of laboratory examination. Germantown (MD): Cellmark Diagnostics; 1995 Nov 20. Case No. F951078.

21. Schneider PM, Fimmers R, Woodroffe S, Werrett DJ, Bär W, Brinkmann B, et al. Report of a European collaborative exercise comparing DNA typing results using a single locus VNTR probe. Forensic Sci Int 1991;49:1-15.

22. Gill P, Woodroffe S, Bär W, Brinkmann B, Carracedo A, Eriksen B, et al. A report of an international collaborative experiment to demonstrate the uniformity obtainable using DNA profiling techniques. Forensic Sci Int 1992;53:29-43.

23. Gill P, Kimpton C, D'Aloja E, Anderson JF, Bär W, Brinkmann B, et al. Report of the European DNA profiling group (EDNAP): towards standardisation of short tandem repeat (STR) loci. Forensic Sci Int 1994;65:51-9.

24. Kimpton C, Gill P, D'Aloja E, Anderson JF, Bär W, Holgersson S, et al. Report on the second collaborative STR exercise. Forensic Sci Int 1995;71:137-52.

25. Wiegand P, Ambach E, Augustin C, Bratzke H, Cremer U, Edelman J, et al. GEDNAP IV and V: the 4th and 5th stain blind trials using DNA technology. Int J Legal Med 1995;108:79-84.

26. Anderson JF, Martin P, Carracedo A, Dobosz M, Eriksen B, Johnsson V, et al. Report on the third EDNAP collaborative STR exercise. Forensic Sci Int 1996;78:83-93.

27. Technical Working Group on DNA Analysis Methods (TWGDAM). Established guidelines for a quality assurance program for DNA testing laboratories, including RFLP and PCR technologies. Crime Lab Dig 1995;18:44-75.

28. National Research Council. The Evaluation of Forensic DNA Evidence. Washington, DC: National Academy Press, 1996.

29. Balding DJ. Errors and misunderstandings in the second NRC report. Jurimetrics 1997;37:469-76.

30. Thompson WC. DNA evidence in the O.J. Simpson trial. U Colorado L Rev 1996;67(4):827-57.

31. People v. Venegas. California Reporter, California Supreme Court, 1998;18:47-88.

32. Aitken CGG. Statistics and the Evaluation of Evidence for Forensic Scientists. Chichester: John Wiley & Sons, 1995.

33. Schum DA. Evidential Foundations of Probabilistic Reasoning. New York: John Wiley & Sons, 1994.

34. Robertson B, Vignaux GA. Interpreting Evidence: Evaluating Forensic Science in the Courtroom. Chichester: John Wiley & Sons, 1995.

35. Lempert RO. Modeling relevance. Michigan L Rev 1975;75:1021-101.

36. Friedman RD. Answering the Bayesioskeptical challenge. Int J Evid Proof 1997;1:276-8.

37. Ceci SJ, Friedman RD. The suggestibility of children: scientific research and legal implications. Cornell L Rev 2000;86(1):33-108.

38. Schlup v. Delo. United States Reports, U.S. Supreme Court, 1995;513:298-322.

39. Thompson WC, Schumann EL. Interpretation of statistical evidence in criminal trials: the prosecutor's fallacy and the defense attorney's fallacy. Law Hum Behav 1987;11:167-87.

40. Donnelly P, Friedman RD. DNA database searches and the legal consumption of scientific evidence. Michigan L Rev 1999;97:931-84.

41. Leonard J. Using DNA to trawl for killers. Los Angeles Times 2001 Mar 10; Sect. A:1 (col. 1).

42. Balding DJ, Donnelly P. Evaluating DNA profile evidence when the suspect is identified through a database search. J Forensic Sci 1996;41(4):603-7.

43. Koehler JJ. Why DNA likelihood ratios should account for error (even when a National Research Council report says they should not). Jurimetrics 1997;37:425-37.

44. Peterson JL, Gaensslen RE. Developing criteria for model external DNA proficiency testing: final report. Chicago (IL): University of Illinois at Chicago; 2001 May.

45. Schum DA, DuCharme WM. Comments on the relationship between the impact and the reliability of evidence. Org Behav Human Perf 1971;6:111-31.