Pseudoscience seems to be everywhere: from anti-vaccine activism and HIV/AIDS denialism to postmodern fake math and quantum quackery. It would appear that no area goes unharmed by the enormous reach of pseudoscience. It has even infiltrated law enforcement, where it injects pseudoscientific notions such as hypnosis and graphology, undermines safeguards against error in interrogations and fingerprint analysis, and makes it more difficult to remove practices that are not supported by the evidence, such as criminal profiling, the use of truth serum, or lie detector tests.
This article will discuss a paper by Lilienfeld and Landfield (2008), published in the journal Criminal Justice and Behavior, that examines these issues in detail. Lilienfeld and Landfield provide a tentative definition of pseudoscience and then go on to list ten warning signs of pseudoscience together with an example of each from law enforcement. Finally, they make some recommendations for combating the influence of pseudoscience in law enforcement. It must be pointed out that some of these examples, such as interrogation and fingerprint analysis, are not by themselves pseudoscience. Rather, it is the way that some proponents make use of these tools without the proper safeguards against mistakes that is deserving of the label of pseudoscience.
What is pseudoscience?
Lilienfeld and Landfield (2008) point out that pseudoscience, like the concept of day and night, is a fuzzy concept. Nevertheless, they manage to encapsulate a couple of key aspects of pseudoscience in their descriptions. They state that “pseudosciences are disciplines that possess the superficial appearance of science but lack its substance” and “pseudosciences are imposters of science: they do not play by the rules of science even though they mimic some of its outward features”.
Ten warning signs of pseudoscience (with examples)
Lilienfeld and Landfield describe ten key warning signs of pseudoscience and provide examples from police work. These warning signs range from lack of falsifiability and self-correction to the use of testimonials and the absence of connectivity with broader scientific research fields.
Lack of falsifiability and overuse of ad hoc maneuvers
If a model is falsifiable, it means that if certain conditions held, the model would be shown to be false. It also entails that models that purport to be scientific should be subjected to harsh tests, tests that are dangerous to the model, in an effort to find out whether it is supported by the evidence. When pseudoscientific ideas are falsified by experiments, proponents typically invent wild and improbable explanations to prevent the idea from sinking. These are called ad hoc hypotheses or ad hoc maneuvers. The researchers do point out that some scientific disciplines also sometimes use ad hoc hypotheses, but this generally increases the degree of falsifiability instead of decreasing it. They contrast this with how ad hoc hypotheses are used in pseudoscience, where they serve as a desperate move to avoid falsification. The example Lilienfeld and Landfield provide is fingerprint analysis. Although they do not reject fingerprint analysis itself as pseudoscience, they point out that some proponents go too far when they assert that fingerprint analysis is essentially 100% accurate. When a fingerprint analysis fails, proponents of the infallibility of fingerprint analysis blame it on bad prints, misuse of machines, bad fingerprint analysts and so on. Lilienfeld and Landfield point out that such claims are only falsifiable if there exist standardized methods for training and for measuring fingerprint accuracy, and if the error rate of each fingerprint analyst in the test is known before the trial begins instead of being injected as an ad hoc maneuver after failure.
Evasion of peer-review
Peer-review is not perfect, but it is a viable tool for weeding out bad studies. When researchers refuse to submit their work to peer-review, it is often because they want to shield their claims from skeptical investigation. Announcing findings before peer-review can be dangerous because the findings could be due to a mistake (cold fusion being the classic example). The example provided for this warning sign is graphology, the mistaken notion that handwriting tells you something about the psychology of the writer. Despite being used as a method to find various kinds of criminals, there is zero scientific evidence in favor of graphology and plenty of studies showing that it is a failed approach.
Lack of self-correction

One of the most useful features of science is that it is self-correcting. Most of the time, mistakes are critically analyzed and excised. In pseudoscience, however, false claims tend to linger for a very long time, even if modern science has decisively refuted them (e.g. precession refutes astrology). The example given for lack of self-correction is truth serum. As it turns out, truth serum is not some kind of magical drug that makes it impossible to lie or otherwise withhold the truth. In reality, truth serums are often barbiturates and work in a similar fashion to alcohol: they reduce inhibitions (making you disclose more information, but not necessarily more accurate information). There is also evidence that it is possible to lie while under the effects of a truth serum.
No safeguards against confirmation bias
Confirmation bias is a cognitive bias that occurs when people tend to remember evidence that supports their position and forget or dismiss evidence that contradicts it. Lilienfeld and Landfield call it “psychological tunnel vision”. Although scientists are not immune to confirmation bias, rigorous research methods and peer-review work as safeguards against some of the confirmation bias that can occur. The interrogation process is a place where confirmation bias can be very dangerous. If interrogators make decisions based on a hunch or their gut feeling, they can become closed-minded to disconfirmatory evidence. Even worse, if interrogators believe that a certain interviewee is guilty of a crime, then that belief itself may produce so-called “guilty behaviors” (i.e. nervousness and shifting) in the interviewee, which in turn are interpreted as evidence of guilt by the interrogator.
Strong reliance on testimonials and anecdotal evidence
Anecdotes are often unverified, probably not representative of the phenomena or population under study, and seldom rule out alternative explanations for a given finding. Thus, they cannot constitute scientific evidence. One area where anecdotal evidence and testimonials are given very strong consideration is eyewitness testimony. The Innocence Project details many hundreds of cases where people convicted primarily on the basis of eyewitness testimony were later exonerated by DNA evidence. This is because eyewitness memory can be contaminated by a wide range of different processes at each step from the event to the courtroom, and human memory does not work like a video recorder.
Overstating conclusions

Scientists should never claim conclusions that are not supported by the evidence. Instead, scientific models are tentative and only reign as long as the evidence supports them. When it comes to pseudoscience, it is reversed: prominent proponents of pseudoscience almost always overstate their conclusions into areas where the evidence is not in their favor. As Carl Sagan once said, “extraordinary claims require extraordinary evidence”, but some quack “treatments” claim a considerably higher efficacy rate than is justified by the evidence (which either does not exist or contradicts the “treatment” where it does exist). The example provided by Lilienfeld and Landfield is polygraph tests. Some proponents claim that the polygraph is an essentially infallible tool for detecting lies. In reality, it is not even close to 100% accurate and barely performs better than chance. The key to understanding the problem with polygraph tests is that they measure physiological responses, and the method is thwarted by three key facts: (1) when accused of a terrible crime, even innocent people have a physiological response, (2) psychopaths may not have sufficient stress reactions, and (3) forensic countermeasures may obfuscate inferences from physiological responses.
Appeal to tradition
The fact that a tool or belief has been around for a very long time says nothing about its validity or truth. An error does not become a fact just because people are too stubborn to give it up. Hypnosis is not a viable tool for retrieving accurate versions of forgotten memories, yet it is sometimes still used in police investigations instead of the more accurate extended cognitive interview.
Reversing the burden of evidence
Typically, the proponent of a claim has the burden of evidence for establishing the truth of the claim, and it is fallacious to invent a claim and then assert that skeptics have to disprove it before one is willing to abandon it. The classic example from pseudoscience is UFOlogists who insist that skeptics disprove every single image of an unidentified object. In reality, the burden of evidence is on the UFOlogist, not on the skeptic. The example from police work provided by Lilienfeld and Landfield is the usage of anatomically correct dolls. These are not valid tools for finding evidence of sexual abuse in children, because control groups of non-abused children also use the dolls for sexualized play. The dolls are also not standardized, further adding to the difficulty of using them. Despite this, a report from the American Psychological Association subtly shifted the burden of evidence when it failed to take a stance against their usage. If there is no evidence, then agnosticism is not the proper response; rather, the proper response is to advocate caution or even a ban until the evidence has been collected.
No connection to the overarching scientific literature
Pseudoscience often fails to build on the existing body of scientific research. ESP powers are alleged not to follow the inverse-square law, and we can add that homeopathy ignores most of physics, chemistry and biology in its rejection of dose-response relationships. Criminal profiling tends to lag behind research on clinical versus statistical prediction, as it has been known for almost 60 years that statistical predictions outperform clinical predictions in this area.
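For readers unfamiliar with it, the inverse-square law mentioned above is a standard physics relation (not something specific to the paper under review): the intensity of a signal radiating from a point source falls off with the square of the distance,

```latex
I(r) = \frac{P}{4\pi r^{2}}
```

where $P$ is the emitted power and $r$ the distance from the source. Any genuine physical signal carrying information should weaken with distance in this way, which is why the claim that ESP works equally well at any distance conflicts with the rest of physics rather than connecting to it.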
Use of hypertechnical language
Pseudoscience often abuses scientific terminology, and it is very common for proponents of pseudoscience to inaccurately deploy terms like “energy”, “frequency”, “vibration” and “quantum” to prop up their crank beliefs. The final example provided by Lilienfeld and Landfield is eye movement desensitization and reprocessing (EMDR), which is no more effective than behavioral exposure therapy and whose rapid eye movements have no relation to its efficacy. Despite this, proponents of EMDR have invented impressive-sounding babble to explain the alleged efficacy of EMDR, including “brain hemisphere synchronization” and “downward shifting of receptor valence”. Hypertechnical language does not mean scientific substance.
What are some possible solutions to the problem of pseudoscience in law enforcement?
Lilienfeld and Landfield point out that pseudoscience in law enforcement is very dangerous, because it can be a powerful contributing factor in leading law enforcement personnel to make the wrong decisions.
For instance, if they are not wary about confirmation bias in interrogation, it can lead to false confessions and wrongful convictions.
To try to prevent this, they propose a number of potential methods to mitigate the problem of pseudoscience in law enforcement:
(1) attend to the warning signs of pseudoscience
(2) incorporate guidelines for distinguishing science from pseudoscience into law enforcement training programs
(3) put in safeguards against cognitive biases
Lilienfeld, S. O., & Landfield, K. (2008). Science and pseudoscience in law enforcement: A user-friendly primer. Criminal Justice and Behavior, 35(10), 1215–1230. DOI: 10.1177/0093854808321526