Lying and Deception Final Study Guide


prosocial lie (politeness ritual)

Consider the following examples from everyday life of being polite-but-not-completely-honest. Researchers refer to these as prosocial lies (i.e., lies told for someone else's benefit):
• This kale-quinoa souffle is delicious but I can't eat another bite
• Your hair looks just fine
• You don't look a day over 30
• Leaving in 5 mins
• Nobody thinks you're stupid

truth we observe

Sometimes we find truth in what we observe. Unlike the three preceding ways of arriving at something we believe to be true, this approach requires sensory contact with the object in question. The senses serve as a way of certifying the truth. When someone says, "I saw it with my own eyes so it must be true," or "I know she said it because I heard her say it," they are testifying to their reliance on their own sensory observations as a source of truth.

mental reservation

Two ancient but ingenious inventions for helping people who believed in not lying (but needed to lie) were the concepts of mental reservation and equivocation. The first allowed a person to make a misleading statement to someone else (e.g., "I have never cheated on an exam . . .") and silently add the qualification in his or her mind to make it true (e.g., ". . . until the one you gave last week"). As long as the person used the mental reservation, there was no lie and no sin. Equivocation took advantage of words' multiple meanings to mislead without technically lying. For example, when asked by persecutors whether a man they intended to kill passed this way, the equivocator might reply "He did not pass here" with "here" signifying the precise spot on which the speaker stands and not the other spots the man actually did walk through (Denery, 2015).

Sissela Bok

Martin Luther, for one, argued that some lies could serve a higher purpose: "A good strong lie for the sake of the good and for the Christian church ... a lie out of necessity, a useful lie, a helpful lie, such lies would not be against God, he would accept them" (Bok, 1978, p. 47). Needless to say, those who tried to follow Augustine's prohibition against all lies found themselves sinning a lot. As a result, he conveniently developed a list of eight types of lies. All were considered sins, but some were greater sins than others. Later, Thomas Aquinas boiled the list down to three: (1) lies told in jest, (2) lies told to be helpful, and (3) malicious lies. Again, telling a lie in any category was considered a sin, but only malicious lies were considered a "major" sin. In short, they recognized that their position was problematic, but instead of changing the principle, they created loopholes. Bok (1978) explains: Many ways were tried to soften the prohibition, to work around it, and to allow at least a few lies. Three different paths were taken: to allow for pardoning of some lies; to claim that some deceptive statements are not falsehoods, merely misinterpreted by the listener; and finally to claim that certain falsehoods do not count as lies. (p. 34)
Answer 2: It Is Not Right to Lie Except as a Last Resort. Sissela Bok's Lying: Moral Choice in Public and Private Life (1978) provides an extended discussion of this position. Although Bok believes there are circumstances under which lying could be justified, she also agrees with the basic tenets of the "lying is never justified" position:
• lying is generally bad for the liar and for society in general
• people today lie too often and too easily
• unless something is done, the frequency of lies will continue to grow
• the more lies that are told, the more trust is destroyed and the more dysfunctional society becomes.
For Bok, deception of any kind is a virus that attacks human social life, a position that stands in sharp contrast to fellow philosopher David Nyberg's, who believes that "social life without deception to keep it going is a fantasy" (1993, p. 25). Bok's position is grounded in what she calls the veracity principle, the claim that "truthful statements are preferable to lies in the absence of special consideration" (p. 30). Like the absolutists, her position gives lies an inherently negative value. But despite her philosophical kinship to them, Bok's ethical solution does differ from theirs. Instead of prohibiting all deception, she says certain positive lies may be acceptable—provided they pass a stringent set of tests. Our main concern with these standards is the assumption that, in any given situation, "the truth" and what it means to tell it are perfectly clear (they often are not). Related to this is the equally troubling notion that a person never has to justify telling the truth. Regardless, her three tests are summarized below. We encourage you to research them further and determine for yourself what value they hold. While problematic in some respects, her overall position seems light-years ahead of the fantasy world of the "never lie" absolutists.
1. First, no lie is permissible if the same goal can be achieved by using the truth. She argues that observing this restriction will eliminate many lies that are told too easily and without careful examination. Even if a lie saves a life, it is not acceptable if there is a way to save the life by telling the truth: If lies and truthful statements appear to achieve the same result or appear to be as desirable to the person contemplating lying, the lies should be ruled out. And only where a lie is a last resort can one even begin to consider whether or not it is morally justified. (p. 31) Asking liars to consider truthful alternatives is admirable, but how is the prospective liar supposed to find (and be drawn to) them?
In the next step, perhaps?
2. Bok's second test asks the would-be liar to conduct a thorough moral examination of the planned lie by:
• mentally assuming the place of the deceived and considering your reaction
• understanding how the lie might affect the deceived as well as others
• asking friends/colleagues how they would react
One of the results of such a thorough moral investigation should be to brainstorm a list of truthful (and desirable) alternatives to the planned deception—in theory at least. But note that Bok is not asking us to find excuses to lie. Excuses, she says, can be used to forgive deception but not to justify it. So, what acceptable deceptive options remain at this point? Few, if any, but if they do exist they will still have to pass the third test.
3. This final step in Bok's approval process requires the would-be liar to seek the approval of an imaginary audience of "reasonable" people who subscribe to a shared moral code (religious, legal, etc.). It's a "publicity test" of sorts, designed to counter the biases and hastily-drawn conclusions that often cloud the perspective of liars. Beyond the specific person(s) you're intending to deceive, how would the general public respond (or an audience of millions on social media)? "Only those deceptive practices which can be openly debated and consented to in advance are justifiable in a democracy," she says (p. 181). This doesn't necessarily mean you have to tell the target the specific lie you are going to tell—only that the target be forewarned that a lie might be told. Since most people expect enemies to lie to each other and since one might get the consent of reasonable people to lie to an enemy, lying to enemies is justified—but only when there is no truth telling alternative; there is a crisis; and there are open, lawfully declared hostilities. Bok's position begins with the premise that lying is inherently wrong and that exceptions should be exceedingly rare.
But how "reasonable" a public is and whether ethical decisions should always be determined by majority perceptions remain unresolved (and critical) issues. Considering how politically polarized the world's major democracies are at the moment, a truly shared moral code across a general public may not even be possible. The antithesis of this perspective views lying as functional—with its goodness or badness being determined by how well it helps or doesn't help in accomplishing one's goals. Isn't it okay to lie, they might argue, if you were trying to avoid harm, to produce benefits, to promote or preserve fairness, or to protect a larger truth (Bok, 1978)? Even though probably all U.S. presidents have lied to the American public, philosopher and cultural critic Sissela Bok (1978) believes that in 1960 when President Eisenhower lied about sending planes to spy on the Soviet Union, "it was one of the crucial turning points in the spiraling loss of confidence by U.S. citizens in the word of their leaders" (p. 142). Shortly thereafter, lies told to the American public about the conduct of the Vietnam War were revealed in the Pentagon Papers and President Nixon resigned after it was discovered he authorized and lied about a burglary of the Democratic Party's national headquarters. As a result, says Bok (1978), citing a 1975/76 Cambridge Survey Research poll, 69% of Americans agreed that "over the last ten years, this country's leaders have consistently lied to the people" (p. xviii). The goal of serving others takes a permanent back seat to the deceiver's own political desires. And the deceiver's reality becomes distorted by a belief in his or her own lies, which may lead to a sense of invulnerability and righteousness. Bok (1978, p. 173) addresses the issue in this way: As political leaders become accustomed to making such excuses, they grow insensitive to fairness and to veracity.
Some come to believe that any lie can be told so long as they can convince themselves that people will be better off in the long run. From there, it is a short step to the conclusion that, even if people will not be better off from a particular lie, they will benefit by all maneuvers to keep the right people in office.

imposeur

A term coined by Keyes (2004), "imposeurs" are people who retain most of their own personal identity but fabricate key elements such as experiences they never had, skills they never learned, degrees they didn't earn, jobs they never held, and awards they never received. These fabrications may be driven by selfish or altruistic motives but may also reflect the imposeur's "deepest yearnings and feelings of inadequacy" (p. 71). To be convincing, these falsehoods may require the use of forged/stolen documents or misleading props. Imposeurs probably occupy all domains of human life, but representative illustrations of their behavior can be seen in stories about relationships, the military, illness, and crime. Even though the deceptive ability of imposeurs is sometimes formidable, they are amateurs when compared with the full-blown imposters discussed later in this chapter. That's why they are classified separately even though the nature of their behavior is similar. Like other skilled liars, the best imposeurs mix lies with related truths—e.g., "I was in the Army (true) where I flew helicopters (can fly a helicopter but did not fly one in the Army) in Iraq (false)." Another lie may be implicit in this statement if the person was not an Army officer, since helicopter pilots are officers. Imposeurs will also invent identities to control and/or defraud a romantic relationship partner (Campbell, 2000). People may claim, for example, that they are from a wealthy family or make a lot of money in order to develop a relationship with someone who actually is wealthy. They may try to increase the credibility of such claims by renting or borrowing an expensive car. A person who falsely claims to have an MBA in accounting or investment banking may be setting the stage for taking charge of his or her partner's money.
In addition to greed, fabricated stories in close relationships can also be used to control and create dependence—e.g., "I'm a doctor, so you should come to me first when you have any questions about health matters." Sometimes couples work together to make up or embellish aspects of their relationship history to impress friends, draw public attention, and even reap financial gain. A particularly notorious example is Herman and Roma Rosenblat's (2009) memoir Angel at the Fence. The book purports to tell how the couple met and later married under poignant circumstances. According to the story, they met in 1944, when he was a young Jewish man imprisoned in the infamous Buchenwald concentration camp in Nazi Germany, and she (also Jewish) was posing as a Christian with her family at a nearby farm. Rosenblat claimed she had tossed him apples and passed other food to him through an electrified camp fence for several months, until he was transferred to another location. He did not see her again until the war was over and the two had moved with their families to the United States. As the story went, they met in 1957 on a blind date in Coney Island, New York, discovered their "shared past," and married shortly thereafter. In 2007, about a year before the book was to be released, Oprah Winfrey invited the couple to appear on her show (for the second time since 1996) to tell "the single greatest love story in 22 years of doing this show" (Zarrella & Kaye, 2008). Fueled in part by the public attention generated by the Rosenblats' 2007 appearance on her show, the film rights to the book were purchased from the publisher for $25 million. The media coverage also drew attention to their story from several Holocaust scholars.
Historian Ken Waltzer, who was familiar with the physical layout of the Buchenwald camp, publicly challenged the story on the grounds that it would have been impossible for civilians or prisoners to approach the perimeter fence without being detected by SS guards. Waltzer also determined that although Roma and her family did pose as Christians in the German countryside, in 1944 they resided in a town over 200 miles away from Buchenwald (Langer, 2015). Several survivors who were in the camp at the same time as Herman were later interviewed and none could recall him ever mentioning a girl throwing apples over the fence (Sherman, 2008). Family members of the couple, including their son, also raised questions about the memoir's accuracy (Siemaszko, 2009). In light of these questions, the publisher canceled the book's release. Both the publisher and the movie studio demanded the Rosenblats return advances totaling $150,000. According to the New York Times, neither advance was ever repaid (Roberts, 2015). Herman then admitted (sort of) in a statement that the story had been invented. "I wanted to bring happiness to people," he wrote. "In my dreams, Roma will always throw me an apple, but I now know it is only a dream" (Day, 2009). He remained astonishingly unrepentant: "It wasn't a lie," he told Good Morning America (QR). "It was my imagination," he argued. "In my imagination, it was true."

contextomy

Although not a household word, "contextomy" is something you know very well. It's more than merely misquoting someone; misquotes are often accidental and many times we are still able to glean the gist of what was originally meant. Rather, contextomy is the act of deliberately altering the fundamental meaning of what someone said or wrote by taking their quote out of context (McGlone, 2005). Here, the deceiver's express purpose is to mislead the audience into believing that the author meant something that, in fact, they did not (it could even be, and often is, the exact opposite of what they meant). Sometimes these are harmless and fun, even entertaining. We are often willing to overlook contextomies when everyone is in on the game. For example, contextomy occurs frequently when critics are quoted in movie advertisements. We all know that these ads often select a word or phrase from a reviewer's comments, which makes it appear that the critic is endorsing the movie. "Tom Hanks gives a brilliant performance," says the ad, which happens to be an entirely accurate quote from a critic. But if you bothered to actually read the critic's review, you would find that, in context, the quote has an entirely different meaning: "The only redeeming part of this entire movie, and it is less than a minute in length, is when Tom Hanks gives a brilliant performance as a barista with attention deficit disorder." In such settings, no one is likely to sue, lodge a formal complaint, or take any other serious action. And if you're naïve enough to rush to the theater and drop money on a ticket on the strength of such advertising alone, then buyer beware. In other arenas, such as politics (McGlone & Baryshevtsev, 2015) and courts of law, contextomy has serious consequences and attributions of intentionally deceptive behavior are more likely. In the 2006 U.S.
Senate race in Missouri, television ads (QR) sponsored by Republican Senator Jim Talent criticized opponent Claire McCaskill by featuring quotes like "spreading untruths" and "clearly violated ethical standards" attributed simply to the Kansas City Star. The Star did indeed publish those quotes, but they were words McCaskill's opponents had used and were simply being quoted in the article. The words were not the opinion of the newspaper editors, yet the clearly intended implication was that the Star itself was endorsing Talent and condemning McCaskill, which was not the case. In another ad, Talent's campaign accurately quoted a 2004 St. Louis Post-Dispatch article about McCaskill, which said she "used this office (state auditor) transparently for political gain." What the ad didn't say was that the quote was taken from an otherwise positive endorsement of McCaskill by the Dispatch, which said she was a "promising and dynamic leader" (FactCheck.org, "Talent for Deception," October 21, 2006).

paltering

As contextomy illustrates, a statement need not be patently false to be perceived (or intended) as deceptive. "Paltering" refers to the use of accurate statements or labels with the intention of conveying an inaccurate impression (Schauer & Zeckhauser, 2009). Unlike a lie of commission, paltering involves only truthful statements; and unlike a lie of omission, it entails actively misleading a target rather than merely omitting relevant information. Advertisers wishing to draw consumers' attention to the contents of an envelope sometimes put a government warning about tampering with the mail on the outside of the envelope while also omitting a return address, thus purposefully creating the false impression the envelope contains an official letter from a government agency. Some PhDs may make restaurant or hotel reservations using the title "Dr.," hoping in the process to lead the establishment to believe they are (typically wealthy) physicians rather than (middle class) academics. Politicians routinely present extreme and unrepresentative examples of social problems with the intent of leading voters into making erroneous generalizations (for example, in reality, welfare fraud is relatively rare and unprofitable; Levin, 2013). For example, Linda Taylor, the woman Ronald Reagan decried for collecting excessive public assistance, became known as the "welfare queen." This atypical example nevertheless tapped into stereotypes about the welfare system that resonated with many voters who believed the system was corrupt. Rogers, Zeckhauser, Gino, Schweitzer, and Norton (2014) investigated corporate executives' attitudes toward paltering in business negotiations. A sample of executives was presented with the following scenario: Imagine that over the last 10 years your sales have grown consistently and that next year you expect sales to be flat.
In order to convey the impression that sales will continue to grow, you might palter by saying 'over the last 10 years our sales have grown consistently' and not highlight your expectation that sales this coming year will be flat. After reading this definition and example of paltering, the executives were then asked about their paltering habits during business negotiations and their attitudes toward this behavior:
• A majority of executives (66%) reported paltering in some or most of their negotiations.
• Only 34% said they had done so in only a few or none of their negotiations.
• 80% said they perceived their paltering as "acceptable and honest" communication.
These findings suggest paltering is an especially pernicious form of deception because, unlike lies of commission and omission, it allows the deceiver to preserve an honest self-image. However, a follow-up study indicated that the targets of paltering in negotiations generally perceive it as the ethical equivalent of making intentionally false statements.

Munchausen syndrome by proxy

First identified by British pediatrician Roy Meadow (1977) and previously known as Munchausen syndrome by proxy, factitious disorder imposed on another (FDIA) is an often-misdiagnosed form of child abuse in which people induce symptoms of a disease in their own children (usually preverbal infants or toddlers) and then give false reports to medical caregivers (Talbot, 2004). The vast majority (95%) of people diagnosed with FDIA are women. Once the abused child is hospitalized, the perpetrator gets the attention she has been seeking. She will pretend to be grief-stricken by what has happened to the child she loves, and the physician and hospital staff provide a wealth of comfort and support. On occasion, FDIA can occur with adults as the victims. For example, a nurse may induce cardiac arrest in a patient to enjoy the exhilaration and excitement derived from being a member of the medical team seeking to remedy the problem. Several methods used by FDIA perpetrators to inflict factitious illness on children have been documented, including poisoning, drawing blood to induce anemia, rubbing dirt into wounds to cause infection, and choking to the point of asphyxiation (Criddle, 2010). In one case, a 6-year-old girl's mother put her own blood in her child's urine sample to make her appear ill. The child saw 16 doctors, underwent 12 separate hospitalizations, was catheterized, X-rayed, and given no fewer than 8 different antibiotics (Talbot, 2004). Fans of the blockbuster film The Sixth Sense may remember the scene in which Mrs. Collins is revealed to be responsible for murdering Kyra by slowly poisoning her (QR). In real life, there is the shocking case of Gypsy Rose Blanchard and her mother Dee Dee. The subject of HBO's documentary Mommy Dead and Dearest (QR), Dee Dee forced her daughter to be in a wheelchair and endure a host of maladies, many of them brought on by physical abuse.
The situation continued until Gypsy Rose was in her mid-20s, fooling family and friends alike and earning Dee Dee and her daughter free trips to Disneyland and a house built by Habitat for Humanity (Morabito, 2017). All of it began to unravel one day in June of 2015 when Gypsy Rose posted on their joint account, "that Bitch is dead!" The post led authorities to Wisconsin, where they arrested Gypsy Rose and boyfriend Nicholas Godejohn, whom she'd met on a Christian dating site. He was charged with murdering Mrs. Blanchard at her daughter's request. For her part in the murder, Gypsy Rose was sentenced to 10 years in prison. She gave an exclusive interview (QR) to ABC News that was aired in 2018.

suggestibility and children testifying in court

The legal arena is its own unique environment when it comes to the detection of deception in children. Judges, social workers, and mental health professionals are sometimes able to judge the veracity of children based on their experience with cases involving traumatic events like sexual abuse. When a child involved in a child custody case freely and unemotionally gives details of the abuse and occasionally uses adult terminology, the professional may suspect that the child is repeating a story that a parent wants him or her to tell. Actual incest victims are more likely to be secretive, manifest depression, and retract the allegations before restating them (Ekman, 1989). Despite a reservoir of experience and knowledge like the preceding example, studies show that experts often find it difficult to distinguish true from false testimony in sex abuse cases (Ceci & Bruck, 1994; Lyon & Dorado, 2008). The story of the McMartin daycare scandal illustrates the unique challenges such cases present. In 1983, the mother of a two-and-a-half-year-old child called police to report that her son had been sodomized at the McMartin preschool in Manhattan Beach, California. As a result, police and social workers began interviewing hundreds of children who were or had been enrolled in the McMartin preschool. Stories of sexual abuse and satanic rituals were commonly reported. In addition to accounts of child rape and sodomy, children reported such things as the killing of a horse, being taken on an airplane to Palm Springs, being lured into underground tunnels where day care workers dressed up like witches and flew in the air, the drinking of blood and eating of feces, and the exhumation and mutilation of bodies from a cemetery. One child said they had been regularly beaten with a 10-foot-long bullwhip and taken to the Episcopal Church where they were slapped by a priest if they did not pray to three or four gods.

doublespeak

The term doublespeak has been around since the 1950s, but it was popularized by an annual doublespeak award given by the National Council of Teachers of English and in the works of William Lutz (1989; 1996; 1999). Lutz uses the term to describe a broad spectrum of language used to deliberately misrepresent reality. The more disturbing examples, he argues, are those used by people in positions of power to mislead others for their own purposes. Rosen's (2003) observations concerning the labels given to programs in the Bush administration are compatible with Lutz's definition of doublespeak:
• The "healthy forests" program allows increased logging of protected wilderness.
• The "clear skies" program permits greater industrial pollution.
• The new "opt in" feature of the Head Start program is simply a way of telling states they will now share a greater portion of the costs of the program.
Our individual reactions to doublespeak are more likely rooted in our particular expectations, attitudes, and values than in the language form itself. Unless there is a clear factual and/or tangible referent, words and phrases that don't fit our view of reality are more likely to be seen as a deceptive distortion of reality. But tangible referents are subject to interpretation as well. It is possible that the same behavior could be seen as either the work of "terrorists" or the work of "freedom fighters," depending on whose side you're on. Although the distinctions between them are not always sharp, Lutz illustrates doublespeak by discussing four ways in which language is used:
• Euphemisms: These are constructions that try to cover up what might otherwise be a painful reality. Lutz is particularly concerned about euphemisms used to soften realities he believes the public should face. One example is the use of the term "incontinent ordnance" to refer to military bombs and artillery shells that kill civilian non-combatants.
He also points out that the subject of the book you are currently reading, lying and deception, is often discussed by people in ways that make it seem less onerous—e.g., "strategic manipulation," "reality augmentation," "disinformation," or "counterfactual proposition."
• Jargon: This is specialized language that people in a particular trade or profession understand, but when it is used with outside audiences (like the general public), it can be confusing or misleading. For example, a crack in a metal support beam might be described as a "discontinuity," or a tax increase as "revenue enhancement." And instead of saying in plain language that their rocket "blew up," SpaceX is apt to use "rapid unscheduled disassembly" (Hern, 2015).
• Bureaucratese: Lutz also calls this "gobbledygook." It is a way of confusing a target audience with technical terms, jargon, long sentences, etc.—i.e., the proverbial "word salad." Lutz (1989) gives the following example. During the investigation of the 1986 Challenger space shuttle explosion, Jesse Moore, NASA's associate administrator, was asked if the performance of the shuttle program had improved with each launch or if it had remained the same. His "answer" was: I think our performance in terms of liftoff performance and in terms of the orbital performance, we knew more about the envelope we were operating under, and we have been pretty accurately staying in that. And so I would say the performance has not by design drastically improved. I think we have been able to characterize the performance more as a function of our launch experience as opposed to it improving as a function of time. (pp. 5-6)
• Inflated language: This occurs when language is specifically used to make things seem better than they really are—to "puff" them up. When the job title of a car mechanic is changed to an "automotive internist," most people don't care and the mechanic is happy.
But when Chrysler Corporation "initiates a career alternative enhancement program," and what it really means is that it is laying off 5,000 workers, people are more likely to feel that a deliberate attempt to mislead has taken place.

Thomas Aquinas

This "never" position has no exceptions—lying is always wrong. Good intentions or the absence of harm don't matter either. Even lying to save an innocent life is wrong. Aside from the practitioners of "radical honesty" (Blanton, 2005), it is difficult to find people today who are willing to publicly advocate this position, but several well-known figures from world history did. Augustine (354-430), Thomas Aquinas (1225-1274), John Wesley (1703-1791), and Immanuel Kant (1724-1804) all believed it was never right to lie (Bok, 1978). It should be noted, however, that even though Augustine and Aquinas believed that lying was always wrong and never justified, some lies could still be forgiven by God (BBC, n.d.). For these absolutist theologians and philosophers, the justification for believing it is never right to lie rests primarily on three arguments:
1. Lies, by their very nature, disrespect humanity and, like a disease, infect the liar's character and ruin his or her integrity.
2. Lying violates religious precepts. It is a sin to lie.
3. Lying begets lying. If you lie about something, you'll be tempted to lie about it again and you'll have to tell additional lies to avoid detection. These lies, in turn, lead you to lie about other things. Others will match your lies with their own and eventually society will become utterly dysfunctional.
In reality, this absolute prohibition has never been one that people could follow without provisions that excused or redefined their behaviors, as a closer look at its three supporting arguments will show. Augustine, Thomas Aquinas, and John Wesley were Christians who believed the Bible prohibited all forms of deception. Kant was not a theologian, but his writings on moral behavior were strongly influenced by his Christian beliefs. But their interpretation is seen as extreme by Christians who do not believe the Bible says that all deception is wrong.
They point to biblical examples of deceit that did not provoke God's wrath (e.g., the story of Rahab as told in the book of Joshua). And while Buddhist prayers and Jewish texts also condemn lying, they also allow for exceptions (Bok, 1978). The Quran does grant permission for Muslims to conceal their faith from non-believers (the "taqiya" doctrine), but only to avoid religious persecution (Stewart, 2013). Needless to say, those who tried to follow Augustine's prohibition against all lies found themselves sinning a lot. As a result, he conveniently developed a list of eight types of lies. All were considered sins, but some were greater sins than others. Later, Thomas Aquinas boiled the list down to three: (1) lies told in jest, (2) lies told to be helpful, and (3) malicious lies. Again, telling a lie in any category was considered a sin, but only malicious lies were considered a "major" sin. In short, they recognized that their position was problematic, but instead of changing the principle, they created loopholes. Bok (1978) explains: Many ways were tried to soften the prohibition, to work around it, and to allow at least a few lies. Three different paths were taken: to allow for pardoning of some lies; to claim that some deceptive statements are not falsehoods, merely misinterpreted by the listener; and finally to claim that certain falsehoods do not count as lies. (p. 34) Two ancient but ingenious inventions for helping people who believed in not lying (but needed to lie) were the concepts of mental reservation and equivocation. The first allowed a person to make a misleading statement to someone else (e.g., "I have never cheated on an exam . . .") and silently add the qualification in his or her mind to make it true (e.g., ". . . until the one you gave last week"). As long as the person used the mental reservation, there was no lie and no sin. Equivocation took advantage of words' multiple meanings to mislead without technically lying.
For example, when asked by persecutors whether a man they intended to kill passed this way, the equivocator might reply "He did not pass here," with "here" signifying the precise spot on which the speaker stands and not the other spots the man actually did walk through (Denery, 2015). Such controversial methods to make the "never lie" principle more compatible with normal human experience (where some deception is common) only served to highlight why such complete prohibitions don't work.

control-question technique

Some innocent people might show a strong response to a relevant question (perhaps topics like violence and murder upset them), as in the following sequence of questions in a traditional IR approach:
1. Is it afternoon? (irrelevant)
2. Is your name John Doe? (irrelevant)
3. Did you murder Ned Flanders by slicing open his intestines and chopping off his head? (relevant)
Obviously, a truthful person's strong physiological response to this third question is hardly a fair test. So the "solution" to this glaring problem was the development of control questions (also known as comparison questions), and the control/comparison question technique (CQT) was born. Irrelevant questions like one's name or time of day generally aren't going to bother anyone, but comparison questions are intended to be more alarming to non-deceptive people than even directly relevant questions, by creating uncertainty about their answers. A non-deceptive person (who believes a polygraph really works) will be far more certain about their answer to a relevant question ("Did you steal Matt Griffin's money?") than to a comparison question ("When you were a teenager, did you ever steal anything from anyone?"). Polygraph exams using the CQT typically ask three types of questions: neutral, control, and relevant. The questions are asked in this same sequence several times (adapted from Vrij, 2000, p. 176):
Neutral 1: Do you live in the United States?
Control 1: During the first 20 years of your life, did you ever take something that did not belong to you?
Relevant 1: Did you take that camera?
Neutral 2: Is your name Inigo Montoya?
Control 2: Prior to 1987, did you ever do something dishonest or illegal?
Relevant 2: Did you take that camera from the desk?
Neutral 3: Were you born in the month of November?
Control 3: Before age 21, did you ever lie to get out of trouble or to cause trouble for someone else?
Relevant 3: Did you participate in any way in the theft of that camera? 
Some modifications of the Control Question Test exist. Probable-lie CQT questions, for example, try to induce innocent people to lie on their own by choosing to answer "no." But in certain circumstances, directed-lie CQT questions might be used instead. In that case, examinees are told in advance to lie (by answering "no") rather than leaving it up to them to decide whether to lie or not (Honts, 1994; National Research Council, 2003; Raskin, Kircher, Horowitz, & Honts, 1989). The goal is to try to make sure the examinee is not telling the truth on the control questions, but critics such as Lykken (1998) say that such variations do little to improve the test. In the positive control CQT scenario, examinees are told to answer a relevant question truthfully one time and deceptively the next, on the assumption that greater physiological arousal will occur with the lie. But Lykken (1998) points out that a rape victim might have similar levels of arousal for truthful and deceptive answers to questions like, "Did he use threats and force you to have intercourse?" It is no surprise that the assumptions associated with the control question approach have been challenged:
• For one thing, people don't always react in predictable ways. A memory triggered by a word or phrase in a relevant question may heighten the arousal of an innocent person and make them seem guilty.
• As noted, the success of every polygraph exam is highly dependent on the skills of the examiner, who must be able to convince suspects that their lies will be exposed.
• The examiner must also be able to construct relevant questions that will elicit the responses liars and truth tellers are expected to give. In the question, "Did you pass the counterfeit money?" both liars and truth tellers can truthfully answer "yes." But a "no" answer to the question, "Did you know the money you passed was counterfeit?" is the truth for an innocent examinee and a lie for a guilty one. 
• Questions also need to be adapted to the person being tested—e.g., "Have you ever told a lie to avoid getting into trouble?" might be a useful control question for some examinees, but not for a person with a long criminal record. As we have seen in exploring these various techniques, a polygraph exam is not a single, standardized test, so the influence (and skill and experience) of the examiner can have a profound effect on the results. Another one of the many possible exam types, the Concealed Information Technique, is discussed next.

rationalization

Rationalizing goes hand in hand with self-deception. We "rationalize" inconsistent acts, beliefs, and feelings to ourselves in order to justify self-deception. Often this intrapersonal justification is done at a subconscious level, and the self-deceiver may not know exactly how he or she has rationalized a particular behavior unless he or she is forced to overtly confront it in a social context. Excuses may also be a vehicle for "making sense" out of irrational and/or inconsistent behavior when talking to others. It wouldn't be surprising if stating an excuse out loud was the first time a self-deceiver had consciously justified their belief or behavior even to themselves. We learn from an early age that we are better off when we can give an acceptable reason to explain why we did something—even if we really don't know why we did it. Sanford (1988) says the "thirst for rationality" is a major source of lies, and he maintains that self-deception could not exist without it. When excuses are used to rationalize behavior, they are designed to minimize the fault of the actor—i.e., masking "I have acted improperly and am guilty" with "I have done nothing wrong and am not responsible." According to Snyder (1985), excuses may:
• dismiss or deny the problem
• diminish the degree of harm done
• downplay one's responsibility for the problem
The following are examples of rationalization used as a self-defense mechanism:
• During the Holocaust, German physicians participated in gruesome and inhumane experiments on prisoners in concentration camps (Roelcke, 2004). Some viewed their work as a "duty to their country" and others saw it as "furthering scientific knowledge."
• A business executive goes to his hometown on business and visits his mother. To show her he is doing well, he takes her to a fancy restaurant. When he seeks reimbursement for the trip from his employer, he lists the high bill as a travel expense. He tells himself, "The rules in such situations aren't clear. 
After all, my mother always has good business advice." In this manner, he frames the rules in the situation as ambiguous, avoids a moral dilemma, acts as he wishes, and does not feel bad about it.
• Georgina is an alcoholic. She attends weekly meetings of Alcoholics Anonymous (AA), and this makes it harder for her to start drinking again. But recently she stopped attending these meetings. The reason she stopped attending, she said, was that her day job at a rehabilitation center involved working with addicts all day long, and she felt like another hour in the evening with addicts was more than she could handle.
Rationalization is sometimes facilitated by relationship partners. A wife, for example, may construct excuses and rationalize her husband's misdeeds, make his faults look like virtues, and downplay the significance of his shortcomings. Her behavior is based on the hypothesis that if her husband feels good about himself, the relationship will be better for her. And, indeed, Murray and Holmes (1996) found that people in satisfying relationships do exhibit these "positive illusions."

Felipe Fernandez-Armesto

Fernández-Armesto (1997) suggests that people throughout history have sought and experienced truth in one or more of the following ways:
1. Felt truth, based on personal feelings/emotions
2. Told truth, received from other people or sources
3. Reasoned truth, arrived at through the use of logic
4. Observed truth, perceived via our senses
In this view, any given truth a person acquires is derived from one or more of these sources. And we often determine the truth of something in different ways as time passes. For example, we may initially believe something because we feel it, then later reinforce that belief by reading a book by someone else who has the same belief and, many years from now, recall that our belief came about through reasoning. Each approach has enjoyed a special prominence during different human epochs (the Enlightenment, for example, was known as the Age of Reason [QR]). On a much smaller scale, these approaches provide a useful lens for helping us understand how people find the truths they use to cope with the uncertainties of daily life. We'll consider each in turn. As Fernández-Armesto (1997) says: We cannot get to the conclusion that Socrates is mortal without saying first, "If all men are mortal," or that spheres are round without first specifying roundness as a defining characteristic of a sphere. "You can only find truth with logic," as Chesterton said, "if you have already found truth without it." (p. 119)

Ernest Becker

In his Pulitzer Prize-winning work The Denial of Death, anthropologist Ernest Becker (1973) argued humans are so frightened by awareness of their own mortality that they have created an elaborate and collective defense mechanism to buffer them from the fear: the notion of "civilization" or culture. We are able to transcend the fear of death through the noble work of "civilization" (the arts, philosophy and religion, law, government, etc.). By focusing our attention and efforts on the "immortality project" of civilization, our symbolic selves may enjoy a sense of eternal life our physical selves cannot. This in turn gives people the feeling that their lives have meaning, a purpose, and significance in the grand scheme of things. Although critically acclaimed, Becker's book has been controversial for many reasons, not the least of which is his claim that culture in general and religion in particular are mechanisms of mass self-deception. Moreover, Becker asserts that humans will need new "illusions" to replace these mechanisms as advances in science and engineering rob them of their capacity to distract us from death thoughts. He does not speculate about what these new illusions might be, instead recommending that people come to grips with their own mortality and thereby reduce its power to make them self-deceive.

civilization

There is another variation on mechanisms like denial and rationalization, but in this version it isn't a single person or any one group that's experiencing it. Instead, it's all of humanity, and rather than calling it rationalization or denial, we know it by a much more common name: civilization. In his Pulitzer Prize-winning work The Denial of Death, anthropologist Ernest Becker (1973) argued humans are so frightened by awareness of their own mortality that they have created an elaborate and collective defense mechanism to buffer them from the fear: the notion of "civilization" or culture. We are able to transcend the fear of death through the noble work of "civilization" (the arts, philosophy and religion, law, government, etc.). By focusing our attention and efforts on the "immortality project" of civilization, our symbolic selves may enjoy a sense of eternal life our physical selves cannot. This in turn gives people the feeling that their lives have meaning, a purpose, and significance in the grand scheme of things. Although critically acclaimed, Becker's book has been controversial for many reasons, not the least of which is his claim that culture in general and religion in particular are mechanisms of mass self-deception. Moreover, Becker asserts that humans will need new "illusions" to replace these mechanisms as advances in science and engineering rob them of their capacity to distract us from death thoughts. He does not speculate about what these new illusions might be, instead recommending that people come to grips with their own mortality and thereby reduce its power to make them self-deceive.

Utilitarianism

English philosopher Jeremy Bentham (1748-1832) is normally credited with initiating a philosophical position called utilitarianism—that decisions are moral to the extent they promote happiness and immoral to the extent they do the reverse. Within this framework, moral decisions about lies would be based on the consequences of telling them. Utilitarianism favors those things that benefit the most people and those outcomes in which the advantages outweigh the disadvantages (Gorovitz, 1971; Hearn, 1971; Robinson, 1994; Smart & Williams, 1973). It would be "right" to tell a lie, then, that resulted in benefits that outweighed the problems or that benefited the most people. As a result, a utilitarian might determine that the telling of a lie in a particular situation was justified even though they simultaneously maintained the belief that lying usually causes more harm than good and should be avoided.

Paul Ekman

Many law enforcement and military organizations are obsessed with finding a machine that reliably detects lies. Dr. Paul Ekman found this out following the 9/11 attacks. Ekman knows as much about liar behavior and how to accurately observe it as anyone in the world. He has trained members of the FBI, Secret Service, police units, and military intelligence on the observation of liar behaviors. But when he inquired how he might help the CIA, Department of Defense, and other federal agencies, one representative told him, "I can't support anything unless it ends in a machine doing it" (Henig, 2006).

change blindness

Scientific studies, however, tell a different story. Sometimes we do not observe large changes in objects and scenes (change blindness), and sometimes we do not even perceive certain highly visible objects in our visual field (inattentional blindness). For example, drivers may fail to notice another car when trying to turn or a person may fail to see a friend in a movie theater while looking for a seat, even though their friend is waving. Our brain is constantly trying to make sense out of an environment filled with a tremendous array of changing stimuli that vary in intensity. As a result, the brain tries to create a meaningful narrative and overlooks those stimuli that don't fit the narrative being created.

impressionism

Impressionism: Impression, Sunrise (Claude Monet, 1872). Impressionism is a 19th-century art movement characterized by relatively small, thin, yet visible brush strokes, open composition, emphasis on accurate depiction of light in its changing qualities (often accentuating the effects of the passage of time), ordinary subject matter, inclusion of movement as a crucial element of human perception and experience, and unusual visual angles. Impressionism originated with a group of Paris-based artists whose independent exhibitions brought them to prominence during the 1870s and 1880s.

truth bias

A lie detector's accuracy score is typically derived from the number of correct identifications of both truth tellers and liars. However, human observers tend to perceive more messages as truthful than deceptive, which results in a greater accuracy for truthful messages and a lower accuracy for deceptive ones. This "truth bias," which negatively affects our ability to detect lies, is a result of both social and cognitive processes. DePaulo (1994) summed it up this way: "the empirical fact is that most people seem to believe most of what they hear most of the time" (p. 83).

confirmation bias

Another source is confirmation bias—i.e., the general human tendency to seek out and interpret new information that confirms our prior beliefs or goals (see Chapter 6). Many people who believe stereotypes about lying behavior claim they are consistent with their past experience in successfully spotting liars (Castillo & Mallard, 2012). In all likelihood, they are selectively remembering a time or two when the stereotypes were confirmed but forgetting the majority of instances when they were not. Confirmation bias is the chief mechanism by which stereotypes of many kinds become entrenched in people's beliefs and attitudes (Aronson & McGlone, 2008).

awareness level

Awareness Level: Did the person knowingly and consciously perform the falsehood in question? Was it planned? The answers to such questions matter because, in most cases, lying is perceived as something done with a high degree of awareness. This is why accused liars will sometimes feign incompetence: "What? I said that? No way. If I did, I must have been completely out of my mind." If this person's defense is believed, the attribution of outright lying is less likely and any accompanying sanctions much less severe.

sociopath

The DSM-5 lists sociopathy and psychopathy as subtypes of antisocial personality disorder. Some mental health professionals use the terms interchangeably, but others argue there are important distinctions between the two (e.g., Pemment, 2013). Sociopaths tend to be nervous and prone to emotional outbursts, including fits of rage. They also tend to be uneducated, living on the edge of society, and unable to hold a steady job. While they have difficulty forming attachments with individuals, sociopaths may form attachments to groups with an antisocial orientation (street gangs, white supremacists, etc.). Any crimes they commit are typically spontaneous, haphazard, and disorganized.

general progression of child development through age stages

Between the ages of 2 and 3, children will often make false statements, but it's important to recognize that these utterances bear little resemblance to the teenage or adult versions. The following qualifiers are key to understanding the differences:
• Perhaps the most common form of false statements at this age involves denials of wrongdoing. These denials usually serve one of two purposes: 1. To avoid punishment: "Billy did it. Not me." 2. To get rewarded for good behavior: "I cleaned my plate. I get a cookie now."
• They seem to have very little understanding of the effects their behavior might have on their intended target(s).
• Some false statements are simply mistakes based on a limited knowledge of the language they are learning (and experimenting with) during this period of development.
• Some of these false statements are simply the result of poor memory.
• Sometimes a child's everyday reality is enriched by his or her fantasy life (having an imaginary friend, for example).
• At this age, some false statements are the result of wishful thinking—e.g., saying "My Mom is going to take me to Disneyland" when Mom has never mentioned this possibility.
In most kids, the skills typical of adult lying ability (perspective-taking, intentionality, behavioral control, etc.) have yet to develop significantly. Nevertheless, a few youngsters may be running slightly ahead of the pack, as these examples illustrate:
• An experiment by Chandler, Fritz, and Hala (1989) found evidence that some children between 2 and 3 did perform acts intended to mislead others into believing something that was false. A puppet hid a "treasure" in one of several containers and children were told the purpose of the game was to hide the treasure from adults who were searching for it. Some of these children not only erased the tracks left by the puppet to the container holding the treasure, but made new tracks leading to an empty container. 
• Sodian, Taylor, Harris, and Perner (1991) also found a few children under 3 years who understood the idea of creating a false belief, but even these children needed prompting and rarely anticipated the effects of their deception on the target's beliefs.
Even the presence of older siblings does not seem to help a great deal in facilitating the understanding of false beliefs for 2- and 3-year-olds, but it can speed up such learning with children over the age of three-and-a-half (Ruffman, Perner, Naito, Parkin, & Clements, 1998).
Ages 3-6
Children develop a theory of mind and thereby acquire perspective-taking ability typically between the ages of 3 and 6. The combination of this ability and their expanding knowledge of intentionality and social norms leads children in this age range to lie with increasing frequency and skill. A number of studies support this conclusion. One of the first forms of deception employed by children in this age range is of the "I didn't do it" variety, in which they violate orders issued by adults and then attempt to conceal it:
• Researchers have studied this phenomenon in "temptation resistance" experiments. A version of this test (QR) was recently featured on the British TV series What Would Your Kid Do? In the original scientific versions of these studies, a child is seated in a room and a toy is placed behind her. An adult experimenter then instructs the child not to peek at or play with the toy for several minutes while the adult leaves the room. The child is covertly monitored while alone in the room and, when the experimenter returns, is asked whether she followed the instructions. Many children don't obey the order, so researchers are watching to see if they confess the transgression or lie about it (Lewis, 1993; Lewis, Stanger, & Sullivan, 1989). In numerous studies using this procedure (reviewed in Lee, 2013), most children between 3 and 5 lie in this situation, but the youngest ones aren't especially convincing. 
For example:
o When 3-year-olds lie about peeking at the toy and then are later asked by the experimenter to "guess" what the toy might be, many blurt out its name without hesitation, revealing that they both violated the instructions and lied.
o A 5-year-old girl who lied about peeking later said, "I didn't peek at it. I touched it and it felt purple. So, I think it is Barney."
o Many 6-year-old peekers subsequently feign complete ignorance of the toy's properties (Evans, Xu, & Lee, 2011).
Leekam (1992) says that by age 4 or 5, children "understand the effects of a false message on a listener's mind, recognizing that the listener will interpret and evaluate a statement in the light of their existing knowledge." In support of this, Sodian et al. (1991) say that by age 4 most kids have developed an understanding of false beliefs. These children have a lot to learn about how complex the other person's perspective really is and how many different ways it can be tapped, but the basic mechanism for developing this knowledge is now beginning to function. As they get older, children incrementally learn to avoid such blatant inconsistencies. Moreover, somewhere in this age range, children begin to tell white lies in situations for which social norms dictate that they not convey awkward truths:
• Talwar and Lee (2002) asked 3- to 6-year-olds to take a photograph of an adult who had a large red mark on his nose. Most children lied to this adult when he asked "Do I look okay for the photo?" but later told someone else that he did not look okay.
• In a similar study, children in this age range were given an undesirable present (a bar of soap) but told the giver that they liked it, even though their behavior while opening it clearly indicated disappointment (Talwar, Murphy, & Lee, 2007). In this wildly popular Vine (QR), for example, Henry does his best to politely feign enthusiasm for the birthday avocado he's just unwrapped. 
Despite their sometimes imperfect manifestations of it, this is also a time when we can see a child's early efforts at behavioral control to conceal a lie:
• In-depth analyses of children's nonverbal behaviors by Talwar and Lee (2002) reveal that those in the act of telling a lie mimic the behaviors of people who tell the truth (e.g., making direct eye contact with the listener).
• When the situation calls for children to avert their gaze when telling the truth (because they have to ponder the answer to a question), they also deliberately avert their gaze when lying (McCarthy & Lee, 2009).
• By the age of 6, a child's nonverbal concealment behaviors are coordinated and natural enough to convince many adults that they are telling the truth, including their parents, teachers, social workers, police officers, and judges (Crossman & Lewis, 2006).
Ages 6-9
Beyond age 6, the deception phenomenon in children begins to evolve dramatically. When a young child begins spending time with other children in school, new developmental challenges arise. The process of developing and managing new interpersonal relationships and undertaking new tasks may create new conditions for lying. Ford (1996) says some children experience what he calls "double bookkeeping"—keeping family secrets that might be embarrassing or espousing beliefs that fit one's peer group, but not one's family. In addition, many children in this age range are spending a lot of time playing board, card, and sports games that highlight the need for deceptive skills in order to win the game. Vasek (1986, p. 288) puts it this way: "Games, then, provide a situation in which children can practice deception and its detection, learn about its functions, and become acquainted with the social implications of its use." 
During this period, children are facing a variety of conditions that may prompt them to lie, and an increasing variety of situations provide ample opportunities to practice, elicit feedback, and refine their deceptive skills. The teenagers Ekman (1989) interviewed recalled that their first experience in "getting away with" a lie was when they were between 5 and 7 years old. Whereas some young communicators will gain confidence in their deceptive ability, others will be reminded that they still have a lot to learn, as the following classic dialogue illustrates (Krout, 1931, p. 23):
"Hello Miss Brown, my son is very ill and, I am sorry to say, cannot come to school today."
"Who is talking?" asked the teacher.
"My father," the boy answered.
This is also a time when adults start pressing children to learn politeness norms (which often require deception). Saarni (1984) promised an attractive toy to groups of 6-, 8-, and 10-year-old children if they performed a particular task for her. After completing the task, the children were given a less attractive toy than had been promised and their facial expressions were observed. Analysis of the expressions showed that as children get older, they show less disappointment in the face. The girls in Saarni's study manifested this ability to facially mask their disappointment in the name of politeness earlier than the boys.
Ages 10-12
By the end of this period, most children have developed adult-like deception skills. This doesn't mean that these kids have nothing more to learn—only that many 10- to 12-year-old children are able to (and do) lie without being detected. Ten-year-olds with more Machiavellian tendencies may be especially adroit at deception, both by commission and by omission. By about age 11, they also think about lying and truth telling differently. Their views are in sharp contrast to those of 5-year-olds. 
For example, most no longer believe it is always wrong to lie and fewer are willing to say they've never lied (Peterson, Peterson, & Seeto, 1983). Adults, in turn, hold these pre-teens responsible for knowing what they are doing. Along with their increasing verbal skills, children in this age range also show a greater sophistication in their ability to manage their nonverbal behavior (DePaulo & Jordan, 1982; Talwar & Crossman, 2011). Notably, the encoding skills of 11- and 12-year-old girls are likely to be superior to those of their male counterparts.
Ages 13-18
With adult-like deception ability in place by the beginning of adolescence, children practice their skills in an ever-expanding range of social interactions. In this period, teens not only hone their skills, but also develop more sophisticated reasoning about whether and when lying serves their interests. The decision to lie or not depends in part on a consideration of whether it will assist in the attainment of a goal and at what cost. The weighing of various factors in this cost-benefit analysis becomes more complicated with age. In particular, adolescents give more thought to probabilities than younger children, considering not merely the punishment for getting caught but also the different likelihoods of getting caught in various circumstances. They also consider consequences of getting caught that extend beyond the immediate context, such as disappointment in the eyes of friends, parents, and teachers. In particular, parental disappointment is a consequence that could hinder the expansion of autonomy children crave in their teens. Thus if getting caught seems like more than a remote possibility, a teen might be hesitant to risk this anticipated cost regardless of a lie's immediate benefit (Perkins & Turiel, 2007). As the first generation to grow up immersed in an online world, teens today have opportunities to deceive via technological channels that their parents didn't have at their age. 
What's more, teens often exploit their parents' lack of experience and technical limitations to engage in digital deception that can be risky, rude, and sometimes illegal. Consider the results of this survey of over a thousand teenagers and their parents about online behavior (McAfee, 2013):
• About half of the teenagers admitted searching the Internet for material they believed their parents would not approve of (pornography, simulated or real violence, etc.). When asked, 86% of the parents didn't believe their children would do these things.
• About 70% of teens overall reported hiding their online behavior from their parents. The frequency with which the young respondents reported digital deception was clearly fueled by their parents' complacency and cluelessness.
• 62% of parents reported believing their kids cannot get in serious trouble online.
• Only 40% of parents reported using software to monitor or restrict their children's online behavior. More than half of the children of these parents claimed to know how to bypass it.
Another way to look at how children's capacity for deception evolves into its adult form is to step back and consider it as a series of critical developmental leaps.

staged events

Both major political parties have a long history of creating props, planting audience questions, scripting the president's behavior, and in other ways manipulating events to ensure they cast an impression of a leader who is positive, confident, and in control. In 2005, a satellite television feed accidentally captured preparations for a reportedly "spontaneous" give-and-take videoconference President Bush scheduled with American troops in Iraq. It was billed as a chance for the president to hear directly from the troops, and White House reporters were summoned to witness the event. The participating soldiers were handpicked and coached for 45 minutes prior to the interaction on what questions the president would ask, what answers they should give, and how best to express their answers. Even with the preparation, there were awkward moments when the answers didn't match the questions. At first, the existence of any rehearsal was denied by the White House and Pentagon. Later, a White House spokesman said the prepping was done in order to make the soldiers feel at ease (VandeHei, 2005).

perspective taking

Children develop a theory of mind and thereby acquire perspective-taking ability typically between the ages of 3 and 6. The combination of this ability and their expanding knowledge of intentionality and social norms leads children in this age range to lie with increasing frequency and skill. A number of studies support this conclusion. As adults, we take for granted that people have differing needs, beliefs, attitudes, interests, and priorities. But we were not born knowing these things. Somewhere along the way, we had to develop this sense of perspective. Human infants are profoundly "egocentric"—i.e., unable to comprehend that someone else may have a different mental experience from their own and consequently unable to take another person's perspective (Piaget, 1954). As young children develop, they not only learn that other perspectives exist, but also learn how to take those perspectives and use them. Children who can recognize that other people are independent entities who can think on their own are thus said to have developed a "theory of mind" (QR) (McHugh & Stewart, 2012). A coherent theory of mind typically emerges between ages 3 and 5 (although rudiments of this skill, such as following another person's gaze to understand what he or she is looking at, appear earlier). Failure to acquire a theory of mind and perspective-taking skills is a hallmark symptom of autism, a developmental disorder that usually appears early in life (Korkmaz, 2011). But even for adult humans, perspective-taking can be challenging. Its accuracy is hindered by the "other minds problem," which occurs because we can never know from a first-person perspective exactly how things are experienced in another person's mind. Some scholars argue that a true understanding of theory of mind is unique to the human species (e.g., Penn & Povinelli, 2007).
Regardless, perspective-taking has important social implications: • In both children and adults, it is often associated with greater empathy, prosocial behavior, and more favorable treatment of the person (or group) whose perspective is taken. • Instructing people to take the perspective of a person in need often increases feelings of compassion and leads to offers of help (Malle & Hodges, 2005; Vasek, 1986). • Taking the perspective of another is also essential for someone to engage in deception. Ceci, Leichtman, and Putnick (1992) explain it as the ability to "substitute belief for disbelief, in accepting the stance of the other." Perspective-taking enables liars to understand the idea of a false belief held by the target. • Knowledge gained through perspective-taking is also invaluable to deceivers in determining what messages are likely to create that false belief. • Some lies can become terribly complex, and the liar's ability to anticipate the target's behavior is the difference between a successful and a failed lie.

inserting

By definition, inserting part of one photo into another or combining two separate photos is an act of fiction. If the audience is unaware of the insertion, then it becomes an act of deception as well. In an official photo taken at the 1999 wedding of England's Prince Edward, his nephew, Prince William, did not smile. So William's smiling face from another photo was digitally inserted into the official photo and all was well in the royal household. The same process of insertion is used to "bring back" deceased friends and family members by digitally including them in current photos. As mentioned previously, movies often insert living actors into films with deceased people or insert deceased actors into new films (from Forrest Gump to Rogue One). Since movies are designed to be entertaining, the intent of these insertions is rarely considered deceptive. But attributions of deception can still happen—even if the creator considered it a humorous act rather than a primarily deceptive one (e.g., a 1989 cover of TV Guide placed Oprah Winfrey's head on the body of movie actress Ann-Margret in what was intended by the artist as a lighthearted illustration; it didn't help that "Oprah" was sitting provocatively on a pile of cash). As a general practice, circulating images with the heads of female celebrities on the bodies of other women has become so common that special websites exist whose purpose is to catalog the fakes. The University of Wisconsin tried to make the school seem more diverse by inserting the face of an African-American student into a brochure photo showing a group of football fans. A lawsuit brought by the student edited into the photo resulted in a $10 million "budgetary apology" to be used for recruitment of minority students and diversity initiatives (Paul, 2014). Such diversifying of universities (in photography, at least) is widespread.
Pippert, Essenburg, and Matchett (2013) analyzed over 10,000 photos from 165 institutions of higher education in the United States and found that the majority of schools portray diversity in marketing materials that differs significantly from the actual student population. The modern trend by universities is to use student actors in photos and videos who are "racially ambiguous" or who appear multi-ethnic (as in this stock photo). That is, instead of risking the perception or claims that they are inflating the proportion of non-Caucasian students in their advertising materials, schools are increasingly using images of people who are not easily regarded as any particular ethnicity. In 2003, Los Angeles Times photographer Brian Walski combined two photographs captured moments apart of a British soldier directing Iraqi civilians to take cover during combat. But when an employee of the Hartford Courant (which had used the photo) noticed that the image appeared altered, an investigation ensued and Walski was soon dismissed from the Times (Irby, 2003). Apparently, Walski preferred the blend of the two photographs to either one of them and didn't feel that it significantly changed the story behind the event. But the Times and other newspapers depend on strict policies about altered photographs in order to maintain their credibility as sources of news. When news photos are altered for any reason, it raises questions about the extent to which other reported information might also be altered to tell a better story. Even though some degree of damage may have occurred in the preceding examples, the originators would claim that harm was not their intent. There are, however, other examples of insertions that were created explicitly for the purpose of hurting others—too many to count, in fact. But for a fairly up-to-date listing at any point in time, check out the "Fauxtography" feature at Snopes.
There you'll find the good, the bad, and the truly ugly sides of digital fakery.

theory of the mind

Children develop a theory of mind and thereby acquire perspective-taking ability typically between the ages of 3 and 6. The combination of this ability and their expanding knowledge of intentionality and social norms leads children in this age range to lie with increasing frequency and skill. A number of studies support this conclusion. As adults, we take for granted that people have differing needs, beliefs, attitudes, interests, and priorities. But we were not born knowing these things. Somewhere along the way, we had to develop this sense of perspective. Human infants are profoundly "egocentric"—i.e., unable to comprehend that someone else may have a different mental experience from their own and consequently unable to take another person's perspective (Piaget, 1954). As young children develop, they not only learn that other perspectives exist, but also learn how to take those perspectives and use them. Children who can recognize that other people are independent entities who can think on their own are thus said to have developed a "theory of mind" (QR) (McHugh & Stewart, 2012). A coherent theory of mind typically emerges between ages 3 and 5 (although rudiments of this skill, such as following another person's gaze to understand what he or she is looking at, appear earlier). Failure to acquire a theory of mind and perspective-taking skills is a hallmark symptom of autism, a developmental disorder that usually appears early in life (Korkmaz, 2011). But even for adult humans, perspective-taking can be challenging. Its accuracy is hindered by the "other minds problem," which occurs because we can never know from a first-person perspective exactly how things are experienced in another person's mind. Some scholars argue that a true understanding of theory of mind is unique to the human species (e.g., Penn & Povinelli, 2007).
Regardless, perspective-taking has important social implications: • In both children and adults, it is often associated with greater empathy, prosocial behavior, and more favorable treatment of the person (or group) whose perspective is taken. • Instructing people to take the perspective of a person in need often increases feelings of compassion and leads to offers of help (Malle & Hodges, 2005; Vasek, 1986). • Taking the perspective of another is also essential for someone to engage in deception. Ceci, Leichtman, and Putnick (1992) explain it as the ability to "substitute belief for disbelief, in accepting the stance of the other." Perspective-taking enables liars to understand the idea of a false belief held by the target. • Knowledge gained through perspective-taking is also invaluable to deceivers in determining what messages are likely to create that false belief. • Some lies can become terribly complex, and the liar's ability to anticipate the target's behavior is the difference between a successful and a failed lie.

McMartin daycare scandal

In 1983, the mother of a two-and-a-half-year-old child called police to report that her son had been sodomized at the McMartin preschool in Manhattan Beach, California. As a result, police and social workers began interviewing hundreds of children who were or had been enrolled in the McMartin preschool. Stories of sexual abuse and satanic rituals were commonly reported. In addition to accounts of child rape and sodomy, children reported such things as the killing of a horse, being taken on an airplane to Palm Springs, being lured into underground tunnels where day care workers dressed up like witches and flew in the air, the drinking of blood and eating of feces, and the exhumation and mutilation of bodies from a cemetery. One child said they had been regularly beaten with a 10-foot-long bullwhip and taken to the Episcopal Church where they were slapped by a priest if they did not pray to three or four gods. Sounds hard to believe, doesn't it? Not for the prosecutors, who were so sure of their case that they charged seven people. The multi-year trials that captured national headlines involved a series of acquittals, mistrials, and deadlocked juries. During this time, Peggy McMartin Buckey and her son spent several years in jail and their life savings on legal fees. In 1990, all defendants were acquitted (Eberle & Eberle, 1993; Nathan & Snedeker, 1995). The mother who made the original complaint was later determined to be a paranoid schizophrenic. Neighbors and parents who stopped by the day care facility during the day could not corroborate these bizarre happenings and the police were not able to find any physical evidence (e.g., tunnels, witch costumes, and horse bones) to support the allegations. Instead of shutting down the investigation, however, the lack of evidence and corroboration forced prosecutors to make the testimony of the children paramount to the outcome of their case. Were these children telling the truth?
It turned out, of course, that they were not—but why? Videotapes of the initial interviews with children were revealing: • It was not uncommon for adult interviewers to use leading questions and show children dolls with shockingly realistic genitalia ("He did touch you there, didn't he?"). • Outright coercion was also used—e.g., praising kids who confirmed the offenses and bizarre happenings and telling those who didn't that they were "dumb." • Sometimes the answers children gave to court-appointed interviewers were the result of first being "coached" (intentionally or not) in discussions with their parents. As an adult many years later, one of the children admitted he lied in order to please the people who were questioning him (Zirpolo, 2005). Even though the McMartin case was perhaps the most widely publicized in the United States, there were several similar cases here and abroad during the 1980s. In addition, the allegations of sexual abuse in child custody cases were increasing at this time (indications suggest that between 36% and 50% were later proved to be untrue; Benedek & Schetky, 1985; Cramer, 1991; Ekman, 1989; Green, 1986). Given the obvious importance of determining the truthfulness of children in situations like this, researchers have closely examined issues surrounding a child's competency to tell the truth in court and the extent to which they are subject to adult influence or "suggestion."

Munchausen syndrome

People exhibiting factitious disorder imposed on self (FDIS) are being deceptive about some aspect of an illness in order to encourage attention from medical providers, friends, or family members. The psychological motivations are complex, but are based on the need to relieve emotional distress by playing the role of a sick person (Savino & Fordtran, 2006). A severe personality disorder is often at the heart of this syndrome, but there can be other driving forces. Ford (1996) identified three possibilities: 1. A person who feels vulnerable and incompetent in other areas of life may feel clever, skillful, and powerful by fooling physicians and nurses. 2. For a person who needs a clear and well-defined identity, a "sick person," particularly one with a serious disease, satisfies that need while simultaneously generating attention and feelings of self-importance. 3. A person with an excessive and/or unmet need to be cared for and nurtured may also find satisfaction in being treated for a false disease. Patients with FDIS may exaggerate the extent of actual symptoms, or they may make up symptoms they don't have. They could feign an illness altogether or, in extreme cases, purposely make themselves sick, engage in self-injury, or tamper with the results of medical tests (Mayo Clinic, 2018). In one documented case, a patient was admitted to over 400 hospitals (Boyd, 2014). The most extreme cases of factitious disorder were previously referred to as Munchausen syndrome (named after an 18th-century military officer known for extreme exaggeration). Patients with FDIS are routinely examined by physicians. To be successful in their deception, therefore, they need to have an extensive knowledge of the diseases they are faking. But the ruse isn't always easy to get away with. Doctors may become suspicious when patients report symptoms that sound too much like descriptions in medical textbooks.
Those who carry on their pretense even when they are not being scrutinized by a physician may also enhance their chances of deceiving others. Side note: Pretending to be sick once in a while to get out of a commitment or take a day off does not qualify as FDIS.

Niccolo Machiavelli

Since it prohibits nothing, this perspective has the advantage of being totally flexible—if you're the liar, that is. If you're the target, then it's totally forced. For the liar, the rightness or wrongness of deception (or of truth) is an irrelevant measure. Unlike Bok's tests, the only question these would-be liars must ask is, "Does this behavior help me or hinder me from accomplishing my goal?" The liar is the sole moral arbiter and what is "right" is whatever serves the liar best. The best known advocate of this position is Niccolò Machiavelli, an Italian statesman and political philosopher who lived 500 years ago. His book The Prince (1532/2010) was designed as a leader's guide to effective governance. In Chapter 18 of his book, Machiavelli argues that a prince should know how to be deceitful when it suits his purpose. However, he cautions, the prince must not appear deceitful. Instead, he must seem to manifest mercy, honesty, humaneness, uprightness, and religiousness. By emphasizing the strategic manipulation of others for one's own benefit, Machiavelli's name is now used by many people to signify things like deceit, treachery, and opportunism in human relations. Most people think there are certain occasions where "the end justifies the means"—e.g., lying to survive or to protect national security. Almost 2,000 years ago, the Roman rhetorician Quintilian said there was "no disgrace" in "making use of vices" to accomplish your goals and speaking "the thing that is not" (1922, Chapter II, xvii, 26). Quintilian, however, was assuming that the speaker's goals were noble ones and therefore justified the use of deception. But for the highly Machiavellian person (or "high Machs"), lying is a justified means to accomplish any particular end or goal. To his credit (or not), Machiavelli practiced what he preached.
In a letter to a friend he wrote: For a long time I have not said what I believed, nor do I ever believe what I say, and if indeed sometimes I do happen to tell the truth, I hide it among so many lies that it is hard to find. (R. Greene, 2002, p. 321) To more fully appreciate the Machiavellian belief system (and your relationship to it), try answering the questions in Figure 3.1 below.

repression

Repression describes what happens when a person mentally blocks ideas and feelings from reaching consciousness. They are ideas and feelings that are likely to cause pain, anxiety, or threat. It is a motivated amnesia or an unwillingness to recall. In Goleman's (1985, p. 117) words, it is "forgetting and forgetting we have forgotten." Traumatic events may trigger repression. For example, a sexual assault survivor may not recall information about the rape or a soldier can't recall killing people. In situations like this, an effort to retrieve the repressed information can create enough anxiety so that the accuracy of the memory may suffer a great deal.

equivocation

Two ancient but ingenious inventions for helping people who believed in not lying (but needed to lie) were the concepts of mental reservation and equivocation. The first allowed a person to make a misleading statement to someone else (e.g., "I have never cheated on an exam . . .") and silently add the qualification in his or her mind to make it true (e.g., ". . . until the one you gave last week"). As long as the person used the mental reservation, there was no lie and no sin. Equivocation took advantage of words' multiple meanings to mislead without technically lying. For example, when asked by persecutors whether a man they intended to kill passed this way, the equivocator might reply "He did not pass here" with "here" signifying the precise spot on which the speaker stands and not the other spots the man actually did walk through (Denery, 2015).

The Words

When shallow wannabe-writer Rory (Bradley Cooper) finds an old manuscript tucked away in a bag, he decides to pass the work off as his own. The book, called "The Window Tears," brings Rory great acclaim, until the real author (Jeremy Irons) shows up and threatens to destroy Rory's reputation. Cut to Clayton Hammond (Dennis Quaid), a writer whose popular novel "The Words" seems to mirror Rory's story, leading to speculation that the tome is Hammond's thinly veiled autobiography.

realism

realism: The Black Jacket (Édouard Manet, 1868). In its specific sense, realism refers to a mid-nineteenth-century artistic movement characterised by subjects painted from everyday life in a naturalistic manner; however, the term is also used more generally to describe artworks painted in a realistic, almost photographic way.

pseudo events

A "pseudo-event" is an event that occurs solely for the purpose of generating media and public attention. The term was coined by historian Daniel Boorstin (1992) to describe a tactic perfected by Edward Bernays, who founded the field of public relations and also happened to be a nephew of psychoanalysis pioneer Sigmund Freud. A crucial element of Bernays' strategy was to create pseudo-events, which on their surface appeared to have a purpose different from their true goal. For example, consider a famous pseudo-event Bernays created in 1929 during the Easter Parade in New York City. A group of well-dressed women led by Bertha Hunt started a scandal by walking into a crowded Fifth Avenue during the height of the parade and lighting up cigarettes.

lie of omission

Awareness level is closely related to a second key context for evaluating deception—whether or not the message contained altered information. When deceivers who deliberately altered information are questioned, we are generally accusing them of that most straightforward of lies—the lie of commission. Lies of omission may be judged just as harshly, but in that case because the liar deliberately left out some critical, game-changing detail rather than altering any facts. Whether commission or omission, our perception is that the information was conveyed in a less-than-straightforward way, and, above all, with the will to deceive.

Fred Demara Jr.

Whether you spell it imposter or impostor, Fred Demara Jr. was probably the most successful one of the 20th century (Crichton, 1959). In fact, a Hollywood version of his extraordinary life was portrayed in the movie, "The Great Impostor" (1961). He did not graduate from high school but would one day manage to convincingly pose as a PhD in Psychology. His work masquerades included teacher, deputy sheriff, college dean, monk, civil engineer, assistant prison warden, and a cancer biologist. At one point, he actually deserted the Army and then joined the Navy. He later deserted the U.S. military altogether and joined the Canadian Navy (where he successfully impersonated a ship's doctor, even performing surgery). He reportedly performed the tasks associated with each role quite well, and his surgical skill was featured in newspaper articles. He spent time in prison, but he did not seem to be driven by the desire to take advantage of people for his personal financial gain. In fact, he liked helping people. But there was also a drive for identity and he seemed to enjoy the experience and challenge of living lives other than the one he had. In each of the impostures he performed, Demara established his credentials with faked or stolen documents. He kept his impostures within the areas he had experienced—the Catholic Church, the military, educational institutions, and prisons. On the job, he sought ways to fulfill others' needs. He fostered feelings of affinity from others and with the aid of a little self-deception showed complete confidence in his ability to perform job-related tasks. He was able to learn things quickly. On one occasion, he managed to successfully extract an infected tooth from his ship's captain after reading about the procedure the night before. Konnikova's 2016 best-selling book, The Confidence Game: Why We Fall for It . . .
Every Time, chronicles Demara's exploits in depth, revealing some unsavory details that Demara managed to slip past his own biographer (whom he later impersonated).

lie of commission

Awareness level is closely related to a second key context for evaluating deception—whether or not the message contained altered information. When deceivers who deliberately altered information are questioned, we are generally accusing them of that most straightforward of lies—the lie of commission. Lies of omission may be judged just as harshly, but in that case because the liar deliberately left out some critical, game-changing detail rather than altering any facts. Whether commission or omission, our perception is that the information was conveyed in a less-than-straightforward way, and, above all, with the will to deceive.

histrionic personality disorder

The word "histrionic" is associated with excessively dramatic and emotion-laden behavior, like that of an actor. These affectations are designed to call attention to themselves. The speech of a person with this disorder may be theatrical; their appearance and dress may be flamboyant. They are not comfortable for long if not the center of attention. Like all the preceding personality disorders, people with a histrionic personality disorder (HPD) are self-centered and demanding, but unlike antisocials, they are not violent. Histrionic individuals seek to: build their self-esteem by calling attention to themselves, use flattery to get others to meet their needs, engage in sexually seductive or predatory behavior, respond positively to suggestions by others, and refuse to dwell on frustrations and unpleasantries. People with this disorder may engage in deception in order to garner and hold attention and to offset feelings of threat and rejection. A person with HPD might, for example, spontaneously make up a story for his or her co-workers that he or she was nearly run over by a circus clown riding a bicycle while returning from lunch. It is the kind of story sure to draw a crowd. And, like the narcissist, a person who depends on being the center of attention to boost his or her self-esteem is dependent on proficient self-deception. All of this may help to explain why former Penn State assistant football coach Jerry Sandusky was diagnosed with HPD during his trial over the alleged sexual abuse of 10 boys. Exhibits at trial included "creepy love letters" Sandusky sent to one of his victims. In one of them, Sandusky wrote, "Yes, I am a 'Great Pretender.' I pretend that I can sing. I pretend about many things. However, I can't pretend about my feelings and want you to always remember that I care" (Cosentino, 2012). To be sure, Sandusky's ill-advised phone interview (QR) with NBC's Bob Costas didn't help his case any.
When Costas asked him, "Are you sexually attracted to young boys?" Sandusky prevaricated, saying that he was attracted to young people in the sense that he enjoyed being around them. It took him more than 16 seconds to get around to saying the word 'no' (Carmichael, 2011). "In retrospect," he said in response to another question, "I shouldn't have showered with those kids" (Greene, 2011). From prison, where he is serving 30 to 60 years, Sandusky has continued to deny all charges and maintain his innocence (Deppen, 2016). Women are more likely to have histrionic personality disorder than men. It may be the case, however, that more women are diagnosed than men because societal norms about sexual forwardness and attention-seeking make such behavior seem less acceptable (and more aberrant) in women (Berman, 2012).

David Nyberg

According to philosopher David Nyberg (1993), we need the idea of truth as much as anything else. "Truth" in this sense acts as a symbol for certainty—and a belief in certainties comes in handy in an existence so full of question marks. Since we have the almost limitless capacity to imagine more than we are capable of understanding (including our own mortality), truth and its synonyms (e.g., reality, evidence, facts, accuracy) are often the things that make it possible for us to navigate the complexities of daily life. Entire intellectual disciplines depend on these concepts just as much as (if not more than) individuals do. Law, journalism, religion—even the sciences—would be hard-pressed to function without various standards for approximating certainty. In a letter to one of the authors of this textbook, David Nyberg suggested that a more productive rephrasing of the Golden Rule would be: Do unto others as if you were the others. Imagining that we are the other person(s) affected by our behavior might not be the easiest thing to do, but Nyberg believes the new wording accomplishes two important goals: 1. It changes the authority to decide what the right thing is from the doer to the other. 2. It highlights the need for empathetic understanding that is not explicit in the traditional wording. Empathy, Nyberg believes, plays a key role in most ethical and moral decisions. There is no completely adequate way to summarize the ethical guidelines discussed above, but Nilsen (1966) may capture their essence as well as anyone, suggesting that we should approach "truth telling [and lying] with benevolence and justice as our moral commitments, and then apply knowledge and reason to the best of our ability" (p. 34).

reciprocal altruism

Cooperating as a team required our ancestors to build a social system based on trust, honesty, reliability, and mutual aid, a process biologist Robert Trivers and anthropologist Richard Leakey called "reciprocal altruism" (Leakey & Lewin, 1978; Trivers, 1971). While this cooperative form of social organization helped our ancestors survive, it is also easy to see how individuals who wanted to game the system for personal benefit might have been tempted to do so. Undoubtedly, some did just that, but there was a catch—they had to be good at it. If you got caught trying to fool people who put their trust in you, the group punished you in some way.

denial

Denial is the refusal to attend to something. Even though the self-deceiver may be initially conscious of the thing being denied, the ultimate goal of denial is to keep the disagreeable information out of awareness in order to enhance or protect self-esteem. This is largely accomplished by selectively giving attention to confirming data and giving little attention to that which is disconfirming. With selective attention, unpleasant truths may be avoided altogether or simply passed over quickly in order to focus more intently on other things. Sometimes we can spot the possibility of potentially threatening information and preemptively ignore it. This can be done in conversation by changing the topic or by "willful ignorance"—i.e., making it clear to others you do not want to know something. Through denial one may be able to reject an unpleasant reality, but it may be obvious to others. For example: • excessive alcohol consumption by the person who refuses to recognize it • a person who has a terminal illness and refuses to face his or her mortality • lovers who refuse to see the signs that their partners are no longer in love with them

Phineas Gage

Despite all the efforts to nurture honesty in this culture, it appears that nature may also play an important role in our moral development. This was first brought to the attention of the medical community in 1848 when Phineas Gage (QR) suffered a terrible accident in which an explosion hurled an iron bar through his eye, brain, and skull. His speech and other cognitive abilities remained intact and he recovered his health rather quickly. But his personality was irreparably damaged: This pleasant, responsible person who had been popular with his coworkers was transformed into a jackass. After his accident, he had no respect for social conventions, no sense of responsibility, could not be trusted to honor his commitments, and would lie and curse uncontrollably (Macmillan, 2000).

Catch Me if You Can

Frank Abagnale, Jr. (Leonardo DiCaprio) worked as a doctor, a lawyer, and as a co-pilot for a major airline -- all before his 18th birthday. A master of deception, he was also a brilliant forger, whose skill gave him his first real claim to fame: At the age of 17, Frank Abagnale, Jr. became the most successful bank robber in the history of the U.S. FBI Agent Carl Hanratty (Tom Hanks) makes it his prime mission to capture Frank and bring him to justice, but Frank is always one step ahead of him.

dissociation

Dissociation is a psychological process involving the separation and isolation of mental content—psychologically removing the links, connections, or associations to related content. We may be quite aware of our troubling behavior, belief, or emotion, but not willing to acknowledge its relevance to other parts of ourselves with which it is incompatible. When the associations between contrary and inconsistent content are mentally removed, it also removes the need to explain the behavior. People sometimes justify their immoral acts after the fact by pointing to others' immoral deeds. Recent research indicates that when people cannot deny, confess, or compensate for their wrongdoings, they psychologically "distance" themselves from these transgressions, use stricter ethical criteria, and judge other people's immoral behavior more harshly (Barkan, Ayal, Gino, & Ariely, 2012). Distancing the self from evil and demonizing others allow people to view themselves as "ultra-moral" and lessen the tension elicited by a "one-time" slip.

obsessive-compulsive personality disorder

Distinct from the anxiety disorder that we're more familiar with (OCD), obsessive-compulsive personality disorder (OCPD) is characterized by a state of mind that is preoccupied with orderliness, rules, matters of right and wrong, and interpersonal control at the expense of flexibility, openness, and efficiency (Bressert, 2017). Dogged perfectionists, these people are often kept from seeing the big picture by the very things that occupy their attention. Because emotions are held in check, personal relationships may lack the kind of closeness that free-flowing feelings may provide. This person's obsession with perfection may deter success in some occupations and facilitate it in others. However, the amount of time and effort dedicated to work may be at the expense of leisure-time activities and friendship maintenance. Compared to the lies told by those with other personality disorders, those told by people with OCPD can seem fairly benign. If they resort to deceptive tactics, it is in an effort to protect their secrets, preserve their independence, maintain control of a situation, or perhaps to support their own self-deception. Persons with OCPD may be especially prone to lies of omission, wherein important information is never shared with others because "it's a secret" and "you never asked." As Sullivan (2001) notes, they consider themselves extremely honest simply because they rarely resort to bald-faced lies. This makes them true "masters at deception" and can confound the people around them (p. 155). Another possible source of deception for those with OCPD is their extreme difficulty in admitting fault, which stems from their overpowering need to always be in the right and in control (Hammond, 2017). Men are twice as likely as women to be diagnosed with OCPD.

Frank Abagnale

Frank Abagnale was another remarkably gifted imposter whose exploits also led to the Hollywood blockbuster Catch Me If You Can (2002). He masqueraded as an airline pilot, pediatrician, hospital administrator, and lawyer (Abagnale, 1980). He wrote about 2.5 million dollars in bad checks, and served time in prison in France, Sweden, and the United States. Abagnale was smart enough to pass bar exams and to convince the United States government to cut his prison time from 12 years to 5. He did that by convincing law enforcement agencies that his knowledge and experience would greatly assist them in catching swindlers and preventing fraud. Running away from an unhappy home life at 16, Abagnale learned various dishonest ways to survive. He needed to be older in order to get work, so he changed the date on his driver's license. He needed money, so he put his checking account number on other account holders' bank deposit slips. When people filled them out and turned them in, their money went to his account. He used his father's credit card to buy four tires for his car. Then, he told the seller that he would sell them back to him for half of what he just paid—as long as it was in cash. For two years, he impersonated an airline pilot so he could catch free rides to places all over the world. He would stay in hotels where airline crews stayed and where the airlines picked up the bill. To carry out this imposture, he convinced an airline representative that dry cleaners had lost his uniform and he needed a new one. He was also able to get a fake identity card made by posing as a person wanting to do business with the company that made airline identity cards.

high stake lies

High-stakes lies are often told to avoid being punished for a misdeed. Lying is especially likely to occur when the liar perceives that the punishment for telling the truth will be as great as the punishment for lying. With low-stakes lies, you risk little if caught but also gain little if successful. Not so with high-stakes lies, which are likely to have strong negative consequences if uncovered, but carry the potential for considerable gain if they remain undetected. Because the stakes are high, many researchers believe the liar's observable behavior and language are more likely to manifest the cognitive and emotional strain that often accompanies these forms of deception. High-stakes lies require more thinking than low-stakes lies, as they are presumed to be much more cognitively complex (Vrij, Fisher, & Blank, 2017). When cognition is stressed in this way, we may see (and hear) liars telling stories that are inconsistent and delivered with hesitation, or that seem too rehearsed or overly awkward (Blair, Reimer, & Levine, 2018; Zuckerman, DePaulo, & Rosenthal, 1981). The underlying belief regarding these signs of deception based on cognitive difficulty and inappropriate emotions is that they will distinguish liars from truth tellers when the stakes are high. In order to determine the validity of such a critical assumption, researchers have endeavored to match real-world, high-stakes conditions in controlled laboratory settings and then compare the behaviors of people who are telling the truth with the behaviors of those who are lying. It's not a particularly easy thing to do, so it's worth examining approaches to this type of research.

perception of effect/consequence

If we suspect a lie has occurred, part of our assessment is focused on the effects or consequences of the behavior. Who was affected and in what ways? The answers to these questions may alter the extent to which we perceive the behavior as a lie and greatly affect our perceptions of the kind of sanctioning deserved. We don't like or trust a person who tells us a lot of lies. At the same time, not all lies are perceived as equally harmful (Tyler, Feldman, & Reichert, 2006). We attribute less harm to (and reduce sanctions for) lies we perceive as: • benefitting another person (not the liar) • resulting in positive consequences • producing effects that were trivial rather than consequential • producing short- rather than long-term effects • affecting few rather than many • occurring (along with any consequences) a long time ago • acknowledged (along with the consequences) by the liar Sometimes lies have multiple audiences with effects that are different for each audience. For example, a journalist writes a newspaper story about breast cancer. But the fact that he or she used a gripping, yet fictional, case study to make the story vivid comes to light. When this occurs, the consequences for the author and newspaper may be perceived by readers in very different ways (see Chapter 12). Lies directed to a large audience can be complicated, and the ways in which they are perceived by audiences can be equally complex.

Eadweard Muybridge

In 1874, French astronomer Pierre Janssen captured a series of photos of the transit of Venus across the Sun and was able to put them together in cinematic style. In 1878, Eadweard Muybridge took a series of photographs that could be put together in animation style to show a horse in motion. By the late 1880s and early 1890s, Louis Le Prince, Thomas Edison, and the Lumière brothers all developed separate (and competing) techniques for producing motion pictures. Edison prevailed and built the first film studio in 1893.

Cottingley Fairies

In 1917, two girls in England were able to pull off a photographic hoax that ended up gaining public notoriety because it duped even Sir Arthur Conan Doyle, creator of the otherwise unfoolable detective Sherlock Holmes. Originally a joke aimed at their father, the photos of the "Cottingley fairies" were created using simple staging—but the women didn't admit the hoax until 1983. At a recent auction, the images were expected to fetch nearly £70,000 (Press Association, 2019) (QR).

prepackaged news

In 1983, Jamieson and Campbell made an observation that might easily have been made today: "There is considerable evidence of the power of government to manipulate press coverage. Government agencies spend huge sums of money to disseminate their messages through the mass media" (p. 103). This can be done through news "leaks" or newspaper editorials, but a controversial method that briefly gained popularity in the mid-2000s was the video news release (VNR). These professionally produced video segments were designed to present a particular point of view while appearing to be just another news story produced by a television network or local station. Thousands of VNRs were distributed and broadcast—many without acknowledgment of the government's role in their production (Barstow & Stein, 2005; Nelson & Park, 2014). They were typically produced by public relations firms hired by federal agencies, although some federal agencies produced their own. Despite the fact that many messages from leaders emanate from sources other than their own mouths, presidential candidates and presidents do speak directly to the public on occasion—e.g., presidential debates, press conferences, and formal televised speeches. But rehearsals and scripting of behavior are very much at play in these appearances, so the behaviors typically associated with liars in more spontaneous contexts (see Chapter 7) may not be useful indicators of deception. When leaders have such a high degree of control over the dissemination and delivery of their messages, the methods of deception detection need to be adapted accordingly.

Edward Daily

In a ruse that began in 1985 and lasted for 15 years, Edward Daily went to extraordinary lengths to create his make-believe military life. He did in fact serve during the Korean War, but spent all of his time well behind the front lines in a maintenance unit. After the war, however, he forged documents that praised him for commanding his troops under fire, bravely rescuing a fellow soldier, and spending time as a prisoner of war. These imaginary actions served as the basis for creating fraudulent Silver Star and Distinguished Service Cross citations. When a 1973 fire destroyed some Army records, veterans were contacted and asked to provide the lost information. This prompted Daily to forge his name and photo on an original photo of soldiers from the Seventh Cavalry Regiment, a unit that had seen a lot of combat during the Korean War. In the ensuing years, he insinuated himself into reunions and trips (including one to South Korea), gathered up stories from men who'd actually served in the regiment, added his own, and published three books (Moss, 2000). By the time his sham was uncovered, he had convinced several men in that regiment that he had actually served in combat with them (Ellison, 2000). Over the years, he filed for benefits based on this false record of service, receiving almost $325,000 in compensation and $88,000 in medical benefits (Marquis, 2002). The New York Times archive contains a detailed account of the Edward Daily saga (QR). Beginning with persons like Edward Daily, a rash of military imposeurs in the 2000s prompted the U.S. Congress to pass two versions of the Stolen Valor Act (2005 and 2013). This law makes it a crime for someone to falsely claim having received any of a series of military decorations or awards including the Purple Heart, Silver Star, and Medal of Honor. The penalty for violating this law is a fine of up to $100,000 and up to a year in prison (Yarnell, 2013).

psychopath

In contrast, psychopaths often have disarming or charming personalities. While they, too, find it difficult to form emotional attachments to individuals, psychopaths find it easy to gain people's trust, which they then use for manipulation. Psychopaths are often well-educated and hold steady jobs. Some are so good at manipulation that they have families and other long-term relationships without those around them ever suspecting their true malignant nature. When they commit crimes, they tend to plan them out in a calm, cool, and meticulous manner. Many notorious and prolific serial killers such as Ted Bundy, Jeffrey Dahmer, John Wayne Gacy, and Dennis Rader have been diagnosed by forensic psychiatrists as psychopaths (Martens, 2014). If all this reminds you of Machiavellianism (discussed in Chapter 3), you see the same connections observed by McHoskey, Worzel, and Szyarto (1998). Based on their research, they maintain that psychopathy and Machiavellianism are essentially the same personality construct. Lying is an addiction for both sociopaths and psychopaths. Lies are told frequently, effortlessly, and without much guilt or anxiety. These pathological liars may even find it difficult to understand why others value truth, especially when it hurts to tell it. Because they practice lying so often, they can respond effectively to the most challenging confrontations to their credibility with considerable aplomb. They may occasionally admit to a lie or promise to change their behavior, but be careful not to get fooled—normally this is a tactic to establish trust for a future lie (Cleckley, 1982). The vast majority of those diagnosed with antisocial personality disorder are men.

groupthink

Interaction with a group can also lead to self-deception on the part of all members of the group. Janis (1983) famously called this phenomenon groupthink. Groupthink occurs when the members are so intent on preserving agreement that: • they fail to seriously consider alternative courses of action • dissent and controversy are unwelcome • assertions associated with group goals go unchallenged • critical thinking is replaced by "right" thinking All this is facilitated when the leader makes an early declaration of what he or she prefers. Janis points out that even when groups are confronted with evidence demonstrating that their policy isn't working or is too risky, they tend to persist in following their original course of action. Groupthink is more likely to occur within certain types of policy-making groups, within certain types of organizational structures, and within certain situational constraints ('t Hart, Stern, & Sundelius, 1997). In every case, it is more likely when group members all share the same strong group identification: 1. The group overestimates its power and morality. It believes it is invulnerable and this belief leads to excessive optimism and extreme risk-taking. The group's inherent morality is never questioned. 2. The group exhibits closed-mindedness. Information that challenges the group's goals and decisions is collectively discounted or rationalized. Enemy leaders are stereotyped as too evil, weak, or stupid to deal with. 3. The group exerts various kinds of pressure to ensure uniformity. Individual members censor (i.e., remain quiet about) their own counter-arguments, leading others to assume that silence means consent. When a member does voice a strong counter-argument, other members make it clear that this behavior is not consistent with being a loyal group member.

bullshit

Liars are very concerned about the possible impact of a target learning the truth. As a result, they make a special effort to hide it or guide targets away from it. Bullshitters, on the other hand, have little regard for the truth—it just doesn't concern them. Frankfurt (2005, p. 55) put it this way: "It is impossible to lie unless you think you know the truth. Producing bullshit requires no such conviction." From this perspective: • Bullshitters may say things that are accurate or they may say things that are inaccurate, but it doesn't matter either way as accuracy is not the issue for them. They are not trying to hide the truth. • Instead, they are either trying to hide their lack of concern for the truth or just outright illustrating it through their banter. • Bullshitters are not concerned with where they find information or whether it is true as long as it supports their overall purpose. They may sometimes be accused of outright lying, but reactions to bullshit are often benign. • People who participate in "bull sessions" typically suspend their concern for factual accuracy as well as their certainty that the speaker really believes what he or she is saying. Orators, politicians, and stand-up comedians frequently rely on this model, and most of the time audiences readily accept it. Bullshitting might be thought of as the original virtual reality—a way for people to share their imaginations with others. Bullshitting is widely practiced (Penny, 2005). Frankfurt (2005) offers an explanation as to why: Bullshit is unavoidable whenever circumstances require someone to talk without knowing what he is talking about. Thus the production of bullshit is stimulated whenever a person's obligations or opportunities to speak about some topic exceed his knowledge of the facts that are relevant to that topic.
This discrepancy is common in public life, where people are frequently impelled—whether by their own propensities or by the demands of others—to speak extensively about matters of which they are to some degree ignorant. (p. 63) Larson's (2006) view of bullshit differs from Frankfurt's. He argues that bullshit often does hide or mask the truth, but it is done in such a way that it isn't a direct contradiction of something that others subscribe to as truth. Wakeham (2017) worries that bullshit interferes with everyday truth-seeking and the solving of social problems. The perfect example is the paradox you may experience as you read this chapter to prepare for class or an assignment while simultaneously enjoying learning new information. Bullshit is woven heavily into our lives.

flattery / ingratiation

Low-stakes lies are also used for false praise. We sometimes give compliments we don't think are deserved in order to make the target feel good. This goodwill, in turn, may make them feel obligated to help us in the future, something that Jones (1964) calls a "subversive masquerade," and its fundamental insidiousness is apparent when one considers the basic goals of the ingratiator: 1. Locate a target who can provide benefits (self-serving) 2. Identify what the target needs/wants to hear (scheming) 3. Determine how to satisfy the target's wants/needs in ways that will be believed (deception) or prompt the target to act in ways that benefit the ingratiator (manipulation). Forms of false praise like flattery and ingratiation are widely condemned (think of terms like ass-kisser and brown-noser). Why, then, is it so common? Gordon (1996) suggests that the main reason is simple—it works. False praise (especially giving undeserved compliments and appearing to agree with another person's opinion) can be very effective. This is because we tend to like (and thus be inclined favorably toward) people who: • appear to like us • seem to agree with us • say they see good things in our behavior and/or thinking • ask our opinion and advice

imposter

Many of us feel like imposters, but we are not. The phenomenon known as imposter's syndrome is very different from the behavior of people we call imposters. People who experience imposter's syndrome feel like they are imposters, but they aren't. Instead, they are high-achieving individuals who unfairly see themselves as frauds. They believe their accomplishments have been obtained because they are physically attractive, likeable, lucky, or any reason other than their own talents. In fact, they often have a low self-concept and worry that someone will reveal the sham they've been carrying on. If you've ever felt like this, you're not alone. Even highly successful actors are especially prone to feeling as though they don't deserve their success, from Tom Hanks to Tina Fey to Meryl Streep (Simon, 2017). Actual imposters also worry about whether they are fooling others, but it has nothing to do with an ill-gotten persona. It is because they have purposefully enacted a false persona designed to fool others, and they worry about getting caught. The behavior of an imposter is normally far more complex than that of an imposeur. Imposters engage in longer, more elaborate deceptions. They assume different names and different identities. Deception for an imposter is a part of a lifestyle. The good ones are confident, quick learners, skilled communicators, and people who know how to effectively enact the roles they choose to play. As noted in the following cases of famous imposters, there may be many reasons behind their imposture, for example: • The need to live a life other than the one they have • The need to obtain something they want • The need to prove something to themselves or someone else • The need to gather information about an enemy

St. Augustine

Martin Luther notwithstanding, allowing for exceptions was not the position taken by other well-known Christian theologians, many of whom declared that no lie was permitted in any situation. Believers were flatly forbidden from lying (despite the example of Jesus himself deliberately deceiving two of his followers in Luke 24). In his treatise De Mendacio ("On Lying"), St. Augustine (d. 430) famously argued that lying infects personality and destroys integrity (Muldowney, 2002). Many subsequent and influential theologians embraced Augustine's absolutist stance, including St. Thomas Aquinas (d. 1274) and John Wesley (d. 1791), as well as moral philosopher Immanuel Kant (d. 1804). But congregants often found it difficult, if not impossible, to comply with decrees that forbade lying under any circumstances, so official "loopholes" were developed (see Chapter 3). This "never" position has no exceptions—lying is always wrong. Good intentions or the absence of harm don't matter either. Even lying to save an innocent life is wrong. Aside from the practitioners of "radical honesty" (Blanton, 2005) it is difficult to find people today who are willing to publicly advocate this position, but several well-known figures from world history did. Augustine (354-430), Thomas Aquinas (1225-1274), John Wesley (1703-1791), and Immanuel Kant (1724-1804) all believed it was never right to lie (Bok, 1978). It should be noted, however, that even though Augustine and Aquinas believed that lying was always wrong and never justified, some lies could still be forgiven by God (BBC, n.d.). For these absolutist theologians and philosophers, the justification for believing it is never right to lie rests primarily on three arguments: 1. Lies, by their very nature, disrespect humanity and, like a disease, infect the liar's character and ruin his or her integrity. 2. Lying violates religious precepts. It is a sin to lie. 3. Lying begets lying.
If you lie about something, you'll be tempted to lie about it again and you'll have to tell additional lies to avoid detection. These lies, in turn, lead you to lie about other things. Others will match your lies with their own and eventually society will become utterly dysfunctional. In reality, this absolute prohibition has never been one that people could follow without provisions that excused or redefined their behaviors, as a closer look at its three supporting arguments will show.

guilty knowledge test

More commonly known as the Guilty Knowledge Test (GKT), this polygraph exam technique is based on the assumption that the perpetrator of a crime will know things about the crime that innocent suspects will not. For example, assume the victim of a crime had been stabbed to death with a knife. Using the GKT technique, four knives are shown, one at a time, and the suspect is asked if the knife displayed is the one used in the killing. The suspect is instructed to say "no" each time, but the assumption is that the killer will show more physiological arousal when the knife actually used in the killing is displayed. The arousal level for innocent suspects, however, should be similar for each of the knives (Lykken, 1998). It is best if the examiner does not know which alternative (in this case, which knife) is associated with the crime so that bias is not interjected into the way the questions are asked. A preliminary polygraph exam with people who are known to be innocent of the crime can provide insight on the equality of the alternatives and the extent to which any of them may inadvertently induce arousal in an innocent person. For criminal cases, the Guilty Knowledge Test has advantages over the Control Question Test, but is not without its own problems. For example, a guilty person may not be aware of certain aspects of the crime or crime scene that might otherwise seem obvious to investigators. The perpetrator may not have perceptions and memories that investigators expect him or her to. On the other hand, an innocent person may have knowledge of the crime scene but not be the person who committed the crime. Guilty knowledge tests are not always easy to construct because they depend on using information known only to the guilty person and, in many cases, the details of a crime are already widely known because of media publicity.

placebo

One widely used form of deception that is justified on utilitarian grounds is the placebo. When testing a new drug or surgical technique, medical researchers try to distinguish its direct benefit to patients' health from the psychological benefit that comes from believing that it will heal them. To do this, clinical studies are conducted in which the actual drug or surgery is administered to some patients while others receive a placebo (a pill containing sugar or gelatin instead of the active drug ingredient or "sham surgery" in which incisions are made in the skin and sewn up, but no invasive procedures are performed; etc.). Placebos in medical research are controversial, whether the deception is authorized or not. However, their use is rationalized in utilitarian terms. According to Miller and colleagues, "in placebo research, participants are not deceived for their own benefit. Rather, they are deceived for the benefit of science and society in general, through the development of generalizable knowledge" (2005, p. 262). Hundreds of studies in the medical research literature document such "placebo effects"—cases when placebos benefit patients' health, sometimes as much as the real drug or surgery helps them (Kirsch, 2013). "Authorized deception" was used in some of these studies, in that patients were told beforehand they could receive the actual medical procedure or the placebo, but weren't told which until after their health outcomes were assessed; in other studies, the use of placebos wasn't revealed to patients until the assessment phase (Miller, Wendler, & Swartzman, 2005).

low-stakes lies

Most of the lies we tell do not have serious consequences. These are called low-stakes lies because there isn't much to be gained if the lie is successful and there isn't much to be lost if the lie fails. Here's what we know about these types of lies: • Some low-stakes lies are an expected part of everyday conversation (polite remarks, for example). • It is true, however, that almost any lie, no matter how trivial it may seem, will offend some people. In addition, any lie that was initially treated as trivial can be transformed into a lie of great consequence when a positive relationship turns sour. • Low-stakes lies are mostly told for one's personal benefit—to improve one's image, to feel better, to protect oneself from embarrassment, hurt, or disapproval, etc. • We tell such lies to everyone, but more often to people we don't know all that well. • These lies occur frequently. In one study, students reported telling these lies once in every three interactions (DePaulo & Kashy, 1998). • Low-stakes lies are often communicated via technology or online because this eliminates some difficulties liars feel when telling the lie in person. Phone calls eliminate visual cues, and texting, e-mail, and social media reduce the presence of the other person while also providing more control over the verbal message. Lying online or through social media (see Tsikerdekis & Zeadally, 2014) may be easier to enact, and it may even make the liar feel less responsible for the lie (Hancock et al., 2004). Low-stakes lies occur in a wide variety of contexts, each of which is worth examining in more detail: • everyday conversation • self-presentation • attracting a romantic partner • flattery and ingratiation • sports, games, and magic • the workplace With most low-stakes lies, the behavioral differences between liars and truth tellers are barely discernible, if at all.
The primary explanation is simple: Such lies involve little stress, little emotion, and the amount of thought involved is minimal. DePaulo et al. (2003) stated, ". . . ordinary people are so practiced, so proficient, and so emotionally unfazed by the telling of untruths that they can be regarded as professional liars" (p. 81). Ekman (2001) echoed this sentiment when he said, "Most liars can fool most people most of the time" (p. 162).

Thank You for Smoking

Nick Naylor (Aaron Eckhart), a lobbyist for big tobacco, finds it difficult to balance his duties defending the dangerous substance with those of being a good role model for his young son. Nick's life gets even more complicated when a liberal senator mounts an anti-smoking campaign that he must counter. Based on the novel by Christopher Buckley.

Ganser syndrome

Now classified as a dissociative disorder, and sometimes called "prison psychosis," Ganser syndrome was first observed in prisoners following their release from solitary confinement. People with this syndrome exhibit short-term episodes of bizarre behavior resembling schizophrenia. Symptoms include confusion, repeated mimicking of vocalizations (echolalia) and movements (echopraxia) of other people, and bizarre conversational interaction. "Approximate" (though potentially absurd) answers are often given in response to straightforward questions (e.g., Question: How many legs does a dog have? Answer: Five). In fact, some consider the primary feature of this illness to be the tendency to give such "near-miss" answers to simple questions (Dieguez, 2018). The subject of ongoing debate within the medical community, the disorder is exceedingly rare—with fewer than 100 cases officially diagnosed (of which about 75% have been male). Approximately 1 in 3 diagnosed with the syndrome had a prior history of mental illness. The vast majority of Ganser sufferers (76%) exhibit no recollection of their symptoms after the episode. What appears to be common to almost all Ganser cases is an individual faced with stress whose ability to cope is compromised by chronic personal problems (alcohol abuse, drug use, etc.) and situational pressures, such as losing a job or being incarcerated (Mendis & Hodgson, 2012).

Patternicity

Our tendency to seek out and interpret new information in a way that confirms our prior beliefs is a form of the "motivated reasoning" phenomenon described earlier in the chapter. This bias is exacerbated by the human tendency to search for meaningful patterns, or "patternicity" (Shermer, 2012). Finding meaningful patterns can be a very good thing if there is actually something systematic, intentional, and/or intelligent causing the patterns to occur, such as: • hearing a radio distress signal being broadcast by a sinking ship • correctly guessing the region where someone grew up based on her accent • identifying a killer based on clues left at a crime scene, etc. But people also find what they believe to be meaningful patterns in meaningless noise: • the Virgin Mary on a piece of toast • extraterrestrial spacecraft in fuzzy pictures of a night sky • satanic messages in rock music played backward Shermer argues that we make these perceptual "errors" because of their relative cost. In our evolutionary history, if our ancestors mistakenly thought a rustle in the grass was a predator when it was really just the wind, it was no big deal. However, if they interpreted the sound as just the wind when it really had been a dangerous predator, this error could be deadly. We lose little or nothing if we think we see or hear something that really isn't there (a false positive), but if we miss something that is there, we might get ambushed or killed.

the role of parents w. children and deception

Parents play an important role in determining how often their children lie, what they lie about, and the ethical framework within which lying is viewed. One way parents teach their children about lying and deception is by the way they respond to their child's deceitful behavior. Since "not telling the truth" may occur for many reasons and have numerous consequences, a variety of responses is needed. When adults vary their responses to a young child (according to the way the child misrepresents reality, the context in which it is done, how often it has occurred, how much harm it causes, and the apparent motive for doing it), the child learns what behavior is permissible and what isn't. This learning process is ongoing, and adult reactions may vary considerably to the same behavior performed by a 4-year-old versus a 14-year-old. Parents are role models for their children, and when children are regularly disappointed in their parents' behavior, receive ineffective supervision, or can't establish a warm parental bond, the probability of their lying increases (Stouthamer-Loeber, 1986; Stouthamer-Loeber & Loeber, 1986; Touhey, 1973). This process works in both directions. Parents who think their adolescent children are engaging in a lot of concealment and lying seem to withdraw more from their children. They are less accepting, less involved, less responsive, and know less about their child's activities and whereabouts (Finkenauer, Frijns, Engels, & Kerkhof, 2005). Needless to say, the way parents respond to lying (their own and their children's) will go a long way in determining how their kids behave. Experts say parents should consider the following guidelines:
1. Adapt responses to the life stage of the child
2. Consider the effects of double standards
3. Try to "struggle visibly"
4. Practice reciprocity
5. Avoid extreme emotional reactions

narcissistic personality disorder

People with narcissistic personality disorder (NPD) are pathologically preoccupied with themselves. They exaggerate their achievements and expect this portrayal to be accepted by others. They require an excessive amount of admiration but have relatively little empathy for those who might provide the veneration they desire. They view themselves as "special" and entitled to special treatment. Ironically, this outwardly self-confident, even arrogant, behavior may mask a low sense of self-esteem. People with NPD may use lies, exaggerations, and half-truths to support the grandiose personalities they have created. This can occur in self-presentations or in the exploitation of others for their own needs. To maintain the brilliant, skilled, attractive, and immeasurably successful selves they have created, narcissists are also skilled at self-deception. Unless they deceive themselves, narcissists aren't able to function very well. Not surprisingly, people with NPD often seek and achieve positions of power and prestige. Rijsenbilt and Commandeur (2013) report evidence indicating an overrepresentation of narcissistic personalities among corporate CEOs who have been prosecuted for financial fraud. Although never formally diagnosed, several psychiatrists have suggested that former investment advisor Bernie Madoff, currently serving a life sentence for perpetrating the largest Ponzi scheme in history, has this disorder.

Brian Williams

Professional journalists who made up stories, plagiarized, invented sources, quotes, and events, or combined elements of several stories also received national media exposure. As a result, reporters from the Washington Post, USA Today, the New York Times, and the New Republic, among others, were publicly and often spectacularly discredited. And when the Governor of New Mexico, the Notre Dame football coach, the Poet Laureate of California, and executives from Oracle, Radio Shack, Bausch & Lomb, and the U.S. Olympic Committee, among others, fabricated information on their résumés, the public took notice. Chief among the discredited journalists was the most popular television news anchor in the United States, Brian Williams of the NBC Nightly News. Beginning in 2014, Williams came under extreme public scrutiny for more than a decade of fibbing about his experiences as an embedded journalist in a combat zone during the Iraq War. A respected news anchor for decades, Williams had publicly claimed he had come under enemy fire while aboard a military helicopter in Iraq in 2003. Others on board with him challenged his account, saying the helicopter they were following had been fired on, not the one they were in. Williams eventually admitted that he "made a mistake," saying "I don't know what screwed up in my mind that caused me to conflate one aircraft with another," and he lost his post as anchor and managing editor of the NBC Nightly News (Farhi, 2015). Further investigation revealed this was not the only time Williams misrepresented his experience in war zones. He also falsely claimed to have flown into Baghdad with the Navy SEALs (who do not "embed" journalists), to have been present at the Brandenburg Gate the night the Berlin Wall fell in 1989, and to have been in Cairo's Tahrir Square during the "Arab Spring" protests of 2011 (Bancoff, 2015).

projection

Projection is a self-defense mechanism that deals with unpleasant and unacknowledged realities by misattributing them to others (e.g., a manager who is unwilling to admit his own incompetence blames his failures on the fact that his employees are incompetent). When the problem is not your own, there is no need to make any changes. Projection can occur in other ways as well. People may misperceive the behavior of another person and project that misperception onto themselves. For example, someone who can't face her own angry outbursts might perceive the angry outbursts of another person as a "controlled response" that makes her feel better about her own behavior. Projection relies on our ability to selectively search for information. If we start with a preferred conclusion ("I am a good manager"), we can often find the little evidence needed to support it in a very short period of time. To reject the preferred conclusion, however, normally requires a lot of evidence and takes much longer.

rearranging

Rearranging takes the visual subjects and rearranges or repositions them in order to tell a different and/or more dramatic story (see Mitchell's The Reconfigured Eye, 1992). For example, parades and golf matches would be much less interesting to the viewer if they were presented in real time and space. Instead, shots taken at different times and/or places become part of a coherent narrative. In interviews and debates, the reaction shots are not always expressions that immediately followed the previous comment. One of the most infamous examples of rearranging occurred in 1982 when editors at National Geographic digitally moved two pyramids closer together to make a better looking image for the cover. Given the magazine's reputation for producing "true" images of nature, this technically minor manipulation was viewed by many as scandalous. It's worth noting that the photographer, who was upset that his iconic image had been altered, had no qualms about paying the men in the picture to ride their camels back and forth until he was able to get the shot he wanted (Nickle, 2017). Photo rearrangement is also used to make politicians look foolish. George W. Bush was once shown reading to a group of schoolchildren, but the photograph was altered so that it appeared he was holding the book upside down (Jaffe, 2002).

retouching

Retouching is usually done to make the visual image look more pristine or to alter perceptions of attractiveness. Among other changes, it is not unusual for images of movie and television personalities, models, public figures, and people appearing in advertisements to have their skin tone altered; the thickness of their eyebrows or hips reduced; their wrinkles, stray hairs, and skin blemishes removed; their teeth and eyes whitened; and their pupils, breasts, and buttocks enlarged. In 2014, Kim Kardashian famously "broke the Internet" with a photo of her nude backside. The public response was overwhelming, but a lot of it had to do with the blatant retouching of her back, waist, and buttocks. There is no shortage of videos and photos showing "before and after" retouches of people. In 2014, a Utah school was accused of selectively removing items from female students' yearbook photos (such as tattoos and nose rings). They also added sleeves and undershirts if students lacked them. The following year, a high school senior attending a private school used Reddit to communicate dismay at administrators for (re)issuing student IDs that had been manipulated in ways that included face thinning, skin smoothing, skin and lip recoloring, and eyebrow recoloring and shaping. Photos of criminals and others who have been missing for years can be retouched in order to show what changes the passing of time has likely brought about. In June 1994, O. J. Simpson's photo appeared on the covers of both Time and Newsweek, but Time's photo was darkened. Time said it was done for dramatic effect, but it was viewed by many readers as an attempt to make Simpson appear more sinister. Retouching is also done with nature scenes (e.g., when an editor wants greener grass or a "cleaner" image of the planet Venus).

inattentional blindness

Scientific studies, however, tell a different story. Sometimes we do not notice large changes in objects and scenes (change blindness), and sometimes we do not even perceive certain highly visible objects in our visual field (inattentional blindness). For example, a driver may fail to notice another car when trying to turn, or a person may fail to see a friend in a movie theater while looking for a seat, even though the friend is waving. Our brain is constantly trying to make sense of an environment filled with a tremendous array of changing stimuli that vary in intensity. As a result, the brain tries to create a meaningful narrative and overlooks those stimuli that don't fit the narrative being created.

borderline personality disorder

Sudden (sometimes intense) mood swings are characteristic of borderline personality disorder. A rational and efficient worker may, for no apparent reason, become unreasonable and irresponsible; an understanding and committed lover may become angry, close-minded, and absent. Their behavior is notoriously hard to predict. These mood swings are often linked to the person's idealization of a job or person that leads to euphoric feelings but also creates unrealistically high expectations. In time, there will be anxiety, frustration, and/or rejection associated with the object of idealization. Disproportionate disappointment and anger (at self and others) may lead to destructive behavior in the form of self-mutilation, substance abuse, or spending money excessively. Impulse control is often a problem for people with a borderline personality. In addition to the types of deception that may be associated with various self-destructive behaviors, people with this disorder often use lies as a weapon to get even with people for not living up to expectations or for disappointments they are believed to have created. Spreading false rumors or filing a fraudulent lawsuit are ways people can get even with those who shattered their dreams. The vast majority of those diagnosed with borderline personality disorder are women, although there is debate as to whether this is equivalent to saying it is more common in women than in men (Sansone & Sansone, 2011).

Suppression

Suppression differs from repression in that it puts information or feelings away for a short time. The memories that have been set aside can then be retrieved and dealt with at a more appropriate or desirable time. One way we can suppress mental content is through a process Wegner (1989) calls "self-distraction." Self-distraction involves thinking about things that will cover and replace the things we don't want to think about: our fears, worries, secrets, or even itches. The distraction occurs because we become immersed in some activity and/or in some thoughts that blot out the unwanted ones. Many have found success in suppressing the feelings and thoughts associated with mild pain through self-distraction (exercise, for example, or binge-watching a favorite show).

pathological liar

The German physician Anton Delbruck (1891) is credited with being the first to describe the concept of pathological lying in patient case studies. He observed that some of his patients told lies so abnormal and disproportionate as to deserve a special psychiatric category he described as pseudologia fantastica (not to be confused with the song by Foster the People). To date, however, there is no consensus among mental health professionals about the definition of pathological lying, although there is general agreement about its core elements (Dike, Baranoski, & Griffith, 2005). Pathological lying is characterized by a long (perhaps lifelong) history of frequent and repeated lying for which no apparent psychological motive or external benefit can be discerned. Although ordinary lies are goal-directed and intended to obtain an external benefit or avoid punishment, pathological lies appear to be without purpose. In some cases, they may even be self-incriminating or damaging, which makes the behavior even more puzzling. It is a pattern of behavior familiar to psychotherapists: excessive lying, easily determined to be false, mostly unhelpful to the liar in any apparent way, and sometimes harmful to the liar, yet told repeatedly over time. Even prominent and successful individuals engage in this pattern. For example, California Superior Court Judge Patrick Couwenberg was removed from office not only for lying in his official capacity (claiming to have academic degrees and military experience he clearly did not have) but also for lying under oath to a commission investigating his behavior. A psychiatric expert witness diagnosed him with pseudologia fantastica and suggested the judge needed treatment (Winton, 2001).

sense of self

The idea of having a unitary self that houses a vast array of thoughts, feelings, and perceptions (some of which are bound to be incompatible and inharmonious) creates an intense pressure for consistency, which explains why certain kinds of self-deception occur. In other words, we occasionally need a mechanism like self-deception to help maintain the complicated and often disorderly jumble of stimuli we call our "self." As a result, this self is more recognizable to us and easier to live with. The inconsistencies that self-deceivers wrestle with, according to Chanowitz and Langer (1985), stem from the fact that we are actually composed of many social selves, each of which seeks coherence according to the standards appropriate to its context. Our work self, our school self, and our home self, for example, have separate qualities, and sometimes we let them operate independently. But we also have the capacity to relate our various social selves together as sub-units or as a whole. Each of us struggles with the social and psychological demands associated with these processes.

perception of intent

The question of whether or not someone intentionally deceived us is often the key factor in determining if we were or were not lied to. At some level, intent trumps everything else, as in these scenarios:
• Someone may have remained intentionally unaware of a situation so that they could technically claim ignorance (the "plausible deniability" defense). If so, then we may still consider them culpable in the sense that they chose to shirk their responsibility, like the proverbial ostrich hiding its head in the sand (which is a myth, by the way; as far as we know, avoiding difficult realities by pre-emptively ignoring them is a strictly human phenomenon).
• It's possible for us to feel deceived despite the fact that the sender neither altered nor omitted important information. In other words, someone could communicate the complete, unvarnished truth to us, but still intend to deceive. For example, suppose you want to prank your friend with an exploding golf ball. Before they take their swing, you say to your friend, "Be careful. That's an exploding golf ball." While you have made a perfectly true statement, the context was such that your friend thought you were kidding and later may accuse you of lying, because he or she perceives that the actual intent of your technically true statement was to mislead.
True statements are also used with deceptive intent when people are caught doing something they don't want to admit doing. For example, when a drugstore employee accuses a man who has just walked out of the store without paying for a candy bar of stealing it, the man replies (either wryly or with feigned anger): "Sure. Even though that's my new Mercedes in the parking lot and I make $250,000 a year, I wasn't willing to drop $1.50 on a candy bar. Okay, Sherlock. You got me. I'm your thief."

relevant / irrelevant technique

The relevant/irrelevant (R/I) technique is the oldest of the polygraph questioning techniques and was primarily used in criminal investigations, but modern polygraphers who are concerned about minimizing scientific inaccuracies generally disavow it. Unfortunately, that hasn't stopped it from still being used as a general screening tool in job interviews and other non-criminal settings (National Research Council, 2003). As the name implies, the R/I technique is supposed to test the examinee's knowledge of whatever the investigation is focusing on (details of a crime, for example) by asking some questions that actually relate to the investigation and some that have nothing at all to do with it. The idea is that someone with knowledge of the actual events will, overall, tend to respond differently to relevant questions than to irrelevant ones. Examples of relevant questions might include "Do you know who committed the crime?" or questions about evidence found at the scene. An irrelevant (or neutral) question can be anything as long as it isn't relevant to the case and may even be completely random (e.g., "Is today Wednesday?"). More importantly, relevant questions are supposed to provoke an emotional response whereas irrelevant questions should not (unless, of course, someone just really has a thing for Wednesdays).

spin

The term "spin" originates from baseball. When the pitcher spins the ball as it is thrown, it curves, making it difficult for the batter to accurately predict where it will cross the plate. Putting a spin on a story simply means that the communicator finds a way to: • make it look like something it isn't (e.g., a loss is recast as a win of some kind) • redirect a target's attention to a particular part of a story to distract from the original narrative (e.g., the numerous ways in which members of the Trump administration sought to discredit the inquiry into Russian election meddling) With spin, the communicator's overarching goal is to redirect the target's thinking in a way that is favorable to his or her point ofview (Jackson & Jamieson, 2007). In its purest form, spin: 1. looks like it is addressing an issue directly (but isn't) 2. is hard to factually discredit 3. uses language that allows room for interpretation so that the spinner can deny lying The spinner is a person who is predisposed to a particular point of view and perceives sev-eral possible interpretations of the information in question, so the spin on the story is not perceived as outright lying. Instead, it is seen as something closer to opinion or commentary (think of right-leaning Fox News and left-leaning MSNBC). Half-truths and refocusing/re-directing are two common ways to spin: • Half-truth: In 1996, President Clinton said he had put 100,000 new police officers on American streets. He had signed a bill authorizing 100,000 new police officers, but at the time of his statement, only about 40,000 were funded and only about 21,000 of them were actually on duty. • Refocusing: President George W. Bush justified the invasion of Iraq by claiming the regime had weapons of mass destruction. When none were found, the war was later justi-fied on the grounds that the United States had removed a tyrant from power and brought democracy to Iraq. 
Statistics can also be used to refocus and redirect thinking (Holmes, 1990; Huff, 1954). Notice how easy it is to argue from the same data that robberies involving a weapon in a particular city have increased either 5% or 100%. One person might argue, "There has been a small increase of 5% in robberies with weapons. In 2017, it was 5% of all robberies and in 2018 it was 10% of all robberies." But another person might say, "There has been a shocking 100% increase in robberies with weapons. In 2017, there were 20 and in 2018 that figure doubled to 40." The data are the same, but the messages (and the agendas behind them) couldn't be more different. When visual graphs or charts showing statistics are distorted in ways that are misleading, it is called statistification.
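The two framings can be reproduced with a few lines of arithmetic. The yearly totals used below (400 robberies in each year) are not stated in the example; they are inferred from the stated figures (20 armed robberies being 5% of the total and 40 being 10%), so treat them as an assumption.

```python
# Same data, two framings (figures from the robbery example; the yearly
# totals of 400 are inferred from the stated 5% and 10% shares).
armed_2017, armed_2018 = 20, 40
total_2017, total_2018 = 400, 400

# Framing 1: change in the share of all robberies (percentage points)
share_2017 = 100 * armed_2017 / total_2017   # 5.0
share_2018 = 100 * armed_2018 / total_2018   # 10.0
point_increase = share_2018 - share_2017     # "a small increase of 5%"

# Framing 2: relative change in the raw count
percent_increase = 100 * (armed_2018 - armed_2017) / armed_2017  # "a shocking 100% increase"

print(point_increase, percent_increase)   # 5.0 100.0
```

The trick is that the first speaker quietly measures percentage points while the second measures percent change, and neither framing is arithmetically false.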

Ethical Guidelines

To this point, we've examined multiple answers to the question of whether lying can ever be ethically justified. Without a doubt, each reader of this book will have his/her own preferences, which is why ethics gets so messy so quickly. At the very least, we can try to agree on the premise that learning how to deal ethically with subjects like honesty and dishonesty is an ongoing process. If possible, we might also agree that no single set of rules can be expected to apply equally to all situations at all times. One ethicist put it this way: "As we practice resolving dilemmas we find ethics to be less a goal than a pathway, less a destination than a trip, less an inoculation than a process" (Kidder, 1995). This doesn't mean that at some point in life we might not need (or find useful) certain principles to effectively guide our behavior in certain situations. They may serve us well, for example, when we don't have time to analyze various aspects of a situation. But the idea that practicing ethical behavior is a journey rather than a destination means that if life reveals a situation in which broad principles do not seem to be useful, we are willing to consider alternative behaviors. It is not that a person taking this approach doesn't have a moral stance, only that he or she recognizes that their current stance may be improved. Because we are human beings we should not expect to live a mistake-free life. But if we can "learn to learn" from our mistakes, then we might at times manage to become better versions of ourselves. We don't need to be good all the time to be a good person, but we need to struggle toward that goal. What follows are some tips to help guide that struggle.

truth we are told

Very often we rely on other people as our source of truth. These truths may come through personal communication (face-to-face or electronic, such as e-mail or messaging), but are much more likely to come from various mediated sources, from television and radio to Internet sites and smartphone apps (and the occasional printed book, magazine, or newspaper). They become powerful sources of what we believe to be true, firmly anchored to the reality perceived by other people that we have chosen to believe. And because we do not have the time to personally investigate and verify the millions of bits of information we experience as the reality of our everyday existence, relying on truth from other sources is a pragmatic necessity as much as it is a choice. Modern society is complex, and the amount of information we are expected to process on a daily basis is huge. In fact, most of it is out of reach; by one estimate, more data are now being created every couple of years than existed in all of human history (Marr, 2015). We simply do not have the time or energy to investigate the truth of everything we hear or see. So we take it on faith that what we are told by others is usually correct, a natural human tendency known as the "truth bias" (Levine, 2014).

deleting

Visual images can also be altered by taking something away from the original: a person, an object, a sound, etc. Cherished photos of yourself that also happen to include hated in-laws or an ex-partner can be realistically preserved with the offending parties removed. All prisoners in the New York State prison system are required to have their photograph taken when they are clean shaven. But Rabbi Shlomo Helbrans, who was sentenced to 12 years for kidnapping, requested an exemption on religious grounds. Eventually, the case was settled by sending a photo of the bearded Rabbi to a company that digitally eliminated the beard (James, 1994). In Stalin's Russia, the deletion of personal and political rivals from photos was a common practice (King, 1997), but it has been done by other political leaders, including Hitler and China's Mao Zedong. The hope was that these deletions would permanently alter the historical record of the country. Benito Mussolini had a handler removed from a photo so that he and his sword could look more heroic posing on the handler's horse. The iconic photo of the Kent State Massacre by John Filo shows a woman mourning over a body lying face down in the street. In the original (Google it), a fence post was very awkwardly positioned behind her head. The distracting piece of hardware was conveniently deleted before the image was published in numerous magazines. Filo won the Pulitzer for the original, and the image is considered one of the iconic photos of 20th-century American history.

awareness level

We live in a culture that reveres the idea that we are fully aware of everything we think and do. This belief provides a feeling of self-assurance and control. In turn, it makes us fully responsible for what we do. There's only one problem: a vast amount of mental activity takes place without conscious awareness (Lynch, 2014; Wegner, 2003). In fact, this ability to deal with thoughts at different levels of awareness can be quite functional. We often manage life's trials and tribulations by striving not to know certain things. It enables us to cope with uncertainty, anxiety, fear, confusion, and powerlessness. And it facilitates self-deception. Awareness should be conceptualized as a series of gradations, from fully conscious to completely unconscious. The more aware we are of something, the harder it is to self-deceive about it, and vice versa. In this sense, self-deception is the flip side of self-awareness. If we were fully conscious of everything in our mind, different forms of self-deception would be difficult, if not impossible, to enact. Self-deception requires a mental environment in which thoughts move among varying states of consciousness. Our ability to access thoughts and memories can change over time; some thoughts are further removed from our consciousness, and those that are well hidden are less accessible to us, though they remain in our minds. We may, for example, be highly critical of some behavior exhibited by another person while being blissfully unaware that we exhibit the same behavior. Over time, however, we may develop an awareness that we, too, act the same way. Self-deception and self-awareness often occur gradually (Rorty, 1996). For example, spouses may continue having sex with partners they no longer love if they are having difficulty admitting to themselves how they feel. A religious person who has gradually lost his or her faith over time may continue to attend services.

truth we feel

We often rely on emotions as a source of truth. These are visceral, instinctive, or intuitive responses arrived at without the use of reason or logic. When this is the case, we "know" something is true because it "feels" true, even though the specific causes of our feelings may be difficult to identify or explain. And it may have little to do with the available information or apparent facts (a phenomenon Stephen Colbert sarcastically refers to as "truthiness"). When people observe and process the nonverbal behavior of others, for example, they sometimes accurately describe the messages being communicated but are unable to verbally articulate what cues they used or how they went about constructing inferences about the other person's behavior. They may say, for example, that their judgment was based on a "hunch" or "just a feeling" (Smith, Archer & Costanzo, 1991). Interestingly, Katkin, Wiens, and Öhman (2001) found in an experiment that some people are better at detecting their own internal visceral reactions to stimuli than others (i.e., there are people who have an increased ability to rely on their "gut feelings"). The truth we experience through feelings can occur in a variety of ways. Feelings may be a source of truth for an abstract belief (such as religious faith) or something very tangible (believing your doggo is the best pup in the world). They may precede a thought or they might follow it. While considering solutions to a problem, for example, it is not uncommon for business executives to report that the intuition associated with particular alternatives helped them determine which one to choose (the correct one, presumably; Sadler-Smith & Shefy, 2004). Whatever the source or the process, the truth we feel, at least initially, is uniquely owned by the person experiencing it.

executive functions

What psychologists call "executive functions" are higher-order cognitive skills that emerge in late infancy and continue developing throughout childhood. Three of these functions are critical to the development of deceptive ability:
• Inhibitory control is the ability to suppress interfering thoughts or actions (Carlson, Moses, & Breton, 2002). To successfully mislead someone, children must not only utter false information that differs from reality but also conceal the true information it contradicts. To maintain the lie, they must inhibit thoughts and statements contrary to the lie and remember the contents of the lie, at least in the short term.
• Working memory is a system for temporarily holding and processing information, for whatever purposes or tasks are at hand.
• Planning is also required to maintain a lie in that liars must prepare the contents of a lie prior to uttering it in order to appear convincing to their audience.
Carlson, Moses, and Hix (1998) found that preschool children who experience difficulty with learning tasks that require a high level of inhibitory control, working memory, and planning also have difficulty with deception tasks. Clemens et al. (2010) argue that individual differences in deceptive skill are strongly related to one's ability to regulate behavior and handle the increase in cognitive load a lie creates. Thus, children's maturing executive functions seem to facilitate their increasing success at lie-telling.

Neo-Impressionism

Neo-impressionism is characterized by the use of the divisionist technique (often popularly but incorrectly called pointillism, a term Paul Signac repudiated). Divisionism attempted to put impressionist painting of light and color on a scientific basis by using an optical mixture of colors.

