Unit 4: Chapters 10-11

3. Stereotypes Justify the Status Quo

Stereotypes don't just justify our emotions and behavior; they also justify the status quo. Evidence suggests that the stereotypes we have of groups are often ambivalent—including positive traits alongside negative traits. High-status groups that are assumed to be competent are also more likely to be stereotyped as cold. Lower-status groups might be stereotyped as being less intelligent or successful but are also often seen as warm and friendly. According to system justification theory, these ambivalent stereotypes help maintain the status quo by justifying the way things are (Jost & Banaji, 1994). In some ways, this is the flip side of social role theory: We not only infer people's traits from the roles they enact, we also assert that they should be in those roles because they have the traits those roles require. System justification theory suggests that those who have high status in a society will often come to view those with lower status as being less intelligent and industrious than their own group to justify their own superior economic position. If advantaged members of a society didn't generate such justifications, they would have to admit that deep injustices exist that they should all be working to rectify; such rectification would alter the status quo, with advantaged groups potentially losing their advantages and everyone experiencing upheaval.

Although higher-status people show a stronger tendency to justify the status quo, those who are disadvantaged sometimes do so as well. For example, when people are made to feel that the stability of their nation is in question, members of both lower- and higher-status ethnic groups more strongly endorse the belief that the higher-status group is relatively more competent and that the lower-status group is relatively warmer (Jost et al., 2005). How do these complementary stereotypes play into the motivation to justify existing status differences among groups? By favoring ambivalent stereotypes, groups that are disadvantaged in terms of their status in society can still pride themselves on their warmth. With that positive stereotype to hold on to, the negative stereotypes don't seem so bad. Similarly, groups with power and status can assuage any guilt by acknowledging the warmth of those with lower status.

We see this most strikingly with gender. Modern theories of gender bias point to ambivalent sexism (Glick & Fiske, 1996), which pairs hostile beliefs about women (that women are incompetent or push too hard for gender equality) with benevolent beliefs (that women are pure and more compassionate than men). Although women primed to think about hostile sexism are motivated to fight for greater gender equality, reminders of benevolent sexism seem only to encourage their support for the status quo (Becker & Wright, 2011). Research also suggests that people prefer outgroup members to conform to prevailing stereotypes. Women who are assertive and direct are often judged negatively, whereas the same actions by men lead to admiration (Rudman, 1998). Terror management researchers (Schimel et al., 1999) have shown that reminding people of their mortality, which motivates people to want their worldviews upheld, leads White heterosexual Americans to prefer Germans, African Americans, and gay men who conform to prevailing American stereotypes of these groups over those who behave counterstereotypically.

Categorization

The categories we attend to most readily for other people are those, such as gender and age, that signal how we should treat one another (Fiske & Taylor, 2008; Kurzban et al., 2001). Because telling friend from foe was a life-or-death decision for our evolutionary ancestors, our brains have also adapted to form these categorizations using whatever cues will quickly do the job. We may be particularly likely to categorize an individual as an ingrouper or outgrouper by relying on cues such as accent, mode of dress, and adornment, along with other physical features, such as skin tone, body shape, and hair color. But our social categories are flexible enough to be cued by a host of things. We identify sports teams using different-colored uniforms and can guess sexual orientation based on how a person walks (Johnson & Tassinary, 2005; Johnson et al., 2007). The categorization process isn't entirely objective. A perceiver's stereotypes and prejudices can shape how someone is categorized, especially when a person's group identity is ambiguous (Freeman & Johnson, 2016). For example, to the degree that people tend to stereotypically associate young Black men with anger, they are quicker to categorize an angry face as being Black if the person's race is rather ambiguous. Mixed-race individuals are often categorized as being members of a minority group even when they are half White (Blascovich et al., 1997; Halberstadt et al., 2011; Ho et al., 2011).

Once we categorize a person as an outgroup member, we tend to view that person in stereotypic ways. One reason this happens is that the very act of categorizing makes us more likely to assume that all members of the outgroup category are alike. Merely by categorizing people into outgroups, we tend to view those individuals as being more similar to each other—that is, more homogeneous—than they really are and as being more similar to each other than ingroup members are to each other (Linville et al., 1989; Park & Rothbart, 1982; Quattrone, 1986). This tendency is called the outgroup homogeneity effect. If you've ever heard someone say "Those people are all alike," you have probably witnessed this effect.

The primary explanation for the outgroup homogeneity effect is that we are very familiar with members of our own group and therefore tend to see them as unique individuals. We have less detailed knowledge about members of outgroups, so it's easier simply to assume that they are all alike. In addition, we often know outgroup members only in a particular context or role. For example, a suburban White American might know African Americans mainly as sports figures, hip-hop artists, and criminals on TV. This role-restricted knowledge also encourages viewing outgroup members as being less diverse than they actually are. In one demonstration of the outgroup homogeneity effect, psychologists (Quattrone & Jones, 1980) asked university students to watch a video of a student from the participant's own university or from a different university making a decision (e.g., between listening to rock or classical music). The participants were then asked to estimate what percentage of people from that person's university would make the same decision. They estimated that a higher percentage of the person's fellow students would have the same musical preference when the person was from a different university than when the person was from the participants' own university.
So when you assume that "they are all alike," you can infer that what one likes, they all like, but you probably also like to believe that "we" are a diverse assortment of unique individuals. The outgroup homogeneity effect not only extends to the inferences we make about a person's attitudes but also leads to very real perceptual confusions. We actually do see outgroup members as looking more similar to each other, a phenomenon that can have profound consequences for the accuracy of eyewitness accounts (Wells et al., 2006). This type of perceptual bias was first illustrated in a series of studies in which participant ingroup and outgroup members interacted in a group discussion (Taylor et al., 1978). When later asked to remember who said what—that is, to match a comment with a person—the participants made an interesting pattern of errors. They were more accurate at remembering ingroup statements than outgroup statements. But more telling, they were likelier to mistake one outgroup member for another. These confusions happen when we group other people together on the basis of visible categories, such as gender, race, age, skin tone, and attractiveness, but they even happen when we group others on the basis of nonvisible categories, such as sexual orientation and attitudes (e.g., Klauer & Wegener, 1998; van Knippenberg & Dijksterhuis, 2000).
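
For readers who like to see this logic in concrete numbers, the sketch below (in Python) walks through the kind of comparison Quattrone and Jones made. The values are entirely hypothetical, invented only to illustrate the pattern; they are not data from the original study.

```python
# Hypothetical illustration of the outgroup homogeneity effect
# (numbers are invented for demonstration, not data from Quattrone & Jones, 1980).
from statistics import mean

# Each value is one participant's estimate of the percentage of the target
# student's classmates who would make the same choice the target made.
ingroup_target_estimates = [52, 58, 49, 61, 55, 50]    # target from own university
outgroup_target_estimates = [68, 72, 65, 70, 74, 66]   # target from the other university

ingroup_mean = mean(ingroup_target_estimates)
outgroup_mean = mean(outgroup_target_estimates)

print(f"Mean estimate, ingroup target:  {ingroup_mean:.1f}%")
print(f"Mean estimate, outgroup target: {outgroup_mean:.1f}%")

# A positive difference is the outgroup homogeneity signature:
# "they" are assumed to be more alike than "we" are.
print(f"Homogeneity difference: {outgroup_mean - ingroup_mean:.1f} percentage points")
```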

stigma consciousness

The expectation of being perceived by other people, particularly those in the majority group, in terms of one's group membership.

A Dual Process View of Prejudice

The issue of controlling prejudice takes us back to the dual process approach (Devine, 1989; Fazio, 1990), first introduced in chapter 3. In Process 1, stereotypes and biased attitudes are brought to mind quickly and automatically (through a reflexive or experiential process). In Process 2, people employ reflective or cognitive processes to regulate or control the degree to which those thoughts and attitudes affect their behavior and judgment. Because prejudicial thoughts are often reinforced by a long history of socialization and cues in one's environment, they can come to mind easily, but this does not mean they cannot be controlled. For one thing, controlling one's biases requires an awareness that those biases are present, and some people are more aware of their biases than are others (Perry et al., 2015). Education can also raise awareness. Although interventions are mostly unsuccessful at changing people's implicit biases (Forscher et al., 2019; Lai et al., 2016), research is beginning to show that teaching people cognitive strategies to control their biases can improve their attitudes and intentions (Burns et al., 2017; Devine et al., 2017). Such education efforts can only be successful, however, if individuals are motivated to control their biases, which isn't always the case (Forscher et al., 2015). Even when people are motivated, their motivations can stem from different goals. When a motivation to avoid being biased stems from an internalized goal of being nonprejudiced, people can proactively keep implicit biases from influencing their decisions and judgment (Amodio & Swencionis, 2018). In many cases, though, the motivation to control prejudice stems from the perception of external pressures, such as the pressure to be politically correct or to avoid making others angry (Plant & Devine, 1998, 2009). Those who have little internal motivation to control their biases but feel externally coerced to keep quiet end up being resentful about having to censor themselves and have a stronger motivation to express their prejudice (Forscher et al., 2015; Plant & Devine, 2001). Many have posited that this built-up resentment helps explain the spike in explicit acts of prejudice in the days following Donald Trump's election (Okeowo, 2016) and the apparent rise in White supremacy (BBC News, 2017). Given the negative consequences of extrinsically motivated efforts to suppress prejudice, how is it possible to increase people's intrinsic motivation to control prejudice? One way is to impress on them the necessity of cooperating with those with whom they are working. When people realize that they need to cooperate with an outgroup person, they can be motivated to be nonbiased in their interactions with the outgroup and even show improved memory for the unique or individual aspects of that person (Neuberg & Fiske, 1987). At some level, people realize that falling back on stereotypes to form impressions might not provide the most accurate assessment of another person's character and abilities. The need to work together on a common goal helps to cue this motivation to be accurate and allows people to set aside their biases. More recent research taking a neuroscience perspective has uncovered the neurological mechanisms that support these two processes (Lieberman et al., 2002). Bartholow and colleagues (2006) examined specific electrical signals emitted from the brain that are indicative of efforts at cognitive control. 
They found that when White participants were presented with pictures of Black targets, the more of these signals that their brains emitted, the lower the accessibility of stereotypic thoughts. However, this occurred only when people's cognitive-control abilities were intact. When they were impaired through the consumption of alcohol, fewer of these specific signals were emitted, and participants were less able to control their tendency to stereotype others. Additional research shows that when White participants were exposed very briefly (for only 30 milliseconds) to pictures of Black faces, they showed increased activation in the amygdala (the fear center of the brain) that correlated with the degree to which they associated "Black American" with "bad" on an implicit association test (FIGURE 11.5) (Cunningham, Johnson et al., 2004; Phelps et al., 2000). With such a brief exposure, people can do little to override knee-jerk reactions. What is interesting is that lengthening exposure to the faces to 250 milliseconds increased activation in the dorsolateral prefrontal cortex (DLPFC), the region of the brain responsible for more effortful and controlled processes of judgment and decision making. Furthermore, the more DLPFC activation people experienced, the lower the amygdala activation they exhibited. These findings suggest that automatic negative attitudes that might have sprung to mind initially can be modified by more controlled processes (Cunningham, Johnson et al., 2004).
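
Because the implicit association test (IAT) figures in these findings, a simplified scoring sketch may help make the measure concrete. This is a bare-bones illustration of the general D-score idea (response speed compared across "compatible" and "incompatible" pairings), not the exact procedure used in the studies cited; the reaction times are invented, and real scoring includes error penalties and trial exclusions omitted here.

```python
# Simplified IAT-style scoring sketch (hypothetical reaction times in ms).
# Real IAT scoring (e.g., the D algorithm of Greenwald and colleagues) adds
# error penalties, trial exclusions, and block-by-block computations omitted here.
from statistics import mean, stdev

# Latencies (ms) from a "compatible" block (e.g., Black + bad share a key)
# and an "incompatible" block (e.g., Black + good share a key).
compatible = [612, 587, 640, 598, 575, 630, 605, 590]
incompatible = [742, 715, 760, 690, 735, 720, 705, 750]

# Pooled variability across both blocks (simplified: all trials together).
pooled_sd = stdev(compatible + incompatible)

# D-style score: mean latency difference scaled by variability. Larger positive
# values mean responses were slower when the pairing ran against the association
# being measured, indicating a stronger implicit association.
d_score = (mean(incompatible) - mean(compatible)) / pooled_sd

print(f"Compatible block mean:   {mean(compatible):.0f} ms")
print(f"Incompatible block mean: {mean(incompatible):.0f} ms")
print(f"Simplified D score:      {d_score:.2f}")
```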

sexual objectification

The tendency to think about women in a narrow way as objects rather than as full humans, as if their physical appearance were all that matters.

The Robbers Cave Study

To examine these ingredients for change in more detail, let's consider a classic study by Sherif and colleagues (Harvey et al., 1961), which dramatically demonstrates both how to create a prejudice and how to use the power of superordinate goals to reduce it. Sherif and colleagues invited 22 psychologically healthy boys to participate in a summer camp in Oklahoma. Because the camp was at the former hideout of the noted Old West outlaw Jesse James, this study has come to be known as the Robbers Cave study. As the boys arrived at the camp, Sherif assigned them to one of two groups: the "Rattlers" or the "Eagles." During the first week, the groups were kept separate, but as soon as they learned of each other's existence, the seeds of prejudice toward the other group began to grow (thus showing how mere categorization can breed prejudice). During the second week, Sherif set up a series of competitive tasks between the groups. As realistic group conflict theory would predict, this competition quickly generated remarkable hostility, prejudice, and even violence between the groups as they competed for scarce prizes. In the span of a few days, the groups were stealing from each other, using derogatory labels to refer to each other (calling the rival group sissies, communists, and stinkers; the study was conducted during the 1950s!), and getting into fistfights.

Was all lost at the Robbers Cave? It certainly appeared that way until, during the third week, Sherif introduced different types of challenges. In one of these challenges, he sabotaged the camp's water supply by clogging the faucet of the main water tank. The camp counselors announced that there was a leak and that to find the leak, all 22 boys would need to search the pipes running from the reservoir to the camp. Thus, the campers were faced with a common goal that required their cooperation. As the Eagles and Rattlers collaborated on this and other such challenges, their hostilities dissipated. They were no longer two groups warring with each other but rather one united group working together. Successfully achieving common goals effectively reduced their prejudice.

Another way of looking at these challenges is that the Rattlers and Eagles faced a shared threat. In the example described above, it was the shared threat of going without water. Can you think of a historical example that led to a similarly cooperative spirit, only on a much grander scale? Many observers have suggested that the events of September 11, 2001, had a similar impact in reducing some types of intergroup biases in America. During and after this tragedy, the American people were confronted with the shared threat of terrorism at the hands of Osama bin Laden and al Qaeda. How did they react? In a rousing display of patriotism and goodwill, they united. Although prejudice against Muslims and Arab Americans increased, previous divisions among other groups of people were set aside—at least for a time. Research shows that the shared threat of global warming can also reduce prejudice against outgroups (Pyszczynski et al., 2012). As people think about the fate they share with others, a sense of common humanity can help reduce prejudice.

ethnocentrism

Viewing the world through our own cultural value system and thereby judging actions and people based on our own culture's views of right and wrong and good and bad.

Why Do We Apply Stereotypes?

We have looked at where stereotypes come from, and now we consider why we apply and maintain them. Research reveals that stereotypes have four primary psychological functions.

common ingroup identity

A recategorizing of members of two or more distinct groups into a single, overarching group.

multicultural ideology

A worldview in which different cultural identities and viewpoints are acknowledged and appreciated.

colorblind ideology

A worldview in which group identities are ignored and people are judged solely on their individual merits, thereby avoiding any judgment based on group membership.

Hostile Feelings Linked to a Category

Allport viewed the first fundamental cause of prejudice to be a result of two basic human tendencies. First, people are likely to feel hostility when they are frustrated or threatened, or when they witness things they view as unpleasant or unjust. Second, just as we routinely categorize objects (see chapter 3), we also categorize other people as members of social groups, such as women, Asians, and teenagers, often within milliseconds of encountering them (Dickter & Bartholow, 2007; Fiske, 1998; Ito & Bartholow, 2009). Prejudice often results from linking hostile feelings to such salient categories of people (e.g., Crandall et al., 2011; O'Donnell et al., 2019). Let's consider a few examples of how this might occur. Imagine a French man robbed at gunpoint by another French man in Marseilles. The victim will likely experience fear and anger, hate the robber, and hope the thief is caught and imprisoned. Now imagine a French man who is robbed by an Algerian man. He will experience the same emotions but is more likely to direct his hatred toward Algerians and may therefore want all Algerians expelled from his country. Why? When we encounter outgroup members, what is salient to us is their group membership rather than their individual characteristics. So in the latter example, the victimized individual views his experience as being mugged by an Algerian; thus, his negative feelings are overgeneralized to the category rather than applied only to the individual mugger whose actions caused his negative experience.

In a similar vein, an Afghan woman whose niece was killed by an American guided missile is likely to hate Americans. A European American kid hassled by a Mexican American in a middle school restroom may decide he hates "Mexicans." In each of these examples, negative experiences with a single individual or a small sample of individuals lead to a sweeping negative feeling that is applied to literally millions of people who are perceived to be members of the salient group. In a finding consistent with these examples, Rosenfield and colleagues (1982) showed that when White participants were asked for money by a shabbily dressed Black panhandler, they were later less willing to volunteer to help promote a racial brotherhood week compared to those who were initially approached by either a well-dressed Black graduate student or a shabbily dressed White panhandler.

This idea of negative feelings generalized to an entire group can help explain sudden increases in prejudice after particularly threatening circumstances arise. For example, after the terrorist attacks of September 11, 2001, Americans exhibited more negative attitudes and behavior toward Muslim and Arab Americans. Although these reactions were sadly predictable, they are classic examples of prejudice: The Arab and Muslim Americans targeted had nothing to do with the attacks on the United States but were judged negatively because of their perceived group membership. Similarly, as the deadly novel coronavirus spread to the United States in early 2020, there was a substantial increase in verbal and physical attacks directed at Asian Americans (Tavernise & Oppel, 2020). The negative feelings associated with the virus were linked to China because the first major outbreak occurred there, and then President Trump reinforced this association by referring to it as the "Chinese virus." With the category linked to the negative feelings, prejudice and discrimination became the all-too-predictable consequence.
Sometimes, frustrations people experience fuel negative feelings and actions toward outgroups even in the absence of any inciting behavior by a member of that group. This is known as displaced aggression, and it can explain why in tough economic times, prejudice, stereotyping, and discrimination tend to increase (e.g., Hepworth & West, 1988; Hovland & Sears, 1940; Krosch et al., 2017). Experimental research confirms this process. When European American participants are led to believe resources are scarce, their brain activity indicates that they engage in less processing of African American faces, and they reduce their resource allocations to African Americans but not European Americans (Krosch & Amodio, 2019; Krosch et al., 2017). Realistic group conflict theory (Levine & Campbell, 1972) adds to Allport's idea of hostility generalized to a group by arguing that the initial negative feelings between groups are often based on a real conflict or competition over scarce resources. If individuals in one group think that their access to land, water, jobs, or other resources is being threatened or blocked by another group, the resulting sense of threat and frustration is likely to generate negative emotions about the perceived rival group. Recent research has shown that people are more likely to harbor and express prejudice toward a particular outgroup when they view their own group as cohesive and as having collective interests possibly threatened by that outgroup (Effron & Knowles, 2015). Unfortunately, these negative feelings are often culturally transmitted from generation to generation so that intergroup hostilities are perpetuated even if the initial conflict is no longer pertinent. As a result of protracted intergroup conflict, members of the conflicting groups come to feel anxious around each other, and that intergroup anxiety can further fuel prejudice toward the outgroup (Stephan & Stephan, 1985).

social dominance orientation (SDO)

An ideology in which the world is viewed as a ruthlessly competitive jungle where it is appropriate and right for powerful groups to dominate weaker ones.

Devaluing the Domain

Another coping strategy that people turn to in dealing with discrimination is to devalue areas of life where they face pervasive experiences of prejudice and discrimination. If you decide that you really don't care about working on a naval submarine, then you might be relatively unaffected by the U.S. Navy's long-standing ban (not repealed until 2010) on women serving on submarines. The tendency to devalue those areas where your group doesn't excel seems like a pretty effective strategy for managing bad outcomes. But the whole story is more complicated. It turns out that it is not so easy to devalue those domains in which higher-status groups are more accomplished. For example, on learning that women score higher on a new personality dimension described only by the name surgency, men readily devalue this trait as something that is not important to them personally (Schmader, Major, Eccleston, & McCoy, 2001). But when women learn that men score higher in surgency, they assume that this trait is at least as valuable as when women score higher on it. This pattern reflects a general asymmetry in how stereotypes constrain men's and women's interests (Croft et al., 2015). Although women are increasingly fighting to be respected in traditionally high-status male-dominated domains, men are generally less concerned that they are underrepresented in what are more likely to be lower-status female-dominated domains (Block et al., 2019).

These pressures on groups with lower status can leave them with a difficult choice: continue to strive for success in arenas where they are socially stigmatized because these are the domains that society considers important, or call into question the very legitimacy of that society by devaluing those domains (e.g., making the decision to drop out of school). For example, although Black and Latino college students get lower grades on average than their White and Asian peers, they report valuing education at least as much, if not more (Major & Schmader, 1998; Schmader, Major, & Gramzow, 2001). However, those who regard the ethnic hierarchy in the United States as unfair and illegitimate are more likely to call into question the value and utility of getting an education (Schmader, Major, & Gramzow, 2001). If the deck is stacked against you, you might very well decide to leave the game.

One extreme form of devaluing is to create a group identity that opposes the majority group and its characteristic behaviors, ideas, and practices, in what is labeled an oppositional culture (Ogbu & Simons, 1998). For example, ethnic minority students (e.g., African Americans, Mexican Americans, Native Americans) may consider doing well in school or conforming to school rules as "acting White" (Fordham & Ogbu, 1986). When students engage in these "White" behaviors, they may face opposition from their peers and from other members of the minority community. They may respond by identifying with their peers' oppositional culture and consequently devaluing any behavior or goal that seems to represent the majority culture. Some Black students may not put their best effort into school-related activities, or they may even avoid school altogether. This strategy can increase their sense of belonging in the oppositional culture, but it also can lead them to reject opportunities for self-improvement and economic success simply because they don't want to resemble the majority culture.

Blaming the Bias, Not Oneself

As mentioned earlier, the dilemma of modern-day prejudice is that it can be very subtle. Consider an instance in which a woman is passed over for a promotion in favor of a male colleague. Is that discrimination? Or is she simply less qualified? It's often quite difficult if not impossible to know, and this situation puts those who are targeted by bias in a state of attributional ambiguity (Crocker et al., 1991). Crocker and her colleagues pointed out that the upside of attributing a negative outcome to prejudice is that it allows one to shift blame onto the biases of others and escape the negative feelings that might otherwise result. For example, if the woman in the example can dismiss the boss who rejected her as a sexist bigot, then she can maintain her opinion of herself as competent and intelligent. In one experiment, when Black college students learned that a White student was not that interested in becoming friends with them, their self-esteem was reduced when they didn't think the other person knew their race but was buffered when they believed their race was known (Crocker et al., 1991).

Given all the negative consequences we outlined earlier, you might be wondering when perceiving discrimination is or is not psychologically beneficial. First, attributing an isolated incident to prejudice might buffer self-esteem from negative outcomes, but perceiving that discrimination is pervasive can be harmful to well-being (Eliezer et al., 2010; McCoy & Major, 2003; Schmitt et al., 2003). Second, when people blame themselves for their stigmatizing condition in the first place, they get no comfort from being the target of bias. When overweight female college students learned that a man wasn't interested in meeting them, they felt worse, not better, if they thought their weight was a factor in his evaluation (Crocker et al., 1993). Because society continues to perceive weight as something that can be controlled, these women felt responsible for being rejected. Finally, acknowledging that prejudice exists can reduce the shock when it happens to you. In one set of studies, women and minorities who generally believed that the world is unfair (compared with those who didn't) showed less physiological threat when they met and interacted with someone who was prejudiced against their group (Townsend et al., 2010). People can also protect their self-esteem more effectively by claiming discrimination when they can be certain that discrimination did occur (Major et al., 2003).

In 2017, dozens of actresses, including Gwyneth Paltrow, Ashley Judd, Rose McGowan, and Angelina Jolie, publicly shared their stories of sexual harassment and abuse by Hollywood film producer Harvey Weinstein. The power in numbers has given other women the certainty and support to come forward to tell their horrific stories of casting calls with Weinstein. One of these women, Tomi-Ann Roberts, was an aspiring actress in her 20s when she met with Weinstein about a possible film role. She was shocked and appalled to find him naked in a bathtub, insisting that she would need to remove her top to be considered for the role. Roberts not only left the hotel suite; she gave up her plan to go into acting and instead pursued a successful career as a social psychologist studying objectification and sexism. In early 2020, Weinstein was found guilty of criminal sexual assault and rape in New York State and sentenced to 23 years in prison.

SOCIAL PSYCH AT THE MOVIES Remember the Titans

Capturing the complexities of racial integration on film is no easy feat. Many movies tackle themes of racial prejudice, but the 2000 film Remember the Titans (Yakin, 2000) provides what might be the best cinematic example of how to reduce prejudice by applying Allport's formula for successful intergroup contact. This movie is based on the true story of separate high schools in Alexandria, Virginia, that were forced to merge in 1971 as part of a rather delayed effort to desegregate Virginia's public schools. Integrating the student body also meant integrating the football teams, and the movie chronicles the growing pains of this newly diversified group and its struggle to put together a winning season. The film centers around the head coach of the Titans, Herman Boone, played by Denzel Washington, who faces an uphill battle in training a unified team of White and Black players who previously attended separate schools, played on rival teams, and still hold deeply entrenched racial prejudices. The film clearly depicts the conflict on the football field as a microcosm of the conflict in American culture in the immediate aftermath of the civil rights movement. The movie just as effectively portrays how Coach Boone pulls his team together to clinch the state championship in 1971. Recall that one of the elements for effective intergroup contact is the presence of institutional support. In the movie, this support is established at the outset when the school board decides to give the head coaching job to the former coach of the Black high school rather than to the coach of the White high school (played by Will Patton). This decision sends a clear message to the players and their parents that the school board has good intentions to integrate not only the school and the athletic programs but also the staff. Although tensions occasionally flare among the coaches, they generally work together for successful integration. The second element for effective contact is establishing equal status. Coach Boone makes his hard-as-nails coaching style crystal clear to the players' parents, to the other members of his coaching staff, and to his team. But perhaps most importantly, he quite visibly metes out punishment equally to Black players and to White players. As a result, the players quickly learn that earning a starting position on the team will have nothing to do with the color of their skin. Anyone who wants to play on the team will have to work hard. Still, the players struggle to get past their mistrust of one another. Seeing how his team continues to default to self-segregation by race, Coach Boone intervenes. When the team heads off to a training camp in two buses, he divides the players not by race but by offensive or defensive positions. To encourage contact further, he pairs White and Black players to room together for the duration of the intensive training. The overall message is that all the players, regardless of race, need to work together as a team to achieve the same superordinate goal of winning games. Does this strategy of forcing players to room together work? Not at first. A White player objects to his Black roommate's iconic poster of the track and field champions Tommie Smith and John Carlos giving the raised-fist black power salute during the medal ceremony at the 1968 Olympic Games. Not surprisingly, tempers flare, and a fight breaks out. Sharing a room in the dormitory also doesn't translate into socializing during meal times. 
Realizing that an important ingredient—intimate and varied contact—is still missing, Coach Boone mandates that each player interview his roommate to further break down the barriers of misunderstanding and mistrust. As Allport would have predicted, the players finally begin to cooperate as a unified team after this final element of friendship is established. Remember the Titans shows these important components of contact at work. If any of these components were missing, do you think that T. C. Williams High School still would have won the state championship in 1971? Why or why not? What lessons can we learn for creating more effective integration today?

Perspective Taking and Empathy

Earlier, we mentioned that one of the reasons optimal contact can be so effective is that it creates opportunities to take the perspective of members of the other group and see the world through their eyes. Direct contact isn't the only way for people to learn this lesson. To see why, let's go back in time to 1968, just a few days after Dr. Martin Luther King Jr. was assassinated. Jane Elliott, a third-grade teacher in Riceville, Iowa, was watching the news of this tragedy and dreamed up a remarkable classroom exercise to teach her all-White class of children about the injustice of racial prejudice (Peters, 1987). Over the next couple of days, she divided the class into two groups: students who had brown eyes and those who had blue eyes. She spent one day defining one group as the privileged and the other as the downtrodden. These designations were reflected in her actions and demeanor to the class, telling them, for example, that brown-eyed individuals are special and careful, whereas blue-eyed individuals are lazy and forgetful. What Elliott observed from this and subsequent implementations of the exercise was a remarkable—and apparently enduring—sensitivity to prejudice. Her students became acutely aware of the harmful effects that their own prejudices could have (see Cobb & Peters, 1985). It is powerful stuff, and we encourage you to view a portion of the video in LaunchPad or search the Internet (look for "Jane Elliott" plus "A Class Divided" on Google or YouTube) to check out some video clips. In having her third-graders spend a day being stigmatized for the color of their eyes, Jane Elliott implemented an impressive exercise in perspective taking. Perspective taking is a powerful tool for increasing empathy for the target's situation and creates a sense of connection between oneself and an outgroup. This strategy reduces prejudice against a single individual, and those positive feelings are often likely to generalize to other members of the outgroup (Dovidio et al., 2004; Galinsky & Moskowitz, 2000; Vescio et al., 2003; Vorauer & Sasaki, 2009). For example, in one study, participants who were asked to imagine vividly the experiences of a young woman who had been diagnosed with AIDS (as opposed to taking a more objective viewpoint toward her plight) felt more empathy for AIDS victims in general as well as for her (Batson et al., 1997). Perspective taking not only reduces explicit types of prejudice but also might reduce more implicit and subtle forms of bias we described earlier. For example, imagine that you are White and that you are asked to write about a day in the life of a young Black man (Todd et al., 2011). If you are in the perspective-taking condition, you will be told to visualize what the young man might be thinking and feeling as he goes about his day. If you are in the control condition, you will be told to take a more objective approach to writing about his day. After doing your respective assignment as well as some other unrelated surveys, you are led to a different room and asked to grab two chairs from a stack and set them up for a mock-interview task between you and an assistant named either "Jake," a typical White name, or "Tyrone," a typical Black name. Unknown to the participants who actually were in this study, the researchers measured the distance between the two chairs as an implicit measure of prejudice. They reasoned that if people had a more positive attitude toward the interviewer, they would set the chairs closer together. 
As you can see in FIGURE 11.9, participants in the control condition elected to sit farther away from Tyrone than from Jake. But if they first had to take the perspective of another young Black man during the earlier task, they sat at the same distance from the assistant, regardless of his race. When we think about what it's like to walk a day in the life of someone else, our biases are often diminished. In fact, in one very clever study conducted in Barcelona, Spain, researchers used virtual reality to have light-skinned female participants see and feel what it would look like to walk around with darker skin. Participants who spent about 20 minutes inhabiting a virtual body with darker skin subsequently exhibited a weaker implicit negative attitude toward Blacks on an IAT than did participants who had a light-skinned virtual body; those who had an alien-looking, purple-skinned virtual body; or those who did not have a virtual body and merely saw a dark-skinned person walk in the background of their virtual world (Peck et al., 2013). These benefits of perspective taking are impressive, but it is important to note that although perspective taking can reduce prejudicial attitudes, it is not always effective at changing people's stereotypes (Skorinko & Sinclair, 2013; Sun et al., 2016). One reason might be that people often do not accurately guess how others truly feel (Eyal et al., 2018). To become accurate in perceiving others, it's better to take the time to learn what they feel and think than to assume that we can imagine what their experience is truly like.
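
Returning to the seating-distance study described above (Todd et al., 2011), here is a minimal sketch, with invented chair-distance values rather than the actual data, of how the 2 (writing instruction) x 2 (assistant's name) comparison would be summarized. The pattern that matters is the interaction: a race-based gap in seating distance in the control condition that shrinks after perspective taking.

```python
# Hypothetical chair-distance data (in cm) illustrating the logic of the
# Todd et al. (2011) design; all numbers are invented for demonstration.
from statistics import mean

# Distance between the two chairs each participant set up, by condition.
data = {
    ("control", "Jake"):              [65, 70, 68, 72, 66],
    ("control", "Tyrone"):            [88, 92, 85, 90, 87],
    ("perspective_taking", "Jake"):   [67, 71, 69, 66, 70],
    ("perspective_taking", "Tyrone"): [68, 72, 66, 70, 69],
}

for (condition, name), distances in data.items():
    print(f"{condition:>18} / {name:<6}: mean distance = {mean(distances):.1f} cm")

# The pattern of interest is the interaction: a Jake-vs.-Tyrone gap in the
# control condition that shrinks (or disappears) after perspective taking.
control_gap = mean(data[("control", "Tyrone")]) - mean(data[("control", "Jake")])
pt_gap = mean(data[("perspective_taking", "Tyrone")]) - mean(data[("perspective_taking", "Jake")])
print(f"Seating gap, control condition:            {control_gap:.1f} cm")
print(f"Seating gap, perspective-taking condition: {pt_gap:.1f} cm")
```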

A Kernel of Truth

Even when stereotypes are broad overgeneralizations of what a group is like, some (but not all) stereotypes may be based on actual differences in the average traits or behaviors associated with two or more groups. This is what Allport called the kernel of truth hypothesis. Even though this kernel might be quite small, with much more overlap between groups than difference, as perceivers, we tend to exaggerate any differences that might exist and apply them to virtually all members of the groups; indeed, the most prominent stereotypic attributes ascribed to a group are sometimes the most exaggerated (Eyal & Epley, 2017). However, Lee Jussim and colleagues (2015) have been particularly active in making the provocative case that many of the stereotypes people hold about groups that have to do with specific facts, such as the percentage of Asian Americans who complete college relative to the percentage of other Americans who do so, are often quite accurate. In fact, they sometimes even underestimate (rather than overestimate) group differences. Consistent with the idea of stereotypes reflecting some level of accuracy, a recent large-sample study of Americans showed that the Black-violent stereotype is stronger in states in which Blacks have a higher rate of having been convicted of violent crimes (Johnson & Chopik, 2019). However, this is just a correlation, and the causality could run the other way. It's possible that in states in which Blacks are viewed more negatively, they experience more poverty and prejudice, which contributes to their being convicted of more violent crimes.

But when it comes to personality traits, there is little support for the kernel of truth hypothesis. Consider a set of studies by Robert McCrae and colleagues (Terracciano et al., 2005). They assessed actual personalities in samples from 49 nations and then assessed the stereotypes about the personalities of people from those nations. There was good agreement across nations about what each nationality is like (e.g., Italians, Germans, Canadians). But the researchers found no correspondence between these stereotypes and the actual personalities of the people in those nations! You might think that Germans are more conscientious than Italians, but there's no evidence from the personality data that this is actually the case.

A complicating factor with the kernel of truth hypothesis is that even when facts seem to support an overall group difference, those facts don't necessarily imply innate differences. For example, it may be true that a disproportionate percentage of African American males are convicted of crimes. However, this does not mean that African Americans are more violent or immoral by nature. In most cultures, minority groups that are economically disadvantaged and targets of discrimination are more likely to get in trouble with the law. Minority-group members who are low in socioeconomic status also tend to do less well in school, but again, an attribution of innate intellectual inferiority is an unwarranted leap. So even in cases in which there is a kernel of truth, the stereotype usually leads to an unjustified jump to assumptions about essential differences in traits and abilities. And this causes problems because research suggests that attributing negative attributes to genetic differences increases prejudice (e.g., Suhay et al., 2017).
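
The national-character test described above boils down to a simple question: across nations, do stereotype ratings of a trait correlate with the trait levels people actually report? The sketch below uses invented numbers (not Terracciano and colleagues' data) to show how a near-zero correlation would signal the absence of a kernel of truth for that trait.

```python
# Hypothetical sketch of the kernel-of-truth test used in national-character
# research: correlate stereotype ratings with measured trait levels across
# nations. All values below are invented for illustration.
from statistics import mean

# One pair of values per nation: how conscientious its members are *believed*
# to be (stereotype rating) vs. the nation's measured average (self-reports).
stereotype_ratings = [4.2, 3.1, 4.8, 2.9, 3.6, 4.0, 3.3, 4.5]
measured_levels    = [3.6, 3.4, 3.5, 3.5, 3.7, 3.3, 3.6, 3.4]

def pearson_r(x, y):
    """Pearson correlation between two equal-length lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

r = pearson_r(stereotype_ratings, measured_levels)
# A correlation near zero means the stereotype ratings do not track the
# measured group differences, i.e., no detectable kernel of truth here.
print(f"Stereotype-reality correlation across nations: r = {r:.2f}")
```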

Reducing Prejudice with a More Multicultural Ideology

In part, bolstering how people see themselves reduces prejudice because it makes people more open-minded and less defensive (Sherman & Cohen, 2006). This leads us to consider perhaps a more straightforward strategy for reducing prejudice: reminding people of their tolerant values so that they are more willing to accept, if not embrace, others' differences (Greenberg, Simon et al., 1992). Sometimes people believe that the best way to be tolerant is to embrace a colorblind ideology—that is, view people only on their individual merits and avoid any judgment based on group membership. One concern with the colorblind approach is that it encourages efforts simply to control any biases or prejudices that one has toward an outgroup. Although this can sometimes be an effective way to avoid engaging in discrimination, our earlier discussion of controlling prejudice revealed that these efforts can also backfire. Another criticism of the colorblind approach is that it can imply that everyone should conform to the status quo and act as if ethnic differences don't matter (Plaut et al., 2018). As we alluded to previously, the colorblind approach is a much more comfortable stance for the advantaged majority group than for currently disadvantaged minority groups. Whites in the United States tend to take this to the extreme, sometimes failing to mention a person's race even when doing so would simply be stating a descriptive fact that could help identify the person to whom they are referring (Apfelbaum et al., 2008; Norton et al., 2006).

An alternative is to embrace cultural pluralism, or a multicultural ideology, which acknowledges and appreciates different cultural viewpoints. This view emphasizes not just tolerating but actively embracing diversity. To understand the distinction between these two ideologies, consider the metaphors used in the United States and Canada, two countries that were formed largely as a result of immigration. The United States is typically referred to as a melting pot, a place where people of different ethnicities and former nationalities might converge and blend to form a single group. In Canada, the prevailing metaphor is the salad bowl, where citizens form an integrated collective while still maintaining their distinct ethnic heritage. These two approaches can have distinct effects on marginalized groups. When institutions signal that they value the diverse perspectives and contributions of minority students, for example, those students tend to feel a greater sense of belonging and perform better (Brannon et al., 2015).

From a psychological perspective, these different ideologies suggest different ways of approaching intergroup relations. A colorblind approach suggests that we should avoid focusing on group identity, whereas multiculturalism suggests that we should approach group differences as something to be celebrated (Chen et al., 2016). Members of advantaged groups who endorse a multicultural ideology tend to be less implicitly and explicitly prejudiced and are more likely to seek out contact with other groups (Leslie et al., 2020; Plaut et al., 2018; Rosenthal & Levy, 2013; Whitley & Webster, 2019). Furthermore, going into an interaction with a multicultural mind-set might sidestep all of the problems we see when people are focused on avoiding being biased. This is just what Trawalter and Richeson (2006) have found.
When White participants were told to avoid being biased during an interaction with Black students, they became cognitively depleted from the effort and probably less receptive to future intergroup interactions. But when they were instead told to approach the interaction as an opportunity to have a positive interracial exchange, those effects weren't present, and the interaction went more smoothly.

In a clever application of a similar idea, Kerry Kawakami and her colleagues (Kawakami et al., 2007) showed that these approach tendencies can be trained quite subtly. In one of their studies, participants completed an initial task in which they simply had to pull a joystick toward them when they saw the word approach displayed on a screen or push it away from them when they saw the word avoid. Unknown to the participants, faces were subliminally presented just before the target words appeared. Some individuals always were shown a Black face when they were cued to approach; others were shown a Black face when they were cued to avoid. After completing this task, participants had developed an unconscious association to approach or avoid Blacks. When they were asked to engage in an interracial interaction with a Black confederate, those in the approach condition behaved in a friendlier and more open way than those in the avoid condition. These results show that our goals for interactions can be cued and created unconsciously as well as consciously and that an approach orientation toward diverse others can be quite beneficial.

Although these findings are encouraging, embracing diversity is not without challenges. Even majority group members who try to take a multicultural approach run the risk of being ham-handed in their efforts ("You're Asian; do you know of a good sushi restaurant I could try?"; Zou & Cheryan, 2015). Promoting diversity also makes salient both group categories and differences between groups. And in cultural-diversity training, the line between teaching about valid cultural differences and promoting unwarranted stereotypes is sometimes crossed.

Is Perceiving Prejudice Bad for Your Health?

Living in a society that devalues you because of your ethnicity, gender, sexual preferences, or religious beliefs can take a toll on both your mental and physical health. Several studies have shown that people who report experiencing more prejudice in their daily lives also show evidence of poorer psychological health (Branscombe et al., 1999; Schmitt et al., 2014; Sutin et al., 2015). In one study of 392 African Americans, an increase in their experience of discrimination over a 10-year time period predicted chromosomal changes (i.e., shortening of telomeres) that are indicative of early aging and a shortened life expectancy (Chae et al., 2020). The intersection of two or more devalued identities can be particularly associated with negative health outcomes (Lewis & Van Dyke, 2018). Negative consequences, such as increased depression and lower life satisfaction, are especially extreme when people blame themselves for their stigma or the way people treat them.

Because our culture is infused with stereotypic portrayals of various groups, these negative effects on mental health can be quite insidious. For example, there is an ongoing debate in the United States about the use of Native American images as mascots for school and sports teams. Do these images honor the proud history of a cultural group? Or do they present an overly simplistic caricature that debases a segment of society? Research shows that exposure to these mascots might reinforce people's stereotypes of Native Americans (Angle et al., 2017) and have negative effects on Native Americans. When Native American children and young adults were primed with these images, their self-esteem was reduced, they felt worse about their community, and they imagined themselves achieving less in the future (Fryberg et al., 2008). Many of these same participants believed that these mascots are not bad but that these images might contribute to the sense that Native Americans are invisible in mainstream society except as caricatures. In response to the evidence that mascots might have insidious effects on well-being, in 2014, the U.S. Patent and Trademark Office canceled the trademark that the Washington Redskins had on the football team's name and logo because both were deemed to be disparaging to Native Americans (Vargas, 2014). However, the name and logo are still used by the team.

Prejudice can have long-term consequences for physical health as well (Contrada et al., 2000). Like any other chronic stressor, the experience of prejudice elevates the body's physiological stress response. For example, women who report being frequent targets of sexism show a greater physiological stress response (i.e., increases in cortisol, a stress-related hormone) when they believe they personally might have been targeted by bias (Townsend et al., 2011). Over time, this stress response can predict poorer cardiovascular functioning, the buildup of plaque in the arteries, and artery calcification, which increase the risk for coronary heart disease (Guyll et al., 2001; Lewis et al., 2006; Troxel et al., 2003). Although perceiving frequent discrimination predicts poorer well-being, this correlation also implies that those who do not perceive frequent experiences of prejudice fare much better psychologically. Later, we will discuss how particular ways of perceiving and reacting to discrimination can sometimes buffer people against its negative psychological consequences (Crocker & Major, 1989).

implicit prejudice

Negative attitudes or affective reactions associated with an outgroup for which the individual has little or no conscious awareness and that can be automatically activated in intergroup encounters.

stereotypes

Overgeneralized beliefs about the traits and attributes of members of a particular group.

Working from the Top Down: Changing the Culture

Prejudice exists within a cultural context, legitimized (albeit subtly at times) by the laws, customs, and norms of a society (Hatzenbuehler, 2014; Salter et al., 2018). Thus, one of the great challenges in reducing prejudice lies in changing these laws, customs, and norms. One dramatic example occurred when the Supreme Court's 1954 decision in Brown v. Board of Education declared public school segregation unconstitutional. Desegregation fostered integration and reduced prejudice (Pettigrew, 1961). As we discussed in chapter 6, a change in behavior (in this case, by law) often can lead to a change in attitudes because people strive for consistency between the two. Changing public attitudes can also lead to institutional change. In June 2020, at a time when public support of LGBTQ+ rights had never been higher, the Supreme Court ruled that it is unconstitutional to be fired for one's sexual orientation or transgender status. Around the same time, thousands of Americans across the country were coming together to protest policing practices that systematically and tragically disadvantage Black Americans (Hetey & Eberhardt, 2018) (see FIGURE 11.4). Although in mid-2020 it was too soon to be sure this new social movement would lead to better policing, unprecedented shifts in cultural norms and attitudes were already under way. For example, within three weeks after George Floyd was killed by a police officer kneeling on his neck, eight U.S. cities and three states had ordered bans on neck restraints (Kaur & Mack, 2020), and the Minneapolis city council had voted to create a more humane system of law enforcement (Milman, 2020). To counter the role of implicit racial bias, many police departments educate their employees about how to detect, avoid, and change their biases (Spencer et al., 2016). Alongside these changes, there has been a dramatic shift in public opinion, and a majority of Americans now believe that there is a broader problem of racial bias in law enforcement (Voytko, 2020).

In addition to changing intergroup attitudes, institutional changes can help break down stereotypes. After desegregation, when the educational structure became somewhat more (though not completely) equal, many more Black Americans were able to be successful. The more such counterstereotypic narratives pervade the cultural landscape, the more people encounter those who defy their preconceived ideas about certain groups. As discussed in chapter 10, when President Obama is the example that people bring to mind when thinking of Black people, they are less likely to be prejudiced (Columb & Plant, 2011; Plant et al., 2009). By increasing the representation of different groups, affirmative action policies can help change stereotypes (Allport, 1954; Morgenroth & Ryan, 2018). The less a group is associated with poorer neighborhoods and jobs, lower academic performance, increased crime, and the like, the better. Recognizing this cycle of group images and prejudice, we see how powerfully the mass media affect how majority group members perceive minority group members. The Jeffersons in the 1970s and 1980s, Murphy Brown in the early 1990s, and Glee in the 2010s were important in bringing into mainstream awareness the issues faced by African American families, single working moms, and gay teenagers. And research confirms that the more people are exposed to counterstereotypic fictional examples of minority groups, the less they show automatic activation of stereotyped associations (Blair et al., 2001; Dasgupta & Greenwald, 2001).
In fact, an ambitious field experiment in Rwanda exposed people to one of two radio shows over the course of a year: either a soap opera with health messages or a soap opera about reducing intergroup prejudice (Paluck, 2009). Those exposed to the show about reducing prejudice displayed more positive attitudes about and behavior toward intergroup marriage.

Coping with Stereotype and Social Identity Threat

Research has pointed to several ways in which the negative psychological effects of prejudice and stereotypes can be reduced. These findings have important implications for educational and social policies.

Individual Differences in Perceiving Prejudice

As you might suspect, not all minority-group members share equally the expectation of being the target of prejudice. People's sensitivity to perceiving bias depends on the extent to which they identify with their stigmatized group. If people normally don't think about themselves as being members of disadvantaged groups, then discrimination might not seem like something that happens to them. In contrast, people who are highly identified with their stigmatized group are more likely to recognize when prejudice and discrimination might affect their lives (Major et al., 2003; Operario & Fiske, 2001). Members of minority groups also differ in their stigma consciousness—their expectation that other people, particularly those in the majority group, will perceive them in terms of their group membership (Pinel, 1999). People higher in stigma consciousness are more likely to expect their interactions with others to go poorly. Unfortunately, these expectations can sometimes lead to self-fulfilling prophecies. For example, when women particularly high in stigma consciousness had reason to think that a male stranger might be sexist, they evaluated an essay he had written more negatively, which then led him to evaluate their essays more negatively (Pinel, 2002). The negative evaluations they received might have confirmed their assumption of the man's sexism, yet his evaluations might have been more positive if they had not criticized his essay first. But as we will discuss shortly, self-fulfilling prophecies are a two-way street. They also affect how those who are nonstigmatized perceive and interact with stigmatized targets.

Seeking Social Support

At the other end of the spectrum from concealment is creating and celebrating a shared identity with others who are similarly stigmatized. Earlier we mentioned that those who report encountering frequent or ongoing discrimination show signs of psychological distress. But according to rejection identification theory, the negative consequences of being targeted by discrimination can be offset by a strong sense of identification with, and pride in, one's stigmatized group (Branscombe et al., 1999; DeMarco & Newheiser, 2019; Postmes & Branscombe, 2002). Although pride in one's ethnic identity is likely supported by one's family and social circle, other identities can be stigmatized even by parents, siblings, and friends. That is why gay pride and similar movements can be so critical to a feeling of social support. In certain cases, marginalized groups band together to form broader coalitions against bias and discrimination (Craig & Richeson, 2016). For example, the term people of color (POC) is increasingly a label preferred by members of non-White groups because it creates a common identity grounded in a shared experience of bias in America. When minority groups become allies, they not only gain greater social support but also become a more powerful force for social change.

The Harmful Impact of Stereotypes on Behavior

Being the target of prejudice also affects how people behave and perform. When you hold a stereotypic expectation about another person (because of group membership, for example), you may act in a way that leads the stereotyped person to behave just as you expected. For example, say that you suspect that the clerk at the café is going to be rude, so you are curt with her. She responds by being curt back to you. Voilà! Your initial judgment seems to be confirmed. Yet you may be ignoring the fact that, had you approached the interaction with a different expectation in mind, the clerk might not have acted rudely. This was demonstrated in a classic study of self-fulfilling prophecy, a topic we introduced in chapter 3 (Word et al., 1974). In the first of a pair of studies, White participants were asked to play the role of an interviewer with two different job candidates, one White and the other Black. When the job candidate was Black, the interviewer chose to sit farther away from him, was more awkward in his speech, and conducted a shorter interview than when the candidate was White. The racial identity of the candidate affected the way in which the interview was conducted. But does this difference in the interviewer's manner affect how the job candidate comes across during the interview? The answer is "yes." In a second study, the researchers trained their assistants to conduct an interview either using the "good interviewer" style that was more typical of the interviews with White candidates (e.g., sitting closer) or the "bad interviewer" style that was more typical of the interviews with Black candidates (e.g., sitting farther away). When the trained assistants interviewed unsuspecting White job candidates, an interesting pattern emerged: The job candidates assigned to a "bad" interviewer came across as less calm and composed than those assigned to the "good" interviewer. More recent research shows just how subtle these effects can be. In one set of studies, when female engineering students were paired with male peers to work together on a project, a male partner's implicit sexist attitudes about women predicted the female partner's poorer performance on an engineering test (Logel et al., 2009). What were the more implicitly sexist guys doing? They were not more hostile or dismissive toward their female partners. Rather, they were more flirtatious with them, and in fact the women reported liking these men. Yet the men's flirtatious behavior led women to perform more poorly on the engineering test. Other research has shown that self-fulfilling prophecy effects are stronger when more people hold the stereotypes (Madon et al., 2018). What might be a small effect when considering the stereotyped expectations of just one perceiver becomes much larger when aggregated across many perceivers and experiences.

aversive racism

Conflicting, often nonconscious, negative feelings about African Americans that White Americans may have, even though most genuinely support principles of racial equality and do not knowingly discriminate.

Confronting Those with Biases

Consider the following scenario: You are working on a class project as part of a small group, and you and your team members have to take turns choosing what kinds of people you would want with you on a deserted island. One young man in the group consistently makes sexist choices (e.g., "Let's see, maybe a chef? No, one of the women can cook."). Would you say anything to him? In a study that presented women with this scenario, most said they would confront the guy in some way, probably by questioning his choice or pointing out how inappropriate it is (Swim & Hyers, 1999). But when women were actually put in this situation, more than half of them did nothing at all. People often find it difficult to confront episodes of prejudice or discrimination they observe or experience. This "do-nothing effect" isn't limited to targets put in the position of confronting an outgroup member (Crosby, 2015). White Americans often stay silent when they overhear another White person use a racial slur when referring to a Black person (e.g., Greenberg & Pyszczynski, 1985b; Kawakami et al., 2009). Confronting those who express prejudice is a lot harder than we might imagine it to be. Being silent in these situations is particularly troubling because expressions of prejudice can rub off on the observer. In one study, White participants who heard a racial slur used to describe an African American became more negative in their evaluation of the person targeted by the slur, despite the fact that in debriefings, the participants reported being appalled by the remark (Greenberg & Pyszczynski, 1985b).

Why do racist and sexist remarks often go unchallenged? One reason is that those who do the confronting are often viewed as complainers (Kaiser & Miller, 2001). This kind of "blame the victim" reaction happens even when the evidence supports the claim that discrimination actually occurred! In other research, when Whites were confronted with the possibility that they were biased in their treatment of others, they tried to correct their biases in the future but also felt angry and disliked the person who confronted them (Czopp et al., 2006). Even members of your own stigmatized group can be unsympathetic when you point to the role of discrimination in your outcomes (Garcia et al., 2005). These social costs can make it difficult to address bias when it does occur, particularly if you are the person targeted by the bias and in a position of relatively little power.

..."Fools," said I, "You do not know
Silence like a cancer grows
Hear my words that I might teach you
Take my arms that I might reach you"
But my words, like silent raindrops fell
And echoed in the wells of silence
—Simon & Garfunkel

Despite the costs of confrontation, real social change requires it. This raises a question: Are other options available that might get a similar message across but in a way that minimizes these costs? According to the target empowerment model, the answer is "yes" (Focella et al., 2015; Stone et al., 2011). This model suggests that targets of bias can employ strategies that deflect discrimination, as long as those actions aren't perceived as confrontational. And even those that are confrontational can still be effective if they are preceded by a strategy designed to put a prejudiced person at ease. Let's illustrate how this model works.
In post-9/11 America, Muslims have too often been targeted by stereotypic perceptions that they endorse or are involved in terrorist activities, and perhaps as a result, they are often victims of assault (Pew Research Center, 2017). If you are Muslim, you understandably might want others to see your perspective on the world and appreciate how hurtful these misconceptions can be. However, when prejudiced White Americans were asked by an Arab American student with a Muslim-sounding name to take his perspective, they perceived him as confrontational, stereotyped him more negatively, and reported a decreased interest in getting to know him (Stone et al., 2011). But if he first asked White perceivers to think about something they value, thereby allowing them to self-affirm, his plea for empathy worked. By getting those who are highly prejudiced to reflect on their own values or positive attributes, targets can encourage majority members to take their point of view in a less threatening way.

The Downsides of Control Strategies

Even when people succeed in controlling their biases, some downstream consequences of these efforts can be negative. First, exerting mental effort in one context might make people less willing or able to exert effort afterward in another context. For example, when White college students had any kind of conversation with a Black peer, regardless of whether the conversation was even about race, they performed more poorly on a demanding computer task right afterward than when they had this conversation with another White student (Richeson et al., 2003; Richeson & Shelton, 2003; Richeson & Trawalter, 2005). In addition, trying to push an unwanted thought out of mind often has the ironic effect of activating that thought even more. As a result, the more people try not to think of a stereotypic bias, the more it can eventually creep back in, especially when cognitive resources are limited (Follenfant & Ric, 2010; Gordijn et al., 2004; Macrae, Bodenhausen et al., 1994). Failure of control strategies can happen even when it seems that one has gotten past initial stereotypes to appreciate the outgroup person's individual qualities. In one study, participants who watched a video of a stigmatized student talking showed stereotype activation within the first 15 seconds, but after 12 minutes, the stereotype was no longer active or guiding judgment (Kunda et al., 2002). This might seem to be good news. However, if participants later learned that the person in the video disagreed with them, the stereotype was reactivated. The implication is that, in our own interactions, we might often succeed in getting past initial stereotypes, but those stereotypes still might lurk just offstage, waiting to make an appearance if the situation prompts negative or threatening feelings toward that person. We've seen that conscious efforts to control prejudice, although well intentioned, can fail or backfire completely. The implication is that reducing prejudice requires more than employing strategies to control prejudice; it also requires going to the source and changing people's prejudicial attitudes. How do we do this?

Stereotypes Distort Memory

Finally, stereotypes bias how we recall information. Back in chapter 3, we described a study in which White participants were shown a picture of a Black man in a business suit being threatened by a young White man holding a straight razor (Allport & Postman, 1947). As that participant described the scene to another participant, who described it to another participant, and so on, the story tended to shift to the razor being in the Black man's hand and the business suit being on the White man. Rumors often can distort the facts because our stereotypes bias what we recall (and what we retell) in ways that fit our expectations. Since that initial demonstration, similar findings have also been shown even when the stereotype isn't evoked until after the information has been encoded—and for a wide range of stereotypes regarding ethnicity, occupation, gender, sexual orientation, and social class (e.g., Dodson et al., 2008; Frawley, 2008).

SOCIAL PSYCH AT THE MOVIES Gender Stereotypes in Animated Films, Then and Now

Have you ever stopped to think about how the stories you learned as a child might have formed a foundation for the gender stereotypes you hold today? Children become aware of their own gender and begin showing a preference for gender-stereotypical toys and activities between two and three years of age (Encyclopedia of Children's Health, n.d.). Some of these beliefs and preferences are learned from observing their parents, peers, and siblings (e.g., Tenenbaum & Leaper, 2002), but children's books, movies, and other media also play roles in reinforcing cultural messages about gender. Consider some popular pre-women's movement children's movies. In the classic 1959 film Sleeping Beauty, the protagonist, Aurora, pretty and kind, cannot even regain consciousness without the love and assistance of her prince (Geronimi, 1959). Snow White cheerily keeps house and cleans up after the seven dwarfs in the 1937 animated film (Hand, 1937) until her status is elevated through marriage to a prince. Cinderella, in the 1950 film, is more obviously oppressed by the forced domestic labor and humiliation she suffers at the hands of her stepmother and stepsisters, but again, she can escape her fate only through the love of a wealthy prince (Geronimi et al., 1950). The common theme in these films is that beauty and innocence are the qualities a young woman should possess to achieve her Happily Ever After, which can happen only through marriage to a handsome and well-heeled man. And older, unmarried, or widowed women are often cast as the villains in these stories, spurred to evil acts by jealousy of their younger rivals. Reflecting cultural shifts that encourage greater agency in women, princess characters in films released more recently have become noticeably more assertive. Ariel, from The Little Mermaid (Clements & Musker, 1989), is willful and adventurous, eager to explore the world beyond her ocean home. But even so, she needs the permission of her authoritative father and the love of a man to realize her dreams. Along the way, she even trades her talent (her voice) for drastic changes to her body (legs instead of a tail) and the opportunity to woo her love interest. In other modern animated films, the portrayals of princesses have become more complex and counterstereotypic. First, there has been an effort to present characters from different cultures, with protagonists who are Middle Eastern (Jasmine in Aladdin, 1992), Native American (Pocahontas, 1995), Chinese (Mulan, 1998), African American (Tiana, The Princess and the Frog, 2009), Scottish (Merida, Brave, 2012), and Pacific Islander (Moana, 2016). Second, the modern princesses in animated films are more often cast as heroic. In Mulan, the protagonist disguises herself as male so that she can use her fighting skills to rescue the male characters in the movie. In Moana, the titular character is an adventurous teenager who leaves the safety of her island to embark on a treacherous journey to save her people. Finally, the two Frozen films (2013 and 2019) tell the story of two strong and determined sisters. Elsa, the older sister, embraces her power to control ice and becomes a strong leader of Arendelle, and her little sister Anna bravely risks her own life to find and save Elsa. These newer princess stories highlight autonomy, strength, and independence for young women.
Of course, before we get too encouraged by these messages of equality, we might ponder whether these modern fairy tales reflect lower levels of hostile sexism toward women (gone are the evil witches and stepmothers in these more contemporary films) but still reinforce benevolent sexist beliefs about women. The female characters are still young, beautiful, and good, and their Happily Ever After still often involves getting the guy. In fact, these benevolent views of women are manifested in children's movies more generally—if girls and women are portrayed at all, that is. Studies of G-rated family films have found that only about 30% of the speaking characters are female (Smith et al., 2010), a disparity that is also evident in prime-time television and has remained largely unchanged over 15 years (Sink & Mastro, 2017). Female characters are more likely to wear sexy or revealing clothing than their male counterparts (Sink & Mastro, 2017). Whereas male characters are more often portrayed as having power and/or being funny, female characters are more commonly portrayed as having good motives and being attractive, although in an encouraging trend, they are also portrayed as being equally or even more intelligent (Smith et al., 2010). It's likely that these stereotypic portrayals shape our gender schemas. A meta-analysis of more than 30 studies suggested that up to the mid-1990s, children and adults who watched more television also had more traditional views about gender (Herrett-Skjellum & Allen, 1996). Longitudinal studies suggest that the causal arrow runs from television exposure to gender stereotypes: The more television children watch, the more strongly they endorse gender stereotypes years later (e.g., Kimball, 1986). Increasing scrutiny of these subtle ways that stereotypes are perpetuated raises questions for policy makers. Should films, television shows, and other media be rated on the basis of their stereotypic messages? The Swedish Film Institute thought so back in 2013. Swedish theaters began employing a feminist rating system known as the Bechdel test (Rising, 2013), which awards an A rating to films that portray two female characters talking to each other about something other than a man. It's not a perfect system, but it's a start in calling needed attention to gender bias at the movies.

Responding to and Reducing Prejudice

History is littered with examples of the harm that prejudice can cause. This harm can be obvious and severe, as in atrocities such as genocide, enslavement, and colonization. These atrocities often continue to affect the targeted groups many generations after their occurrence (Salzman, 2001). But prejudice can also lead to less visible discrimination in hiring, career advancement, health care, legal proceedings, and loan opportunities that exacerbate social problems (Nelson, 2009; Riach & Rich, 2004; Stangor, 2009). Psychologically, prejudice can lead members of targeted groups to feel devalued within their culture (Frable et al., 1990; Inzlicht et al., 2006). Chronically feeling socially devalued can have detrimental effects on health and well-being (Major & Schmader, 2017). In all these ways, prejudice, stereotyping, and discrimination contribute to poverty; to physical, behavioral, and mental health problems; and to a sense of being excluded from mainstream society (e.g., Anderson & Armstead, 1995; Kessler et al., 1999; Klonoff et al., 1999; Schmader & Sedikides, 2018; Williams, 1999; Williams et al., 1999). In this chapter, we focus on:
- What happens psychologically to people who are targeted by prejudice and how they cope
- The processes that influence how and whether people perceive prejudice and how they respond to it
- How even subtle encounters with prejudice and stereotypes can affect one's health, behavior, and performance, and how members of stigmatized groups can remain resilient despite bias and discrimination
- Some promising strategies for reducing prejudice

Interpreting Behavior

If stereotypes actually can lead us to sometimes see something that isn't there, it should come as no surprise that they also affect how we interpret ambiguous information and behaviors (e.g., Kunda & Thagard, 1996). Research shows that people interpret the same behavior differently when it is performed by individuals who belong to stereotyped groups. In one study (Duncan, 1976), White students watched a videotape of a discussion between two men that ended just after one of the men shoved the other. Was the shove harmless horseplay, or was it an act of aggression? If participants watched a version of the tape in which the man delivering the shove was White, only 17% described the shove as violent, and 42% said it was playful. However, if they watched a version in which the same shove was delivered by a Black man, 75% said it was violent, and only 6% said it was playful. In fact, stereotypes influence the interpretation of ambiguous behaviors even when those stereotypes are primed outside conscious awareness. When police and probation officers were primed beneath conscious awareness with words related to the Black stereotype and then read a vignette about a shoplifting incident, they rated the offender as being more hostile and deserving of punishment if he was Black than if he was White (Graham & Lowery, 2004). Many other studies have similarly shown that stereotypes associated with race, social class, gender, or profession can lend different meanings to the same ambiguous information (e.g., Chaxel, 2015; Darley & Gross, 1983; Dunning & Sherman, 1997). Evidence indicates that stereotypes set up a hypothesis about a person, and because of the confirmation bias, we then interpret ambiguous information as evidence supporting that hypothesis.

illusory correlation

A tendency to assume an association between two rare occurrences, such as being in a minority group and performing negative actions.

symbolic racism

A tendency to view members of a racial outgroup as a threat to one's way of life and to express this view by rejecting social policies seen as benefiting that group.

realistic group conflict theory

A theory which asserts that the initial negative feelings between groups are often based on a real conflict or competition regarding scarce resources.

objectification theory

A theory which proposes that the cultural value placed on women's appearance leads people to view women more as objects than as full human beings.

Implementing Optimal Contact in a Jigsaw Classroom

Although each of Allport's conditions can improve racial attitudes (at least among the majority group), the best recipe for success is to combine all the ingredients in the contact setting (Pettigrew & Tropp, 2006). Because the desegregation of schools seldom included all of these components for effective contact, initial evaluations of school desegregation found little success in reducing prejudice and intergroup conflict (Stephan, 1978). For example, school settings tend to emphasize competition rather than cooperation; authority figures are often mainly from the majority group, and the minority students don't feel they have equal status; and ethnic groups often segregate within the school, minimizing the opportunity for intimate contact and cooperation. How can schools do better? Consider a cooperative learning technique developed by Elliot Aronson and colleagues called the jigsaw classroom (FIGURE 11.8) (Aronson et al., 1978). In this approach, the teacher creates a lesson that can be broken down into several subtopics. For example, if the topic is the presidency of the United States, the subtopics might include influential presidents, how the executive branch relates to other branches of the government, how the president is elected, and so on. The class is also subdivided into racially mixed groups, and one person in each group is given the responsibility of learning one of the subtopics of the lesson. This student meets with other students from other groups assigned to that subtopic so that they can all review, study, and become experts in that topic and create some kind of artifact such as a poster or a presentation to summarize their newly gained knowledge. The experts then return to their original group and take turns teaching the others what they have learned. The power of this approach is its potential for embodying all of Allport's conditions for optimal contact. First, because the task is assigned by the teacher, it is authority sanctioned. Second, because the students are all in charge of their own subtopics, all the kids become experts and thus have equal status. Third, students are graded both individually (recall our discussion from chapter 9 on accountability and social loafing) and as a group. Thus, the students share a common goal. And fourth, to do well and reach that common goal, they must cooperate in intimate and varied ways, both teaching and learning from each other. All the pieces must fit together, like the pieces in a jigsaw puzzle. The jigsaw classroom program is generally successful, so much so that one wonders why it is not implemented more widely. One reason is that some topics in school may not lend themselves to this kind of learning approach, but many do. Compared with children in traditional classrooms, children who go through the program show increased self-esteem, intrinsic motivation for learning, and, most crucially, increased peer liking across racial and ethnic groups (Blaney et al., 1977; Hänze & Berger, 2007; Slavin, 2012).

Motivations to Avoid Perceptions of Prejudice

Although stigma consciousness might lead people to sometimes overestimate their experience of prejudice, this is not the norm. Instead, it is more common for people to estimate that they personally experience less discrimination than does the average member of their group (Taylor et al., 1990). This effect, called the person-group discrimination discrepancy, has been documented in many groups, including women reporting on their experience of sexism and racial minorities reporting on their experience of racism. This effect has even been found among inner-city African American men, a group that is among the most likely to experience actual discrimination in employment, housing, and interactions with police (Taylor et al., 1994). Why is the tendency to avoid seeing prejudice and discrimination directed at oneself so pervasive? People may fail to see the prejudice targeted at them because they are motivated to deny that prejudice and discrimination affect their lives. Why? For one thing, this denial may be part of a more general tendency to be optimistic. Experiencing discrimination, having health problems, and being at risk for experiencing an earthquake all are negative events, and people are generally overly optimistic, underestimating their likelihood of experiencing such outcomes (Lehman & Taylor, 1987; Taylor & Brown, 1988). It might be beneficial to one's own psychological health to regard discrimination as something that happens to other people. Another reason is that people may be motivated to sustain their faith that the way society is set up is inherently right and good, thereby justifying the status quo (Jost & Banaji, 1994). Buying into the status quo brings a sense of stability and predictability, but it can lead stigmatized individuals to downplay their experience of discrimination. In one experiment, White and Latino students were put in the same situation of feeling that they had been passed over for a job that was given to someone of another ethnicity (Major et al., 2002). To what extent did they view this as discrimination? The results depended on the students' ethnicity. Among Whites, those most convinced that the social system in America is fair and that hard work pays off thought it was quite discriminatory for a Latino employer to pass them over to hire another Latino. After all, if the system is fair, and Whites have been very successful in the system, an employer has no justification for choosing a minority-group member over themselves. But among Latinos, those who saw the social system as fair were least likely to feel that it was discriminatory for a White employer to pass them over in favor of a White applicant. Believing the system is fair might keep people motivated to do their best, but for members of minority groups in society, it can reduce the likelihood of recognizing discrimination when it does occur.

Why Does Optimal Contact Work?

Although the Robbers Cave experiment is usually described as an example of how superordinate goals can help break down intergroup biases, Allport's other key ingredients for optimal contact were present as well: The boys had equal status, the cooperative activities were sanctioned by the camp counselors, and there were plenty of activities where the boys could get to know one another. But knowing that these factors reduce prejudice doesn't tell us much about why. Other research has isolated a few key mechanisms by which optimal contact creates positive change:
- Reducing stereotyping. Consider that one of the most effective forms of contact involves members of different groups exchanging intimate knowledge about each other. This allows the once-different other to be decategorized. As a result, people are less likely to stereotype members of the outgroup (Kawakami et al., 2000).
- Reducing anxiety. Optimal contact also reduces anxiety that people may have about interacting with people who are different from themselves (Stephan & Stephan, 1985). The unfamiliar can be unsettling, so by enhancing familiarity and reducing anxiety, contact helps to reduce prejudice.
- Fostering empathy. Finally, optimal contact can lead someone to adopt the other person's perspective and increase feelings of empathy. This helps people to look past group differences to see what they have in common with others (e.g., Galinsky & Moskowitz, 2000).

Objectification

Although the consequences of being stigmatized often apply broadly to different groups, some are more specific. One important example is the objectification that can result from the strong focus in many cultures on women's bodies. In chapter 10, we discussed how the sexual objectification of women promotes certain stereotypes and prejudice against them. But Fredrickson and Roberts's (1997) objectification theory also proposes that this intense cultural scrutiny of the female body leads many girls and women to view themselves as objects to be looked at and judged, a phenomenon that the researchers called self-objectification. Being exposed to sexualizing words or idealized media images of women's bodies, hearing other women criticizing their own bodies, and undergoing men's visual scrutiny of their bodies all prompt self-objectification, which increases negative emotions such as body shame, appearance anxiety, and self-disgust (e.g., Aubrey, 2007; Calogero, 2004; Gapinski et al., 2003; Roberts & Gettman, 2004). The more shame they feel, the more vulnerable they are to disordered eating, depression, and sexual dysfunction (Fredrickson & Roberts, 1997). These effects of self-objectification have likely contributed to the obsession with weight that has led 73% of American women to make some serious effort at some point to lose weight, compared with only 55% of men (Saad, 2011). Self-objectification also disrupts concentration and interferes with cognitive performance (Fredrickson et al., 1998). In one study, male and female college students were first asked to try on and evaluate either a swimsuit or a sweater. Then, wearing the particular garment while alone in a makeshift dressing room, they completed a short math test. Men were unaffected by what they were wearing, but women who were wearing the swimsuit were drawn to monitoring their appearance and consequently performed worse than women who were wearing a sweater.

Affirming Broader Values

Another possible coping strategy is self-affirmation. Self-affirmation theory (for a refresher, see chapter 6) posits that people need to view themselves as good and competent. When they encounter a threat to their positive self-view in one area of life, they can compensate by affirming other deeply held values. On the basis of this theory, people who are reminded of their core values might be protected from the negative effects of stereotypes. This hypothesis has been supported in several longitudinal studies (Cohen et al., 2006; Cohen et al., 2009; Miyake et al., 2010). In one study (Cohen et al., 2006; Cohen et al., 2009), students were assigned to write about either a personally cherished value or a value that others might care about but that was not central to their own lives. The researchers then tracked students' grades. This simple affirmation task had no effect on White students' academic performance. But Black students who affirmed their values were far less likely to earn low grades over the course of that semester. The positive effects on their academic performance persisted up to two years later (see FIGURE 11.3). Although other researchers have not always replicated this effect (Hanselman et al., 2017), recent evidence suggests that self-affirmation works best for students who take the affirmation task seriously and are most at risk of experiencing stereotype threat (Borman et al., 2018).

4. Stereotypes Are Self-Esteem Boosters

As described previously, self-esteem threats not only increase negative feelings about outgroup members but also lead to negative beliefs about them and make negative stereotypes of such groups more accessible to consciousness (Spencer et al., 1998). Viewing members of outgroups as stupid, lazy, cowardly, or immoral can help people feel better about themselves (Fein & Spencer, 1997). Other evidence also supports the role of stereotyping in boosting the perceiver's self-esteem. For example, if a member of a disliked outgroup praises us, we shouldn't be too motivated to apply a negative stereotype. But what happens when that person gives us negative feedback? Research by Lisa Sinclair and Ziva Kunda (1999) showed that we selectively focus on different ways of categorizing people, depending on these self-serving motivations. After all, people belong to myriad different social categories, and the intersectionality of multiple identities means that motivation can play a determining role in shaping when a person is categorized in one group or another. In their study, White Canadian participants imagined receiving either praise or criticism from a Black doctor or a White doctor. The researchers measured whether stereotypic knowledge was automatically brought to mind. Participants who were praised by the Black doctor activated positive stereotypes of doctors but not negative stereotypes about Blacks. However, participants who were criticized by the Black doctor activated the negative stereotype of Blacks and not the positive stereotype of doctors. Further research suggests that once activated, these stereotypes likely bias people's judgments. In one study, for example, female and male faculty members received similar course evaluations from students who did well in their courses, but students who received lower grades evaluated female instructors as less competent than their male peers (FIGURE 10.12; Sinclair & Kunda, 2000).

Connecting Across a Divide: Controlling Prejudice in Intergroup Interactions

As a society's laws change and popular portrayals of groups become less stereotypic, individuals within that society feel a greater responsibility to control their biased attitudes and beliefs. Indeed, research finds that people are less likely to express their prejudice publicly if they believe that people in general will disapprove of such biases (Crandall et al., 2002). As students faced the reality of desegregation during the 1960s and 1970s, they also had to learn to control, at least to some extent, their prejudicial biases and stereotypic assumptions about outgroups.

Reducing Prejudice Without Contact

As we've just seen, Allport provided us with an excellent playbook for reducing intergroup prejudices through positive and cooperative contact. But sometimes people hold prejudices about groups with which they never interact. When the opportunities for contact are infrequent, can other psychological strategies reduce intergroup biases? The answer is "yes."

Social Role Theory

If stereotypes don't arise from real differences in the underlying traits of different groups, where do they come from? One possibility is that they come from the roles and behaviors that societal pressures may impose on a particular group. Because of the fundamental attribution error, when people see us in a role, they jump to the conclusion that we have the traits implied by the behaviors we enact in that role. This is the basic assumption of Alice Eagly's (1987) social role theory: We infer stereotypes that describe who people are from the roles that we see people play. Social role theory primarily has been used to explain the existence of persistent stereotypes about men and women. Men are stereotyped to be agentic—assertive, aggressive, and achievement oriented. Women are stereotyped to be communal—warm, empathic, and emotional. Are these stereotypes supported by gender differences in behavior? Yes. Men are more likely than women to be the CEOs of Fortune 500 companies. Women are more likely to be the primary caregivers of children. If we look only at these statistics, we will find more than a kernel of truth to the stereotype. But does this gender segregation in the boardroom and at the playground really imply sex differences in traits? Not necessarily. FIGURE 10.9 shows what happened when people were asked to rate the traits listed in a brief description of an average man or an average woman, with either no information about the person's occupation, information that he or she was a full-time employee, or information that he or she was a full-time homemaker (Eagly & Steffen, 1984). With no information, people readily applied their stereotypes, assuming that a woman is more communal than a man and that a man is more agentic than a woman. But this may just result from assumptions about social roles of men and women because occupation completely trumps anatomy: A homemaker is judged to be more communal and less agentic than an employee, regardless of that person's sex. The point here is that social roles play a large part in shaping our stereotypes. Social pressures can shape the roles in which various groups find themselves, and differences in stereotypes follow suit (Croft et al., 2015). The traditional stereotype of African Americans as lazy and ignorant was developed in the pre-Civil War South, when the vast majority of them were forced to work as slaves and excluded from schools. Similarly, Jews have been stereotyped as money hungry or cheap, a stereotype that developed in Europe at a time when Jews were not allowed to own land and needed to become involved in trade and commerce in order to survive economically. The particular stereotypes attached to groups are often a function of such historical and culturally embedded social constraints.

Does Contact Increase Positive Attitudes Toward the Majority Group?

It should be noted that our discussion so far has focused largely on how contact can help members of more advantaged social groups develop more positive intergroup attitudes and become invested in working toward equality (Tropp & Barlow, 2018). What about the other side of the coin? Does optimal contact also improve intergroup attitudes for the minority group member, such as the African American woman or the gay man put into contact with members of the majority group? A small body of research on this question shows that contact is more of a mixed bag for those in the minority (Tropp & Pettigrew, 2005). Contact situations often are framed from the perspective of reducing biases held by a majority group. The risk is that minority-group members can feel stripped of an important minority identity. Furthermore, when minority-group individuals are exposed to prejudice against their group, which is more likely to occur in the initial stages of contact, this prejudice can intensify their negative attitudes toward the majority group (Tropp, 2003). Contact situations might need to be designed specifically to reduce minority-group members' own biases against the majority.

Stereotypes Influence Perception

Just after midnight on February 4, 1999, four New York City police officers were in pursuit of a serial rapist believed to be African American. They approached a 23-year-old African immigrant, Amadou Diallo, in front of his Bronx apartment building. Assuming that the police would want to see his identification, Diallo reached into his jacket and pulled out his wallet. One of the officers saw the situation differently and called out, "Gun!" The officers fired 41 bullets, 19 of which struck Diallo, killing him. Bruce Springsteen wrote a song about the incident, "American Skin (41 Shots)." The officers were acquitted of any wrongdoing by a jury in Albany, New York (about 150 miles from New York City), a decision that sparked public protest. The city eventually settled a wrongful-death lawsuit by Diallo's family for $3 million. Many factors likely played roles in the tragedy, but one thing is clear: In his hand Diallo held a wallet that was mistaken for a gun. Can research on stereotyping help us understand how this could happen? Yes. In fact, this event inspired a line of research on what has come to be called the shooter bias. This bias has to do with the stereotyped association of Blacks with violence and crime (e.g., Eberhardt et al., 2004; Payne, 2001). We know that people process stereotype-consistent information more quickly than stereotype-inconsistent information, all else being equal. What is surprising is how quickly stereotypes can exert this influence on perception. In three studies (Correll et al., 2002), White American participants played a video game in which they were shown photographs of Black and White men holding an object (sample images appear in FIGURE 10.13) and were asked to press the "shoot" button if the individual was holding a gun and the "don't shoot" button if the individual was not holding a gun. The experimenters predicted that White participants would be faster to shoot an armed person if he were Black than if he were White. In addition, they should be faster to make the correct decision to not shoot an unarmed person if he was White rather than Black. The bar graph on the left in FIGURE 10.13 shows that this is just what happened. When in another study (shown in the right graph of FIGURE 10.13) participants were forced to make decisions under more extreme time pressure, they made the same kind of error that the police made when they shot Diallo. That is, participants were more likely to shoot an unarmed Black man than they were to shoot an unarmed White man. Evidence from these studies suggests that these effects resulted more from the individual's knowledge of the cultural stereotype that Blacks are dangerous than from personal prejudice toward Blacks. In fact, in a follow-up study, the researchers found that even Black participants showed these same shooter biases. Further studies using the same shooter-game paradigm have revealed that the shooter bias is affected by a number of additional factors. People show a stronger shooter bias if the context itself is threatening—say, a dark street corner rather than a sunlit church (Correll et al., 2011). It's also stronger when the Blacks in the photos look more prototypically Black—in other words, when they have darker skin and more typically Afrocentric features (Ma & Correll, 2010). This finding reveals part of a general tendency for stereotypes to be applied more strongly to those who seem most prototypical of a group. 
In fact, anything that reinforces, justifies, or increases the accessibility of a racial stereotype strengthens the likelihood that the stereotype will be applied (Correll et al., 2007). Research suggests that people are more likely to misidentify an object as a gun when it is in the hands of someone who is Black rather than White, even when the person being perceived is a 5-year-old child (Todd et al., 2016). No wonder women like Ieshia Evans, a 28-year-old mother of a 5-year-old son, came out to protest police brutality in the wake of several high-profile police shootings of Black men in the summer of 2016. The stereotype of African American men as threatening leads to another erroneous perception that may contribute to police overreacting to African American men they encounter. A series of studies has shown that non-Black Americans tend to overestimate the physical size and strength of young Black men (Wilson et al., 2017). Law-enforcement officials across the nation have become interested in the problem of racial bias, and some have teamed up with researchers to combat these effects. In one shooter-game study of police officers and community members, both were faster to shoot an armed target if he was Black than if he was White. But police officers were less likely than community members to shoot an unarmed Black target (Correll et al., 2007; also see Correll et al., 2014). It is fortunate that many law-enforcement personnel receive training that has some effect in reducing these biases. Nevertheless, tragic errors resulting from such biases still occur. In several of the cases that sparked protests in 2016, police officers shot an African American because they feared that he was pulling a gun on them or might do so. For example, in 2014, 12-year-old Tamir Rice was shot dead by a police officer who mistook the African American child's pellet gun for a handgun (Almasy, 2015). In another incident, a police officer shot a 47-year-old African American therapist, Charles Kinsey, who was trying to assist his severely autistic patient who had wandered away from a group home and was sitting in the middle of the street playing with a toy truck. Lying on the ground with his hands in the air and a bullet in his leg, Kinsey asked the officer why he had just shot him. The officer responded, "I don't know" (Silva, 2017).

Physiological Measures of Bias

Measures of implicit prejudice tap into people's automatic affective response to a person or a group. Some measures do this by indexing an immediate physiological reaction that people are unlikely to control or may find difficult to control. For example, when Whites are asked to imagine working on a project with a Black partner or a White partner, they often report a stronger preference for working with the Black partner. But their faces tell a different story. Electrodes connected to their brows and cheeks pick up subtle movements of the facial muscles that reveal a negative attitude when they think about working with a Black partner (Vanman et al., 2004). Similarly, when Whites are actually paired up to work with a Black partner, they show a cardiovascular response that is associated with threat: Their hearts pump more blood, and their veins and arteries contract (Mendes et al., 2002). The brain also registers the threat response. The amygdala is the brain region that signals negative emotional responses, especially fear, to things in our environment. Whites who have a strong racial bias exhibit an especially pronounced amygdala response when they view pictures of Black men (Amodio, 2014; Phelps et al., 2000). Interestingly, however, if given more time, this initial negative attitude tends to get downregulated by the more rational dorsolateral prefrontal cortex (Cunningham, Johnson et al., 2004; Forbes et al., 2012). We'll further discuss why and how people go about controlling their prejudiced attitudes and emotions in chapter 11. For now, the primary point is that automatic negative bias leaks out in people's physiological responses.

Perceiving Prejudice and Discrimination

Membership in a group that is viewed or treated negatively by the larger society is bound to affect people in some way (Allport, 1954). Yet, as you learned in the previous chapter, for many stigmatized groups in the United States, prejudice is sometimes a lot subtler and harder to detect than it was 50 years ago. Although this might be a sign of progress, it makes it harder to pinpoint when one is the target of prejudice. Anyone who feels marginalized in society has probably faced this dilemma. Take the following quote from Erving Goffman's classic 1963 book Stigma: "And I always feel this with straight people [people who are not ex-convicts]—that whenever they're being nice to me, pleasant to me, all the time really, underneath they're only assessing me as a criminal and nothing else" (p. 14). This individual's reflection reveals the master status that can accompany stigmatizing attributes—the perception that others will see a person solely in terms of one aspect rather than appreciating the person's total self. As a result, stigmatized individuals are persistently aware of what sets them apart in their interactions with others. For example, when asked to describe themselves, students from an ethnic-minority background are more likely to make mention of their group identity than are students from the ethnic majority (McGuire et al., 1978). When people are conscious of being stigmatized, they become more attuned to signs of prejudice. In one study, women expecting to interact with a sexist man were quicker to detect sexism-related words (e.g., harassment, hooters, bitch) during a computer task and were more likely to judge ambiguous facial expressions as showing criticism (Inzlicht et al., 2008; Kaiser et al., 2006).

How Do Stereotypes Contribute to Bias?

Once stereotypes are activated, we use them to perceive and make judgments about others in ways that confirm, rather than disconfirm, them. Stereotypes influence information processing at various stages, from the first few milliseconds of perception to the way we remember actions years in the future. Let's take a closer look at how stereotypes color people's understanding of others in ways that can have very important consequences.

Reducing Prejudice by Bolstering the Self

Perspective taking reduces prejudice by changing the way people think about others. But can we also reduce prejudice by changing how people think about themselves? Because some prejudices result from people's deep-seated feelings of insecurity, when their feelings about themselves are bolstered, they often can become more tolerant and compassionate toward those who are different. You may recall a couple of theories suggesting that people take on negative attitudes toward others to protect their positive view of themselves (Fein & Spencer, 1997). For example, according to terror management theory (Solomon et al., 1991), encountering someone who holds a very different cultural worldview can threaten the belief system that upholds one's sense of personal value, which can increase fears about death. When people feel that their self-esteem is threatened, or when they are reminded of their mortality, they cling more tightly to their own worldview, which can mean derogating those with a different belief system. Therefore, one remedy for prejudice might be to bolster an individual's sense of self-esteem (Harmon-Jones et al., 1997; Schmeichel et al., 2009). Similarly, self-affirmation theory (Steele, 1988) also predicts that prejudice can be a defensive reaction to feelings of personal insecurity. In the Fein and Spencer (1997) study discussed in chapter 10, participants who received negative feedback were more likely to derogate a Jewish student. However, if participants first had the chance to think about how they lived up to their own values, they showed no such pattern of discrimination. Although bolstering a person's self-esteem can reduce prejudice, there is one caveat to this effect: If the value system being bolstered is the cultural worldview threatened by the outgroup, then the effects of self-affirmation can backfire (Arndt & Greenberg, 1999). For example, although you might be able to reduce antigay prejudice by affirming people's values and abilities in areas such as athletics or sense of humor, an affirmation of their traditional family values will do little to decrease this prejudice (Lehmiller et al., 2010; Vescio & Biernat, 2003).

Reducing Prejudice

Reducing prejudice essentially entails changing the values and beliefs by which people live. This is tricky for a number of reasons. One is that people's values and beliefs are often a long-standing basis of their psychological security. Another is that prejudice often serves specific psychological functions for people, such as allowing them to displace their hostile feelings or buttress their shaky self-esteem. A third difficulty arises because, once established, prejudiced views and stereotypes constitute schemas, and like other schemas, they tend to bias perceptions, attributions, and memories in ways that are self-perpetuating. Finally, people sometimes are not even aware of their prejudices and their influence. All these factors make prejudice difficult to combat. However, although there is no one-size-fits-all solution, a number of encouraging approaches are available. We will start from the top, so to speak, and examine how prejudice can be reduced at the societal or institutional level. Given that the effectiveness of institutional change sometimes hinges on people controlling their expressions of prejudice, we will turn next to whether and when people are able to effectively do so. Finally, we discuss how to go beyond controlling the expression of prejudice to actually change people's prejudiced attitudes and ease intergroup conflict.

Social Identity Threat

Research on stereotype threat reveals that it's mentally taxing to perform under the pressure of presumed incompetence. A more general version of this threat is called social identity threat, the feeling that your group is not valued in a domain and that you do not belong there (Steele et al., 2002). For example, women working in engineering report greater social identity threat and job burnout on days when they feel their male colleagues do not respect their contributions (Hall et al., 2019). To cope with social identity threat, people might find themselves trying to juggle their various identities. For example, women who go into male-dominated domains find themselves having to suppress their more feminine qualities (Pronin et al., 2004; von Hippel et al., 2011). A minority student who excels in academics can be accused of being an "Uncle Tom" or of "acting White" (Fordham & Ogbu, 1986). Older adults struggle to feel committed to their job when their age feels at odds with their identity as an employee (Manzi et al., 2019). Repeated exposure to stereotype threat and social identity threat can eventually lead to disidentification, which occurs when people no longer feel that their performance in a domain is an important part of themselves, and they stop caring about being successful (Steele, 1997). This can be a serious problem if, for example, minority children disidentify with school. In fact, being the target of negative stereotypes can steer people away from certain opportunities if those stereotypes lead them to assume they will experience a lack of fit and belonging (Aday & Schmader, 2019; Schmader & Sedikides, 2018). For example, women continue to be underrepresented in science, technology, engineering, and math, and this is particularly true in computer science, where the percentage of women has actually decreased over the past three decades. One factor is that students have a very specific stereotype of what a computer scientist is like, and women are much more likely than men to think that it doesn't describe them. In one study, women expressed far less interest in majoring in computer science when they completed a survey in a computer scientist's office filled with reminders of the computer-geek stereotype than did those who completed the same questionnaire in a room that did not reinforce the conventional stereotype of computer scientists (Cheryan et al., 2009, 2017). Other research has shown that girls experience greater feelings of fit in science when they interact with successful female role models in the field (O'Brien et al., 2017). The takeaway message is that the ability to identify with similar others plays an important role in attracting women and minorities to fields where they have been historically underrepresented.

What We've Learned from Measuring Implicit Bias

Research using the IAT has shown that although implicit bias has been trending down (Charlesworth & Banaji, 2019), as of 2015, 48% of White and 42% of Biracial adults showed at least a slight implicit bias toward Whites, whereas 45% of Black adults showed at least a slight bias toward Blacks (Morin, 2015). What is less clear is what these associations mean. Some researchers have criticized the measure for confounding the tendency to associate "Black" and "bad" with the tendency to associate "White" and "good" (Blanton & Jaccard, 2006; Blanton et al., 2006). However, other evidence suggests that IAT scores do reliably assess responses that are predictive of behavior (Greenwald, Smith et al., 2009; Greenwald et al., 2015; Kurdi et al., 2019; LeBel & Paunonen, 2011). Even if we grant that the IAT is a reliable measure, some dispute continues about what it taps into. For example, people actually show stronger racial biases when they know the measure is supposed to reveal their racial biases (Frantz et al., 2004). Anxiety about being labeled racist might actually make it more difficult for people to perform the task. In addition, some researchers have noted that an association of "Black" with "bad" could mean a variety of things, such as the acknowledgment that Blacks are mistreated and receive bad outcomes or simply cultural stereotypes that might have little to do with one's personal attitudes (Andreychik & Gill, 2012; Olson & Fazio, 2004). Other theorists suggest that implicit associations primarily tap into biases in the surrounding cultural context more than in the minds of individuals (Payne et al., 2017). Even if we set aside the debate about the IAT in particular, a broader pattern emerges from the literature examining both implicit and explicit measures of prejudiced attitudes. Most notably, although they can be correlated, they are often quite distinct. In other words, people who have an implicit negative attitude toward a group might still explicitly report having positive feelings. But even more interesting is that people's implicit attitudes seem to predict different kinds of behavior than their explicit attitudes. Explicit prejudice predicts overt or controllable expression of prejudice, whereas implicit prejudice better predicts subtler negative reactions to outgroup members. For example, when researchers have analyzed interracial interactions between strangers, they have found that Whites' explicit prejudice predicts what they say to a Black partner, but it's their implicit prejudice that predicts how they say it (Dovidio et al., 2002). What this means is that even when explicitly well-intentioned Whites might try to say the right thing, their body language may communicate discomfort and avoidance (see also Amodio & Devine, 2006; McConnell & Leibold, 2001).

Complexities of Modern Prejudice

Social psychologists have developed a number of related concepts to explain the subtler, more complex forms of prejudice that have emerged. Each in its own way emphasizes the need to understand how and why people might explicitly reject prejudiced attitudes but still harbor subtle biases. Here we focus on two: ambivalent racism and aversive racism.

Final Thoughts

Social psychology has taught us a lot about where prejudice comes from and how it is activated, and it has also shown us how it affects others and how it can be reduced. Yes, there is a long and varied list of cures, but that's because bias has many different causes and manifestations. Our interest in being egalitarian can motivate us to control our biases and become more tolerant of diversity in society (Verkuyten & Yogeeswaran, 2017). However, moving beyond a "live and let live" brand of tolerance to embracing the value of different viewpoints and perspectives may be the most effective way of achieving intergroup harmony. Embracing the value of diversity assumes that people want to achieve intergroup harmony. Broader changes in cultural norms can play a powerful role in helping people internalize these motivations. The more we see others behave and interact in an egalitarian way, the more we follow suit (MacInnis et al., 2017). Reducing prejudice doesn't happen overnight. All of us will suffer relapses on the way, but cultures can shift gradually toward equality. Reducing prejudice against a segment of the population can benefit everyone in the end. For example, cross-national data from the World Bank reveal a strong positive correlation between a country's economic prosperity and the degree to which its girls and boys have equivalent educational opportunities (Chen, 2004). We can all benefit from maximizing the well-being and opportunities of everyone in society.

The Ultimate Attribution Error

Stereotypes also bias how we explain and interpret events after they have played out. You may remember that we tend to make self-serving attributions for our own experiences: Good things happen because of us, and bad things happen because of the situation. We show a similar bias when we make attributions for fellow ingroup members and exactly the opposite tendency when explaining the behavior of outgroup members (Hilton & von Hippel, 1996). This is called the ultimate attribution error (Hewstone, 1990; Pettigrew, 1979). When an outgroup member does something negative, or when an ingroup member does something positive, this is consistent with our automatic preference for ingroups over outgroups (Perdue et al., 1990). We infer that it's the dispositional character of the groups that caused the behaviors: We do good things because we are good people. They do bad things because they are bad people. Of course, every now and again, we might be forced to admit that an outgroup member performed well or behaved admirably and an ingroup member performed or behaved poorly. But in such cases, the attribution veers toward the situation. Attributing negative outgroup behavior to the person but positive outgroup behavior to the situation reinforces negative stereotypes about the outgroup and belief in the superiority of the ingroup. Not surprisingly, this tendency is strongest among ingroup members who are highest in prejudice against the outgroup (e.g., Greenberg & Rosenfield, 1979). The ultimate attribution error has been applied primarily to ethnic prejudice, but stereotypes also influence how people make attributions for men's and women's behavior (e.g., Deaux, 1984). When men succeed on a stereotypically masculine task, observers tend to attribute that success to the men's dispositional ability, but when women perform well on the same task, observers tend to attribute that success to luck or effort. Likewise, men's failures on stereotypically masculine tasks are often attributed to bad luck and lack of effort, whereas women's failures on the same tasks are attributed to their lack of ability. In this research, both men and women often exhibit this pattern of attributions: Regardless of their gender, people tend to explain men's and women's behaviors in ways that fit culturally widespread stereotypes.

Compensating for Others' Biases

Targets of prejudice also cope with stigma by compensating for the negative stereotypes or attitudes they think other people have toward them. For example, when overweight women were making a first impression on a person and were led to believe that that person could see them (and thus knew their weight), they acted in a more extraverted way than if they were told that they could not be seen. They compensated for the weight-based biases they expected others to have by being extra-friendly. And it worked: Those who thought they were visible were rated as friendlier by the person with whom they were interacting (Miller et al., 1995). In a similar finding, Black college freshmen who expected others to have racial biases against them and their group reported spending more time disclosing information about themselves when talking with their White dormitory mates (Shelton et al., 2005). Self-disclosure is a powerful way of establishing trust and liking, so it is not surprising that Black participants who self-disclosed a great deal were liked more by their White roommates. Unfortunately, these kinds of compensation strategies can come with costs. Black students who reported engaging in a lot of self-disclosure with a White roommate also reported feeling inauthentic in this relationship. By trying to put their White roommates at ease, they might feel unable to be true to themselves. Another potential cost of compensation is that it can disrupt the smooth flow of social interaction as people work to manage the impressions they are making (Bergsieker et al., 2010; Shelton & Richeson, 2006). For people who belong to the more advantaged group, interactions with outgroup members can bring to mind concerns about appearing prejudiced and may lead them to increase their efforts to come across as likable and unbiased (Vorauer et al., 1998). People who belong to the disadvantaged group might be most concerned about being stereotyped as incompetent and compensate by trying to self-promote. The problem here is that interactions tend to go more smoothly when people's impression-management goals are matched. If one person cracks jokes to show how warm and likable she is while the other wants to have an intellectual conversation to bolster her perceived competence, each party might walk away from the interaction feeling misunderstood, disconnected from the other, and a bit cognitively exhausted (Richeson et al., 2003; Richeson & Shelton, 2003; Richeson & Trawalter, 2005).

stereotype threat

The concern that one might do something to confirm a negative stereotype about one's group either in one's own eyes or the eyes of someone else.

Where Do People's Stereotypic Beliefs Come From?

The cultural perspective suggests that we learn stereotypes over the course of socialization as they are transmitted by parents, friends, and the media. These stereotypes are often quite blatant in the media, but they may be represented subtly as well. For example, in American print ads, men tend to be positioned higher on the page, and this positioning contributes to perceiving men as more dominant than women (Lamer & Weisbuch, 2019). Even small children have been shown to grasp the prevailing stereotypes of their culture (e.g., Aboud, 1988; Williams et al., 1975). People who don't endorse or actively believe stereotypes about other ethnic groups can still report on what those cultural stereotypes are (Devine, 1989). So even if we try not to accept stereotypes ourselves, we are likely to learn cultural stereotypes through repeated exposure. For example, people who watch more news programming—which tends to overreport crime by minority perpetrators—are more likely to perceive Blacks and Latinos in stereotypic ways as poor and violent (Dixon, 2008a, 2008b; Mastro, 2003). This process of social learning explains how an individual picks up stereotypes both consciously and unconsciously. But how do these beliefs come to exist in a culture in the first place?

What's a Target to Do? Coping with Stereotyping, Prejudice, and Discrimination

The evidence we've reviewed on the effects of prejudice and stereotyping might lead us to expect targets of bias to feel rather lousy about themselves. Interestingly, a review of the literature revealed surprisingly little evidence that people stigmatized based on race, ethnicity, physical disability, or mental illness report lower levels of self-esteem than those who are not normally stigmatized (Crocker & Major, 1989). Even in the face of negative treatment and social devaluation, people can be remarkably resilient. Let's look at a few of the ways people cope with the daily jabs of stereotyping and prejudice, as well as the trade-offs these strategies can have.

Confirming Stereotypes to Get Along

The findings just discussed point to a powerful dilemma. Stereotypes are schemas. You'll remember from chapter 3 that schemas help social interactions run smoothly. People get along better when each individual confirms the other person's expectations. This suggests that the more motivated people are to be liked, the more they might behave in ways that are consistent with the other person's stereotypes, a form of self-stereotyping. In one study of self-stereotyping (Sinclair, Huntsinger et al., 2005), women had a casual conversation with a male student who they were led to believe had sexist or nonsexist attitudes toward women. In actuality, he was a member of the research team trained to act in a similar way with each woman and to rate his perceptions of her afterward. Those women who generally had a desire to get along with others and make new friends (i.e., they were high in affiliative motivation) rated themselves in more gender-stereotypic ways when interacting with the man they believed to be sexist, and as shown in FIGURE 11.1, he also rated their behavior as more stereotypically feminine. Women who were low in this general motivation to affiliate with others did just the opposite: If they thought their conversation partner would be sexist, they rated themselves as being more counterstereotypic, and the researcher also rated them as coming across in less stereotypical ways during their interaction. The motivation to get along can sometimes lead people to act in stereotypical ways.

justification suppression model

The idea that people endorse and freely express stereotypes in part to justify their own negative affective reactions to outgroup members.

ambivalent racism

The influence of two clashing sets of values on White Americans' racial attitudes: a belief in individualism and a belief in egalitarianism.

ambivalent sexism

The pairing of hostile beliefs about women with benevolent but patronizing beliefs about them.

master status

The perception that a person will be seen only in terms of a stigmatizing attribute rather than as the total self.

infrahumanization

The perception that outgroup members lack qualities viewed as being unique to human beings, such as language, rational intelligence, and complex social emotions.

Stereotypes Tend to Be Self-Confirming

The phenomena we've discussed are just a few of the many ways in which stereotypes systematically bias how we think and make judgments about other individuals and groups. A harmful consequence of this influence is that stereotypes reinforce themselves, which makes them relatively impervious to change (Darley & Gross, 1983; Fiske & Taylor, 2008; Rothbart, 1981). Stereotypes lead us to attend to information that fits those stereotypes and to ignore information that does not. When we do observe behaviors that are inconsistent with our stereotypes, we tend to explain them away as isolated instances or exceptions to the rule (Allport, 1954). Because stereotypes can be activated unconsciously, people may not even be aware that stereotypes are biasing what they perceive. Instead, they believe that their reactions to and interpretations of stereotyped individuals are free of prejudice because they assume that they are looking at the world objectively. When it comes to stereotypes, believing is seeing.

Prejudice Isn't Always Easily Controlled

The research just described sounds pretty encouraging, but marshalling resources for mental control takes effort and energy. As a result, people face a few limitations when they attempt to control their biases. The first limitation is that sometimes people make judgments of others when they are already aroused or upset. In these situations, cognitive control is impaired, so people likely will fall back on their prejudices and stereotypes. Consider, for example, a study in which White participants were asked to deliver shocks (that were not actually administered) to a White or Black confederate under the pretext of a behavior-modification study (Rogers & Prentice-Dunn, 1981). Half the White participants were angry about an overheard insult directed toward them by the confederate. When not angered, the White participants actually chose a less severe shock for the Black confederate than they did for the White confederate. However, after the White participants were angered, they shocked the Black confederate more strongly than they shocked his White counterpart. The arousal and negative emotion caused the participants to regress to gut-level negative attitudes. People also can have difficulty regulating their automatically activated thoughts when they are pressed for time, distracted, or otherwise cognitively busy. Teachers are more likely to be biased in their evaluations of students if they have to grade essays under time pressure. If instead they have ample time to make their judgments, they are better able to set aside their biases to provide fairer assessments of students' work (Kruglanski & Freund, 1983). People are also more capable of setting aside biases when they are most cognitively alert. This fact leads to the idea that a tendency to stereotype might be affected by circadian rhythms, the individual differences in daily cycles of mental alertness that make some people rise bright and early and make others night owls. In a study of how circadian rhythms can affect jury decision making, Bodenhausen (1990) recruited participants to play the roles of jurors in an ambiguous case where the offense either was or was not stereotypical of the defendant's group (e.g., a student athlete accused of cheating on an exam). Did participants allow their stereotypes of the defendant to sway their verdicts? Not if they were participating in the study during their optimal time of day. But if morning people were participating in the evening or evening people were participating early in the morning, their verdicts were strongly colored by stereotypes.

Coping with Prejudice and Discrimination: Psychological Strategies

The social strategies discussed above offer examples of how those who are stigmatized can manage their interpersonal interactions in ways that minimize their experience of bias and discrimination. But in addition to directly altering interpersonal interactions, people also rely on a host of psychological strategies that can help them remain resilient in the face of social devaluation.

The Stereotype Content Model

The stereotype content model posits that stereotypes develop on the basis of how groups relate to one another along two basic dimensions (Fiske et al., 2002). The first is status: Is the group perceived as having low or high status in society relative to other groups? The second is cooperation, in a very broad sense that encompasses likability: Is the group perceived to have a cooperative/helpful or a competitive/harmful relationship with other groups in that society? The answers to these questions lead to predictions about the traits that are likely to be ascribed to the group. Higher status brings assumptions about competence, prestige, and power, whereas lower status leads to stereotypes of incompetence and laziness. Groups that are seen as cooperative/helpful within the society are seen as warm and trustworthy, whereas groups that are viewed as competitive/harmful within the larger society are seen as cold and conniving (Cuddy et al., 2008). These two dimensions of evaluation, warmth and competence, have long been acknowledged to be fundamental to how we view others. When we consider that these dimensions are largely independent, we see that stereotypes can cluster in one of four quadrants of a warmth-by-competence space (FIGURE 10.10). People have different emotional reactions to groups whose stereotypes fit into one of these quadrants (Cuddy, Fiske, & Glick, 2007). Groups that are stereotyped as personally warm but incompetent (e.g., elderly people, physically disabled people) elicit pity and sympathy. Groups perceived as low in warmth but high in competence (e.g., rich people, Asians, Jews, minority professionals) elicit envy and jealousy. Groups stereotyped in purely positive terms as both warm and competent tend to be ingroups or groups that are seen as the cultural norm in a society. To the degree that these groups are valued, they generally elicit pride and admiration. Finally, groups stereotyped in purely negative terms as both cold and incompetent (e.g., homeless people, drug addicts, welfare recipients) elicit disgust and scorn. Some researchers have argued that this model is too simplistic because it fails to consider another fundamental dimension of stereotyping: perceptions of morality (Leach et al., 2007). In addition, stereotypes are often more complex than the model implies; for example, a group may be stereotyped as high in competence in some domains (e.g., sports) and low in others (e.g., academics).
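Because the model is essentially a two-dimensional mapping, its predicted quadrants and emotional reactions can be summarized compactly. The sketch below is a toy illustration only: the 1-to-5 scale, the rating values, and the simple midpoint cutoff are hypothetical choices made for demonstration, not part of the model or of any published study, which work from participants' actual scale ratings rather than hard-coded numbers.

```python
# A toy illustration of the stereotype content model's warmth-by-competence space.
# Ratings, the 1-5 scale, and the midpoint cutoff are hypothetical assumptions.
from typing import Tuple

def scm_quadrant(warmth: float, competence: float, midpoint: float = 3.0) -> Tuple[str, str]:
    """Map a (warmth, competence) rating pair to the model's quadrant and the
    emotional reaction the model associates with that quadrant."""
    warm = warmth >= midpoint
    competent = competence >= midpoint
    if warm and competent:
        return "high warmth / high competence", "pride and admiration"
    if warm and not competent:
        return "high warmth / low competence", "pity and sympathy"
    if not warm and competent:
        return "low warmth / high competence", "envy and jealousy"
    return "low warmth / low competence", "disgust and scorn"

# Hypothetical ratings on a 1-5 scale, chosen only to mirror the examples in the text.
for group, (w, c) in {"ingroup": (4.5, 4.3), "elderly people": (4.1, 2.2),
                      "rich people": (2.1, 4.4), "homeless people": (1.8, 1.9)}.items():
    quadrant, emotion = scm_quadrant(w, c)
    print(f"{group}: {quadrant} -> predicted reaction: {emotion}")
```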

shooter bias

The tendency to mistakenly see objects in the hands of Black men as guns.

outgroup homogeneity effect

The tendency to view individuals in outgroups as being more similar to each other than they really are.

Symbolic Racism

The theory of symbolic racism (Sears & Kinder, 1971) posits that the tendency to reject groups that don't conform to one's own view of the world underlies much of the racial prejudice that European Americans have against African Americans. From this perspective, many European Americans have internalized traditional, conservative Eurocentric moral values and view African Americans as a threat to the American way of life. People who exhibit signs of symbolic racism don't think they are prejudiced toward outgroups. Rather, their negative attitudes toward these groups are expressed symbolically as opposition to policies that are seen as giving advantages to minority groups. They might deny that minorities continue to face discrimination and believe that racial disparities result from the unwillingness of people in minority groups to work hard enough. Those who are high in symbolic racism feel justified in opposing social programs that rectify social inequalities and supporting those that might curtail civil liberties of certain groups. The theory of symbolic racism has shown how prejudice is symbolically represented in diverging political opinions (Sears & Henry, 2005). In these studies, those who score high on measures of symbolic racism are more likely to support punitive anticrime policies that discriminate against minority groups (e.g., the death penalty or "three strikes and you're out" laws; Green et al., 2006).

right-wing authoritarianism (RWA)

An ideology which holds that the social world is inherently dangerous and that maintaining security requires upholding society's order and tradition. It predicts prejudice against groups seen as socially deviant or dangerous.

Has Prejudice Become Less Prevalent over Time?

Dial back time to 1954. The U.S. Supreme Court had just announced the historic decision Brown v. Board of Education, which struck down state laws enforcing racial segregation in the public schools. The Court ruled that "separate but equal" schools for Black and White students were inherently unequal. The ruling was met with stark and at times violent opposition in a number of states. In the decades since then, Americans have enacted antidiscrimination laws and elected (2008) and reelected (2012) an African American man as president of the United States. In some respects at least, the United States has made tremendous strides in race relations. In 1958, 94% of Americans surveyed opposed interracial marriage, but that number dropped to 17% by 2007 (FIGURE 10.4a; Carroll, 2007), although, unfortunately, some White Americans still are prone to negative reactions to interracial couples (Skinner & Hudac, 2017). An analysis of millions of American Internet respondents found that explicit and implicit attitudes regarding race became less negative between 2007 and 2016 (FIGURE 10.4b; Charlesworth & Banaji, 2019). Similar changes can be seen with other prejudices. Countries such as Germany and Great Britain have elected female leaders over the past few decades. In 2016—almost 100 years after the 19th Amendment (1920) guaranteed women the right to vote—a major political party in the United States nominated a female presidential candidate, Hillary Clinton. The #MeToo movement, which is associated with bringing to justice powerful and wealthy men who have harassed and abused women, has also brought much-needed progress in fighting such mistreatment (Bennett, 2020; MacKinnon, 2019). In addition, a recent study of questionnaire responses from more than 15,000 New Zealanders found that sexism declined from 2009 to 2016 (Huang et al., 2019). What about prejudice against LGBTQ people? In 2009, the United States added perceived gender, gender identity, sexual orientation, and disability to the federal definition of hate crimes through the Matthew Shepard Act. In 2010, the U.S. Senate voted to repeal the "Don't Ask, Don't Tell" policy, lifting the ban on openly gay men and women serving in the military. In 2013, the U.S. Supreme Court decided a landmark case that opened the door for same-sex couples to qualify for federal benefits previously extended only to heterosexual couples (U.S. v. Windsor, 2013). In 2001, 43% of Americans supported same-sex marriage, but in 2019, 61% said they supported it (FIGURE 10.5a; Masci et al., 2019). And from 2007 to 2016, American Internet respondents exhibited a reduction in explicit and implicit bias against gays and lesbians (FIGURE 10.5b; Charlesworth & Banaji, 2019). Evidence suggests that the legalization of same-sex marriage in various states may have contributed to these reductions in antigay bias (Ofosu et al., 2019). There have also been signs of progress toward acceptance of transgender individuals. In 2014, for the first time, an openly transgender person, Laverne Cox, was nominated for an Emmy for her performance in the television show Orange Is the New Black. Perhaps because of this relative progress, many Americans believe that prejudice, especially against Blacks in the United States, is a thing of the past (Norton & Sommers, 2011). Blacks, however, are far from convinced.
In a Pew survey, nearly half of Blacks surveyed said that others had treated them with suspicion or as if they were unintelligent, compared with only about 10% of Whites who reported having had this experience. Blacks were also six times as likely as Whites to report having been stopped unfairly by police (Pew Research Center, 2016). So reports of the death of prejudice have been greatly exaggerated. Let's consider some of the evidence. Even though segregation is against the law, we still live in a society that is quite segregated. While attitudes toward some groups have become more favorable over time, social and political contexts can bring about new hostilities. Whereas Jews were the most salient target of religious prejudice in the United States when Allport wrote his book in 1954, in 2010 "only" about 15% of Americans surveyed reported having even a little prejudice against Jews, compared with nearly 43% who reported having at least some prejudice toward Muslims (Gallup Center for Muslim Studies, 2010). Anti-Semitism is far from gone, however; in New York City, 234 hate crimes against Jews were reported in 2019—more than a 25% increase over the prior year (Frehse, 2020)—and anti-Semitism is currently quite prevalent in many European and Asian countries (Baum et al., 2016). Although overt expressions of discrimination and racial injustice are certainly declining, they are far from absent. Since the fall of 2014, protests have been cropping up throughout the United States in response to police killings of African Americans that have been viewed as outrageous and unwarranted. The Black Lives Matter movement aims to actively assert something that shouldn't need to be asserted. Yet it still does. On May 25, 2020, a video widely viewed on YouTube showed a White Minneapolis police officer with his knee on the neck of handcuffed African American George Floyd for over eight minutes, resulting in his death. During this time, Mr. Floyd repeatedly said that he could not breathe. Three other officers were present and did nothing to stop it. This killing sparked massive protests in the U.S. and around the world, with some in the U.S. becoming violent. A few days after the incident, the officer who killed Mr. Floyd was charged with murder (Iati et al., 2020). Beyond such tragic overt acts, there is ample evidence of subtler forms of discrimination that are harder to see. Beginning in the late 1950s, the civil rights campaign brought to public awareness the problem of institutional discrimination, unfair restrictions on the opportunities of certain groups of people by institutional policies, structural power relations, and formal laws (e.g., a height requirement for employment as a police officer that excludes most women). This form of discrimination has been so deeply embedded in the fabric of American society that it has often taken place without people even being aware that institutional practices had discriminatory effects (Pettigrew, 1958). At a broader cultural level, societies have assigned less economic value to occupations traditionally held by women (e.g., nurse, teacher, administrative assistant), with the net result that women earn less than their male contemporaries (Alksnis et al., 2008). Particularly in higher-paying jobs (e.g., hospital administrator), equally qualified women may earn only about 79 cents for every dollar that men earn (Eagly & Carli, 2007; Semega, 2009). Are women paid less than men for the same jobs? Sometimes that is the case.
But even when it is not, keep in mind that women are more likely to be represented in jobs that have lower earning potential (FIGURE 10.6). Further evidence of this largely unnoticed tendency to value males more highly comes from a recent study of more than 600,000 social media posts from St. Petersburg, Russia: Parents mentioned their sons more often than their daughters, and posts about sons received more likes than posts about daughters (Sivak & Smirnov, 2019). Clear signs of racial discrimination can also be found in everything from employment to housing, credit markets, the justice system, and consumer pricing (Pager & Shepherd, 2008). For example, the way in which lawyers select or exclude jury members can lead to juries that are biased against a Black defendant (Morrison et al., 2016). Once convicted, the more "Afrocentric" a Black man's facial features are—in other words, the more he looks like Whites' stereotypes of Blacks—the harsher his prison sentence for the same crime tends to be (Blair et al., 2004; Kleider-Offutt et al., 2017) and the more likely he is to receive the death penalty for a capital offense when the victim was White (Eberhardt et al., 2006). These findings indicate that in the contemporary world, we often see these subtler—or what are termed modern—forms of prejudice that disadvantage minorities. Because of America's sordid history with slavery and explicit discrimination against African Americans, the study of modern forms of prejudice in the United States has focused largely on racial prejudice.

The Roots of Prejudice: Three Basic Causes

Given all the harm that has come from prejudice, stereotypes, and discrimination, why are these phenomena so prevalent? This is one of the central questions that Gordon Allport addressed in his classic book The Nature of Prejudice (1954). Allport proposed three basic causes of prejudice, each of which is an unfortunate consequence of some very basic aspects of human thought and feeling.

SOCIAL PSYCH OUT IN THE WORLD Do Americans Live in a Postracial World?

History was made in 2008, when the United States elected its first Black president. Less than 50 years after Martin Luther King Jr. spoke of his dream that his children would be judged not by the color of their skin but by the content of their character, this dream seemed much closer to reality. With a multiracial president having served two terms in the White House, many Americans began to wonder: Do we now finally live in a postracial world? Probably not. Granted, the research we've reviewed in this chapter demonstrates that racial prejudice has changed considerably over time. The election of Barack Obama certainly signaled more positive attitudes toward African Americans. However, we've also learned in this chapter that in the contemporary world, prejudice often is ambivalent and manifests in subtle ways. Elections might be times when people try to set aside their biases and weigh the merits of different candidates. However, among undecided voters, implicit biases seem to play a stronger role in predicting decisions at the polls (Galdi et al., 2008; see also Greenwald, Smith, et al., 2009). Such findings suggest that negative biases still lie beneath the surface of people's consciously held values, beliefs, and intentions. On the other hand, Barack Obama's presidency meant that every American citizen had a highly visible exemplar of a successful Black political leader. In this way, he may have tilted Americans' implicit associations of Blacks in a more positive direction. Indeed, there is some evidence that the election of Obama and exposure to his campaigns helped to reduce people's implicit racial bias, in part by providing a positive example of an African American that may counter many of the negative stereotypes that are so pervasive in mass media (Columb & Plant, 2011; Plant et al., 2009). When President Obama was the example that people brought to mind when they thought of Black people, they were less likely to be racially prejudiced. However, we must be careful not to look at data like these and conclude that nothing more needs to be done to rectify racial inequality. There are signs that the election of Obama in fact fostered a belief that Americans had achieved racial equality (Kaiser et al., 2009). Such a perception might allow people to justify keeping the status quo and not trying to change the disparities that do exist. Obama's election also may have given Whites who are high in prejudice the moral credentials, so to speak, to conclude that enough has been done to improve racial equality and therefore to feel licensed to show stronger favoritism toward Whites (Effron et al., 2009). For example, in one study (Effron et al., 2009), participants who varied in their level of racial prejudice indicated whether they would vote for Obama or McCain in the 2008 election or indicated whether they would have voted for Bush or Kerry in 2004. (This condition was included to control for priming political orientation.) Subsequently, participants imagined that they were on a community committee with a budget surplus that could be allocated to two community organizations: one that primarily served a White neighborhood and one that primarily served a Black neighborhood. Among participants who indicated that they would vote for Obama, those higher in prejudice then allocated significantly less money to the organization that would serve the Black neighborhood and more money to the one that would serve the White neighborhood.
For people with strong racial biases, acknowledging and endorsing the success of a single outgroup member seems to come at a cost to broader policies that could benefit more people. The visible success of one person does not imply the success of the group as a whole. Clearly, more work needs to be done to fully realize Dr. King's dream.

SOCIAL PSYCH OUT IN THE WORLD One Family's Experience of Religious Prejudice

In this chapter, we are considering the scholarly evidence on how people experience, cope with, and try to deflect discrimination. But personal experience with prejudice can cut very deep. Let's examine prejudice from the perspective of one family's account, told as part of the radio program This American Life (Spiegel, 2006, 2011). We begin with a love story in the West Bank in the Middle East. A young Muslim American woman named Serry met and fell in love with a Muslim man from the West Bank. As they got to know one another, he told her how difficult it was for him and everyone he knew to grow up in the middle of the deep religious and political conflict between Israel and the West Bank. So when they decided to marry and make a life together, she convinced him that their children would have a better life in the United States, a country where she spent a much happier childhood and where people from different religious backgrounds easily formed friendships. They settled down in the suburbs of New York City, had five children, and became a very typical American family. But when terrorists attacked the World Trade Center and the Pentagon on September 11, 2001, their lives changed forever. Like everyone around them, they were horrified and deeply saddened by what had happened. But their friends, neighbors, and even strangers on the street began to treat them differently. Drivers would give Serry the finger, and someone put a note on her minivan, telling her family to leave the country. The situation escalated when their fourth-grade daughter came home from school in tears on the one-year anniversary of 9/11, after the school district presented a lesson for all fourth-graders, explaining that 9/11 happened because Muslims hate Christians and also hate Americans. From that day on, their once-popular daughter was the target of taunting and bullying by other kids. The situation grew worse when her teacher told the class that non-Christians and nonbelievers would burn in hell. Her nine-year-old classmates began calling her "Loser Muslim" after her teacher said that she should be transferred to another classroom. Soon her younger siblings were targeted by bullying, too. Eventually even her best friend turned her back on her. This heart-wrenching story reveals how prejudice can flare up when people feel that their worldview has been threatened. As we discussed in chapter 10, because the events of 9/11 were viewed as an attack on American values by Islamic extremists, the attacks led some Americans to view all Muslims with hate and suspicion—even those with whom they had previously been friendly. But this story also reveals how different people in the same family can respond very differently to others' prejudice. The oldest daughter's response was to renounce her religion, to try to escape that part of her identity that her peers and her teacher so clearly devalued. When she moved to a new school, she chose to conceal her religious background to try to avoid further discrimination. For Serry, the mother of the family, her religion was deeply important to her, but being American was even more central to her identity. She was shocked and saddened to find that she was no longer viewed as an American, but she still believed that American values of freedom would win out in the end. As Serry explained, "I was born and raised in this country, and I'm aware of what makes this country great, and I know that what happened to our family, it doesn't speak to American values. And I feel like this is such a fluke. 
I have to believe this is not what America is about. I know that." In line with system justification theory, her belief in American values led her to minimize these events as aberrations. Serry's husband found his vision of America as a land free of religious prejudice shattered. Like every other immigrant before him in the history of the United States, he had traveled to a new and different culture in the hope of making a better life for himself and his family. Once a very happy man with a quick sense of humor, he slipped into depression and eventually decided to return to the West Bank, where he died a few years later. Not much is said about his death, so it's not known how his experience with anti-Islamic prejudice might have eroded his health. But his choice was to return to his homeland, a place that is far from free of religious intolerance and discrimination but where at least he could live among others who share the same stigmatized identity. Consistent with rejection identification theory, his identification as a Muslim from the West Bank seemed to offer him a source of psychological security.

Coping with Prejudice and Discrimination: Social Strategies

Just as there are a number of ways to counter the effects of stereotype threat, there are also a number of behavioral response options for dealing with interpersonal encounters with prejudice.

Discrimination

Negative behavior toward an individual solely on the basis of that person's membership in a particular group.

Setting the Stage for Positive Change: The Contact Hypothesis

One strategy that seems to be an intuitive way to foster more positive intergroup attitudes is to encourage people actually to interact with those who are the targets of their prejudice. In the late 1940s and the 1950s, as American society started to break down barriers of racial segregation, some interesting effects on racial prejudice were observed. For example, the more White and Black merchant marines served together in racially mixed crews, the more positive their racial attitudes became (Brophy, 1946). Such observations suggest that if people of different groups interact, prejudice should be reduced. There is certainly some truth to this. Research on the mere exposure effect (see chapters 8 and 14) shows that familiarity does increase liking, all other things being equal. The problem with this strategy is that only rarely are all other things equal! If you look around the world and back in history, you quickly notice countless examples of people of different groups having extensive contact—yet their prejudices remain and even intensify. For example, recent research finds that in states where a high proportion of residents are Black, both White and Black participants have a stronger tendency to favor their own racial group over the other, as measured by an implicit association test (IAT; Rae et al., 2015; see FIGURE 11.6). Why did interracial contact in the merchant marines reduce prejudice, whereas other forms of contact did not? In considering such questions, Allport (1954) proposed that contact between groups can reduce prejudice only if it occurs under optimal conditions. According to Allport's original recipe, four principal ingredients are necessary for positive intergroup contact: (1) equal status between the groups in the situation; (2) contact that is intimate and varied, allowing people to get acquainted; (3) contact involving intergroup cooperation toward a superordinate goal—that is, a goal that is beyond the ability of any one group to achieve on its own; and (4) institutional support, meaning contact that is approved by authority, law, or custom. In the time since Allport laid out this recipe for reducing prejudice, hundreds of studies with thousands of participants have examined whether intergroup contact that meets these requirements can reduce prejudices based on such distinctions as race and ethnicity, sexual orientation, age, and physical and mental disabilities. These studies range from archival studies of historical situations to controlled interventions that manipulate features of the contact setting. Despite the diversity of methodologies, research generally finds that the more closely the contact meets Allport's requirements, the more effectively it reduces a majority group's prejudice against minorities (Pettigrew & Tropp, 2006; Tropp & Pettigrew, 2005).

Stereotype Threat

Self-fulfilling prophecies and self-stereotyping are examples of how stereotypes affect the behavior of members of stereotyped groups during social interactions. Other research shows that even when a person is not interacting with someone, the immediate context can bring to mind stereotypes that can interfere with a person's ability to perform at his or her best. This was the discovery made by the Stanford researchers Claude Steele and Joshua Aronson (1995) when they conducted pioneering work on what they called stereotype threat, a phenomenon you were first introduced to in chapter 1, when we covered research methods. Stereotype threat is the concern that one might do something to confirm a negative stereotype about one's group either in one's own eyes or in the eyes of someone else. Although this phenomenon has far-reaching consequences for a variety of situations, it has been studied primarily as an explanation for racial and ethnic differences in academic performance and for gender differences in standardized math test scores. Other explanations for these performance gaps have focused on whether nature (genetics, hormones, even brain size) or nurture (upbringing, educational values, access to educational resources) offers the better account (Nisbett, 2009). Research on stereotype threat takes a distinctly social psychological view, indicating that performance can be influenced by aspects of the situation, such as the person's experience of the classroom in which he or she is taking a test. In one of their original studies, Steele and Aronson (1995) gave Black and White undergraduates a challenging set of verbal problems to solve. For half of the sample, the problems were described as a diagnostic test of verbal intelligence (similar to the SAT or GRE). For the other half, the same problems were described as a simple lab exercise. Although White students were unaffected by how the task was described, Black students performed significantly worse when the task was presented as a diagnostic test of intelligence (see FIGURE 11.2). When Black students were reminded of the stereotype that their group is intellectually inferior, they performed more poorly on the test. In addition to undermining performance on tests of math, verbal, or general intellectual ability of minorities, women, and those of lower socioeconomic status (Croizet & Claire, 1998), stereotype threat has also been shown to impair memory performance of older adults (Chasteen et al., 2005); driving performance of women (Yeung & von Hippel, 2008); athletes' performance in the face of racial stereotypes (Stone et al., 2012); men's performance on an emotional sensitivity task (Leyens et al., 2000); and women's negotiation skills (Kray et al., 2001). As mentioned in chapter 1, meta-analyses suggest that these effects are small to medium in size (Armstrong et al., 2017; Doyle & Voyer, 2016; Gentile et al., 2018; Nadler & Clark, 2011; Picho et al., 2013). Theoretically, stereotype threat is thought to impair performance under some conditions more than others (Schmader et al., 2008). The effect is strongest when (1) the stigmatized identity is salient, either because of the situation (e.g., being the only woman in a high-level math class) or because of stigma consciousness or group identification; (2) the task is characterized as a diagnostic measure of an ability for which one's group is stereotyped as being inferior (as in Steele & Aronson, 1995);
(3) individuals are led to believe that their performance is going to be compared with that of members of the group stereotyped as superior on the task; and (4) individuals are aware of the stereotype and are concerned that others (or even they themselves) might believe it to be true. Researchers also have learned a great deal about the processes that contribute to the deleterious effects of stereotype threat. First, it's important to point out that those who care the most about being successful feel stereotype threat most acutely (Steele, 1997). In fact, it's partly because people are trying so hard to prove the stereotype wrong that their performance suffers (Jamieson & Harkins, 2007). When situations bring these stereotypes to mind, anxious thoughts and feelings of self-doubt are more likely to creep in (Bosson et al., 2004; Cadinu et al., 2005; Johns et al., 2008; Spencer et al., 1999). Efforts to push these thoughts away and to stay focused on the task can hijack the very same cognitive resources that people need to do well on tests and in other academic pursuits (Johns et al., 2008; Logel et al., 2009; Schmader et al., 2008). For other kinds of activities (e.g., trying to sink a golf putt, shoot a basket, or parallel park), becoming proficient means relying on skills that have become automatic over hours or even years of practice. When the situation reminds people of a negative group stereotype about those activities, they end up scrutinizing the behaviors that they normally do automatically; as a result, they trip themselves up (Schmader & Beilock, 2011).

2. Stereotypes Justify Prejudice and Discrimination

Stereotypes aren't mere by-products of our limited cognitive capacities. People also are sometimes motivated to hang on to beliefs to justify their prejudices. One example is that once a country has declared war on another nation, stereotypes of that nation become more negative. In addition, encountering members of outgroups sometimes automatically elicits potent negative feelings, such as fear and disgust (e.g., Esses et al., 1993). People may generate a negative stereotype of a group to justify their feelings. According to the justification suppression model of prejudice expression (Crandall & Eshleman, 2003), stereotypes can provide people with supposedly acceptable explanations for having negative feelings about a group. If, for example, a person stereotypes all Hispanics as aggressive, then he can justify why he feels frightened around Hispanics. From this perspective, the negative feelings sometimes come first, and the stereotypes make those feelings seem acceptable—or even rational. To test this idea, Chris Crandall and colleagues (2011) set up a situation in which they induced some people to have a negative feeling toward a group prior to forming a stereotype about that group. They did this by repeatedly pairing a group that participants knew nothing about—people from the country Eritrea—with unrelated negative words or images (e.g., sad faces) to create an implicit negative reaction to the group. In this way, half of the participants developed a negative affective association toward Eritreans, whereas those in a control group did not. Afterward, participants were given a list of traits, such as dangerous, violent, and unfriendly, and asked to indicate whether those traits were descriptive of people from Eritrea. Participants trained to have a negative affective reaction toward Eritreans were more likely than those in the control condition to stereotype Eritreans as cold and threatening. After all, if the people of Eritrea are perceived as cold and threatening, then one's negative feelings suddenly seem justified.

disidentification

The process of disinvesting in any area in which one's group traditionally has been underrepresented or negatively stereotyped.

Ingroup Bias: We Like Us Better Than Them

The second cause of prejudice, according to Allport, is the tendency to prefer what is familiar over what is not. As the mere exposure effect discussed in chapter 8 shows, the more familiar we are with a stimulus, the more we like it. We like—indeed, usually love—our own families, our own towns, our own stuff, and our own group. In contrast, outgroups are less familiar, stranger, less known. They make us feel uneasy, anxious. They are harder to predict and understand. Taking an evolutionary perspective, some psychologists have argued that a preference for familiar others is probably something adaptive that has been selected for (e.g., Park et al., 2003). Our ancestors, living in small groups, were probably safer if they stayed close to their own. If they ventured away from their own group and encountered other groups, they may have experienced peril, including exposure to germs. In fact, when thoughts of disease are made salient, people become particularly negative toward ethnically different others (Faulkner et al., 2004). A recent study of identical twins suggested that how prone an individual is to favor their ingroup is partly genetically determined (Lewis & Bates, 2017). Allport noted that because of common backgrounds, it's also just easier to know what to say and how to behave around those who are members of the ingroup. In addition to this familiarity-based preference for the ingroup over outgroups, most of us like ourselves and demonstrate a self-serving bias, as you'll recall from our coverage of self-esteem (chapter 6). So if I am great, then my group must be great also. Surely groups I am not a member of can't be as great as those to which I belong! Indeed, research has shown that ingroup pronouns such as us are associated automatically with positive feelings and that outgroup pronouns such as them are associated automatically with negative ones (Perdue et al., 1990). So pride in one's own group and preference for one's own group over others may be a natural extension of self-serving bias. This ingroup bias can affect even political beliefs (Kosloff, Greenberg, Dechesne et al., 2010). In the lead-up to the 2008 U.S. presidential election, when undecided White voters were reminded of their race, they were more likely to believe negative rumors about the African American Democratic candidate, Barack Obama. Similarly, when undecided young voters were reminded of their age, they were more likely to believe negative rumors about the 65-year-old Republican candidate at the time, John McCain. Social identity theory (see chapter 9) (Tajfel & Turner, 1986) looks at the relationship between self-esteem and groups the other way around, reversing the causal direction. This theory proposes that a considerable portion of our self-esteem actually derives from our group memberships. Not only is my group great because I'm in it, but I am great because I am in this group! So I gain self-esteem by thinking highly of my own group and less highly of outgroups. And sure enough, wherever you travel, you meet people who are proud of their own cultures and ethnicities and think more highly of them than they do of other cultures and ethnicities. A large body of experimental research supports the existence of ingroup bias and the validity of social identity theory. One important line of inquiry has examined whether arbitrarily formed groups immediately exhibit ingroup bias. 
This idea was anticipated in Jonathan Swift's (1726/2001) classic satire Gulliver's Travels, which describes wars breaking out between those who believe eggs should be cracked at the big end and those who believe they should be cracked at the small end. Henri Tajfel and colleagues demonstrated this phenomenon in a seminal study in which high school students were asked to estimate how many dots were displayed on a screen (Tajfel et al., 1971). The researchers told one random set of students that they were "overestimators" and the other set that they were "underestimators." Even in such minimal newly formed groups, researchers found bias in favor of distributing more resources to members of one's own group than to the outgroup (Tajfel & Turner, 1986). We should note, however, that recent replication efforts (Kerr et al., 2018) have shown that this bias may not even occur in some cases: if it is clear to the participants that the groups were formed randomly; if people make their resource allocations in private rather than in the presence of their group members; and in collectivist cultures, like Japan, as well as in cultures that highly value equality, like Australia. Theory and research also suggest that in most cases, the liking for the ingroup is stronger and more fundamental than the disliking of the outgroup (e.g., Allport, 1954; Brewer, 1979). Allport noted that in many contexts, people are very accepting of bias in favor of their own children and families and of pride in their own nations. However, this "love prejudice" often has negative consequences for outgroups. An African American woman who is having trouble finding employment would feel little comfort in knowing that it's not so much that White employers are biased against African Americans but just that they prefer to hire their "own kind." In addition, if we view an outgroup as threatening our beloved ingroup, our ingroup love can fuel outgroup hate. A second important line of research has tested the prediction from social identity theory that ingroup bias serves self-esteem needs. From a social identity perspective, people should be especially likely to laud their own group and derogate outgroups after a threat to their personal self-esteem. In a series of studies (FIGURE 10.2), Fein and Spencer (1997) gave non-Jewish American participants positive or negative feedback on a test of social and verbal skills and then had them evaluate a woman after seeing a résumé and a videotape. For half the participants, the job candidate was depicted as Italian American; for the other half, she was depicted as Jewish American. Participants given self-esteem-threatening negative feedback rated the woman more negatively if they thought she was Jewish. In addition, participants given negative feedback and who had the opportunity to derogate the Jewish American woman showed an increase in self-esteem. And the more negatively they evaluated the Jewish American woman, the more their self-esteem increased. Subsequent studies have provided further support for the role of self-esteem threat in prejudice and stereotyping, showing, for example, that threatening Whites' self-esteem brings negative stereotypes of African Americans and Asian Americans closer to mind (Spencer et al., 1998). When people feel bad about themselves, they seem to compensate through downward comparison by thinking more harshly of outgroups. 
Another example of this kind of self-esteem-protecting prejudice is scapegoating, a phenomenon whereby people who feel inferior, guilty, anxious, or unsuccessful blame an outgroup for their troubles (Allport, 1954; Jung, 1945/1970; Miller & Bugelski, 1948). Captain Ahab blamed the white whale, the Nazis blamed the Jews, unsuccessful North Americans blame immigrants. Experiments have shown that reminding people of the threat of natural disasters leads them to view outgroups as enemies with evil intentions (Landau et al., 2012; Sullivan et al., 2010). This tendency toward scapegoating provides someone to blame for one's own problems and increases a sense of control over one's life (Rothschild et al., 2012).

Implicit Prejudice

The term implicit prejudice refers to negative attitudes toward a group of people for which the individual has little or no conscious awareness. Some people may choose not to admit their prejudices, whereas others may not be aware of them. Measures of implicit prejudice tap into attitudes that lie beneath the surface of what people report (Nosek et al., 2011). And, indeed, whereas a majority of White Americans don't report being prejudiced on explicit measures, most do show signs of having biases when their attitudes are assessed implicitly with either cognitive measures of implicit associations or physiological measures of affective responding (Cunningham, Johnson et al., 2004; Dovidio et al., 2001; Hofmann et al., 2005; Mendes et al., 2002).

Ethnocentrism, the Cultural Worldview, and Threat

The third basic cause of prejudice identified by Allport stems from the fact that each of us is raised within a particular cultural worldview and therefore has specific ideas about what is good and what is bad. If the worldview that we internalize from childhood explicitly portrays particular groups negatively, we will likely follow suit. So simply conforming to the norms and values of one's worldview can lead to prejudice. Supporting this idea, researchers have found that in places where prejudice is normative, such as the Deep South of the United States in the 1950s, the more people conform in general, the more prejudiced they tend to be (e.g., Pettigrew, 1958). More recently, Chris Crandall and colleagues (2018) showed that, after Donald Trump was elected U.S. president in 2016, Americans viewed prejudice toward groups targeted by the Trump campaign, such as Muslims and immigrants, as more acceptable. Crandall and colleagues suggested that this may explain the increase in bias-related incidents soon after Trump was elected. The internalized worldview contributes to prejudice in another important way. Because this worldview determines our view of what is right and good, we can't help but judge others on the basis of those cultural values. This kind of judgment, called ethnocentrism (Sumner, 1906), often leads us to hold negative attitudes about others who were raised in different cultures. An American who finds out that people in culture Z believe in bathing only twice a year is going to have a hard time not judging the members of that culture negatively: "They're dirty and primitive!" By the same token, members of culture Z may observe the American tendency to bathe or shower virtually every day as bizarre: "They're wasteful and compulsive!" These kinds of judgments often get more serious, such as when North Americans learn of cultures that practice female circumcision or believe that women should never leave the house without covering every part of their bodies.

The Nature of Prejudice: Pervasiveness and Perspective

Virtually every person currently living on this planet has been profoundly affected by prejudice. In most if not all cultures, women are to varying degrees targets of violence and restricted in their freedoms and opportunities. Likewise, every ethnic and cultural group has been powerfully influenced by historical intergroup conflicts and oppression. Japan and China have exchanged many acts of hostility and violence over a long period of time. So have France and England. And as of this writing, tens of thousands of refugees have fled a violent civil war in Syria, only to find themselves isolated in camps as surrounding countries close their borders to new immigrants. Pick a group, and you could read volumes about how that group has been affected by prejudice.

In social psychology, prejudice is defined as a negative attitude toward an individual based solely on that person's presumed membership in a particular group. Thus the person is disliked not because of personal attributes or actions but simply because of being perceived to be in some supposedly undesirable group. An interesting aspect of prejudice is that, on the one hand, many if not most people seem to be prejudiced against some group—and they usually feel that their particular prejudice is justified. On the other hand, social psychologists generally assume that prejudice against a person based simply on membership in a group is never justified. This assumption is based on three characteristics of prejudice.

First, prejudice involves judging an individual negatively without considering the person's actual attributes or actions. Social psychologists follow the hope famously articulated by Dr. Martin Luther King Jr. (1963/1992): "I have a dream that my four little children will one day live in a nation where they will not be judged by the color of their skin but by the content of their character." If someone harms you or someone you care about, you are justified in disliking that person. If a person simply practices a religion different from your own, has a different skin tone, or comes from a different country, you are not justified in disliking that person.

Second, any large category of people will include tremendous variability in virtually every possible attribute by which one might judge another person positively or negatively (Allport, 1954). There may be a group mean (what the average member of a group is like), but there also is always a normal distribution that captures the range along which most people vary from that mean. Because of this variation, assuming anything about all members of such groups will necessarily lead to many errors. To use an example where measurable data are available, consider that although the average American (male, 5′9½″; female, 5′4″) is taller than the average Chinese person (male, 5′7″; female, 5′2½″) (Yang et al., 2005), literally millions of Americans are shorter than the average Chinese person, and millions of Chinese people are taller than the average American (FIGURE 10.1; a rough numerical sketch of this overlap appears below).

The third reason social psychologists judge prejudice negatively is that it has all too often led to appalling acts of violence against innocent people—including children—who happened or were presumed to be members of particular groups.
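Returning to the height example for a moment: because the point about overlapping distributions is purely statistical, a short sketch can make it concrete. This is a minimal illustration that assumes the group means cited above, a guessed within-group standard deviation of about 7 cm, and rough male population counts; the standard deviation and population figures are assumptions for illustration, not values reported by Yang et al. (2005).

```python
from statistics import NormalDist

# Group means (in cm) converted from the heights cited in the text.
# The within-group standard deviation (~7 cm) and the male population
# counts below are illustrative assumptions.
us_men = NormalDist(mu=176.5, sigma=7.0)       # average American man, ~5'9.5"
chinese_men = NormalDist(mu=170.2, sigma=7.0)  # average Chinese man, ~5'7"

# Fraction of Chinese men taller than the average American man,
# and fraction of American men shorter than the average Chinese man.
frac_chinese_taller = 1 - chinese_men.cdf(us_men.mean)
frac_american_shorter = us_men.cdf(chinese_men.mean)

# Rough male population figures (assumed): ~700 million in China, ~160 million in the US.
print(f"Chinese men taller than the average American man:  ~{frac_chinese_taller * 700:.0f} million")
print(f"American men shorter than the average Chinese man: ~{frac_american_shorter * 160:.0f} million")
```

Even with these crude assumptions, tens of millions of individuals fall on the "wrong" side of the group comparison, which is exactly why judging any one person by a group average invites error.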
Many early social psychologists were inspired to focus on prejudice because of one of the most egregious examples of what prejudice can lead to: the Nazi Holocaust, which resulted in the deaths of an estimated 6 million Jews and 5 million members of other groups despised by the Nazis (e.g., Gypsies, Slavs, physically disabled individuals). So that's the case for prejudice being a bad thing.

People who hold prejudices usually justify them with stereotypes—overgeneralized beliefs about the traits and attributes of members of a particular group, such as "African Americans are violent," "Jews are cheap," "White men are racists," "Latinos are lazy," and so forth. Not all stereotypic traits attributed to a group are negative, but overall, stereotypes of outgroups tend to be negative. Later in this chapter, we will consider where these stereotypes come from, how they affect us, and how they are perpetuated. As we will learn, stereotypes provide justifications for prejudice and lead to biases against outgroups.

Holding prejudices and stereotypes often leads to discrimination—negative behavior toward an individual solely on the basis of membership in a particular group. Discrimination comes in many forms, ranging from cold behavior at a party to declining someone's loan application to torture and genocide. Discrimination is often the consequence of the negative attitudes (prejudice) and beliefs (stereotypes) a person holds. But because of laws, norms, and values that encourage egalitarianism, people's behavior is not always biased by their prejudices and stereotypes.

The Costs of Concealing

When people are concerned about being discriminated against, they sometimes choose to cope by concealing their stigma, if this is an option. This strategy is common for those who identify as gay, lesbian, bisexual, or transgender. For example, Jason Collins played professional basketball in the NBA for 12 years before coming out of the closet in April 2013. He described his experience concealing his sexual orientation in an interview with Sports Illustrated:

"It takes an enormous amount of energy to guard such a big secret. I've endured years of misery and gone to enormous lengths to live a lie. I was certain that my world would fall apart if anyone knew. And yet when I acknowledged my sexuality I felt whole for the first time." (Collins, 2013)

When Jason Collins joined the Brooklyn Nets in the spring of 2014, he became a true trailblazer—the first openly gay male athlete actively playing a major professional sport in the United States. Yet some retired players have noted that they are sure they played with gay teammates over the years. An ESPN story from 2011 quoted the Hall of Famer and basketball analyst Charles Barkley as saying, "It bothers me when I hear these reporters and jocks get on TV and say: 'Oh, no guy can come out in a team sport. These guys would go crazy.' ... I'd rather have a gay guy who can play than a straight guy who can't play" (ESPN.com news services, 2011).

For those who are particularly aware of and worried about how others judge them, concealment can sometimes be a beneficial way to cope (e.g., Cole et al., 1997), but as Jason Collins's quote reveals, concealment comes with its own costs. Those who conceal an important aspect of their identity might struggle with the inability simply to be their authentic selves. Also, the effort it takes to be vigilant about what you say and how you act and to monitor whether others have figured out your secret can be emotionally and cognitively draining (Frable et al., 1990; Smart & Wegner, 1999). So although concealing a stigma might be one way to sidestep discrimination, it's often not an optimal solution.

Fortunately, highly publicized examples of people living more authentically can help others feel they can do the same. In 2020, another NBA star, Dwyane Wade, announced that his 12-year-old daughter, Zaya, is transgender (i.e., born as a male, now identifying as female), saying, "She's known it for nine years. She's known since she was 3 years old. Along this way we've asked questions and we've learned. But she's known" (Wells, 2020). Wade's motivation for publicizing Zaya's gender identity is to help other families support their own children who identify as transgender or nonbinary. Public examples of support and acceptance are perhaps an important reason suicide attempts among adolescents who identify as sexual minorities have been decreasing in recent years (Raifman et al., 2020). Still, gay, lesbian, and bisexual teens are three times more likely to attempt suicide than their straight peers, and a 2018 study in the United States suggested that 30 to 50% of transgender and nonbinary adolescents had attempted suicide (Raifman et al., 2020; Toomey et al., 2018). Because stigma is a threat to one's very sense of identity, it might not be a coincidence that the negative consequences of prejudice are particularly high during adolescence and young adulthood, when people are still forming an identity (Erikson, 1968).
The It Gets Better Project (www.itgetsbetter.org), started by the columnist and author Dan Savage and his partner, Terry Miller, is an effort to communicate to LGBTQ+ teens that the stress of embracing their sexual identity, coming out to others, and experiencing bias will get better over time. In fact, research suggests that attitudes toward sexual minorities are generally becoming more positive over time (Charlesworth & Banaji, 2019).

institutional discrimination

Unfair restrictions on opportunities for certain groups of people through institutional policies, structural power relations, and formal laws.

Identifying with Positive Role Models

When individuals are exposed to role models—people like themselves who have been successful—the stereotype is altered, and they feel inspired to do well (Dasgupta & Asgari, 2004; Marx & Roman, 2002; McIntyre et al., 2003; O'Brien et al., 2017; Stout et al., 2011). In one study (Stout et al., 2011), college students were randomly assigned to either a female or a male calculus professor, and their performance over the course of the semester was tracked. The gender of the professor had no effect on men's attitudes or behavior. But women with a female professor participated more in class over the course of the semester and became more confident in their ability to do well.

Stereotyping: The Cognitive Companion of Prejudice

A stereotype is a cognitive schema containing knowledge about and associations with a social group (Dovidio et al., 1996; Hamilton & Sherman, 1994). For example, our stereotype of the group librarians may contain our beliefs about the traits shared by members of that group (e.g., librarians are smart and well read), theories about librarians' preferences and attitudes (e.g., librarians probably like quiet), and examples of librarians we have known (e.g., Ms. Smith, from my school library). We may not want to admit it, but we all probably have stereotypes about dozens of groups, such as lawyers, gays, lesbians, truckers, grandmothers, goths, Russians, immigrants, and overweight individuals. People around the globe often openly endorse certain stereotypes about various groups, but because stereotypes are so prominently promoted in cultures, even people who explicitly reject them may have formed implicit associations between groups and the traits their culture attributes to those groups. At a conscious level, you might recognize that not all librarians, if any at all, really fit the mold of being bookish, quiet women who wear glasses. But simply hearing the word librarian is still likely to bring to mind these associated attributes, even if you're not consciously aware of it. Finally, although we commonly refer to stereotyping as having false negative beliefs about members of a group, it should be clear from the librarian example that stereotypes don't have to be negative. They don't even have to be entirely false (see Jussim et al., 2015). Being well-read is a pretty positive trait, and the average librarian probably has read more than your average nonlibrarian. But even if we grant this possible difference in averages of these two groups, the assertion that all librarians are better read than all nonlibrarians is certainly false. So stereotyping goes awry because people typically overgeneralize a belief about a group to make a blanket judgment about virtually every member of that group. Moreover, even though some stereotypes are positive, they can still have negative effects. Stereotypes can be benevolent on the surface but ultimately patronize the stereotyped group and suggest that negative stereotypes are not far behind (Siy & Cheryan, 2016).

superordinate goal

A common problem or shared goal that groups work together to solve or achieve.

target empowerment model

A model which suggests that targets of bias can employ strategies that deflect discrimination, as long as those methods aren't perceived as confrontational.

prejudice

A negative attitude toward an individual solely on the basis of that person's presumed membership in a particular group.

self-objectification

A phenomenon whereby intense cultural scrutiny of the female body leads many girls and women to view themselves as objects to be looked at and judged.

attributional ambiguity

A phenomenon whereby members of stigmatized groups often can be uncertain whether negative experiences are based on their own actions and abilities or are the result of prejudice.

How Do Stereotypes Come into Play?

So far, we have covered where stereotypes come from and why we tend to rely on them. But how do they actually work? Imagine seeing a photo of a clean-cut, well-dressed young man and quickly filing him into reassuring categories: student, professional, harmless. Unfortunately, many young women did just that. They categorized him as you probably would have. They had no way of knowing one additional group he belonged to—serial killers. The man in the photo is Ted Bundy, who brutally raped and murdered more than 30 women, mostly college students, during the 1970s. It is likely that the categorizations activated by his appearance helped him carry out his heinous crimes. Research has delved into the process by which we initially categorize a person as belonging to a group, activate stereotypes associated with that group, and then apply those stereotypes in forming judgments of that person. Let's learn more about how this process works.

rejection identification theory

The idea that people can offset the negative consequences of being targeted by discrimination by feeling a strong sense of identification with their stigmatized group.

person-group discrimination discrepancy

The tendency for people to estimate that they personally experience less discrimination than is faced by the average member of their group.

ultimate attribution error

The tendency to believe that bad actions by outgroup members occur because of their internal dispositions and good actions by them occur because of the situation, while believing the reverse for ingroup members.

dehumanization

The tendency to hold stereotypic views of outgroup members as animals rather than as humans.

Reappraising Anxiety

When stereotypes are difficult to change, targets can reinterpret what the stereotypes mean. For example, when people believe that others expect them, on the basis of a stereotype, to do poorly, they are more likely to interpret difficulties and setbacks as evidence that the stereotype is true and that they do not belong. They perform better, though, if they reinterpret difficulties as normal challenges faced by anyone. In one remarkable study, minority college students who read testimonials about how everyone struggles and feels anxious when beginning college felt a greater sense of belonging in academics, did better academically, and were less likely to drop out of school (Walton & Brady, 2020; Walton & Cohen, 2007, 2011). Similarly, other studies have found that getting instructions to reappraise anxiety as a normal part of test-taking improved women's and minorities' performance, and in many cases, these effects persisted even months later, when students took an actual high-stakes test such as the GRE (Jamieson et al., 2010; Johns et al., 2008). In fact, Johns and colleagues (2005) found that simply being able to interpret test anxiety as resulting from stereotype threat improved women's performance on a math test.

Sexual Objectification.

Women as a group are subject to a specific form of dehumanization known as sexual objectification, which consists of thinking about women in a narrow way, as if their physical appearance were all that matters. Based on early theorists such as the psychoanalyst Karen Horney and the philosopher Simone de Beauvoir, Barbara Fredrickson and Tomi-Ann Roberts's (1997) objectification theory notes that in most if not all societies, women are objectified by being judged primarily on the basis of their physical appearance. Although objectification does not involve equating women with animals, it is a way of denying that women possess the psychological characteristics that make them fully human, such as a unique point of view and a complex mental life. In assessing this idea, researchers found that well-known women, but not men, were perceived more like objects—cold, incompetent, and without morality—when participants were asked to focus on the women's appearance than when they were asked to focus on the women as people (Heflick & Goldenberg, 2009; Heflick et al., 2011). For example, in studies carried out during Barack Obama's first term as president, they found that the first lady, Michelle Obama, was perceived as lower in warmth, competence, and morality when participants focused on her appearance. In contrast, focusing on President Obama's appearance did not have a similar effect on ratings of him. Objectification of women can help justify exploitation of them. Integrating objectification theory and terror management theory, Jamie Goldenberg and colleagues proposed that objectification also may help people avoid acknowledging the fact that we humans are animals and therefore mortal (e.g., Goldenberg et al., 2009). Portraying women in an idealized (often airbrushed) way and only as objects of beauty or sexual appeal reduces their connection to animalistic physicality. Supporting this view, Goldenberg and colleagues have shown that reminding both men and women of their mortality, or of the similarities between humans and other animals, increases negative reactions to women who exemplify the creaturely nature of the body: women who are overtly sexual, menstruating, pregnant, or breast-feeding. This line of research suggests that objectifying women as idealized symbols of beauty and femininity and rejecting women who seem to fall short of those ideals helps both men and women deny their own animal nature.

Infrahumanization.

A subtler form of dehumanization is infrahumanization (Leyens, Paladino et al., 2000). When people infrahumanize outgroup members, they do not compare them directly with nonhuman animals. Rather, they perceive those outgroup members as lacking qualities viewed as being unique to humans. These qualities include complex human emotions such as hope, humiliation, nostalgia, and sympathy. People attribute these uniquely human emotions more to members of their ingroup than to outgroup members (Gaunt et al., 2002; Leyens et al., 2001). Infrahumanization has important repercussions for people's treatment of outgroup members. Cuddy, Rock, and Norton (2007) looked at people's desire to help with relief efforts in the aftermath of Hurricane Katrina, which caused massive destruction to parts of the southeastern United States in 2005. Participants in their study were less likely to infer that racial outgroup members who suffered from the hurricane were experiencing uniquely human emotions, such as remorse and mourning, than were racial ingroup members. The more participants infrahumanized the hurricane victims in this way, the less likely they were to report that they intended to take actions to help those individuals recover from the devastation. These effects mirror other evidence indicating that Whites assume that Blacks who have faced hardships feel less pain than Whites who have faced similar hardships (Hoffman & Trawalter, 2016; Hoffman et al., 2016).

Stereotype Activation

After we make an initial categorization, the stereotypes that we associate with that category are often automatically brought to mind, or activated, whether we want them to be or not. Sure, some folks have blatant negative beliefs about others that they are happy to bend your ear about. Others want to believe that they never ever judge people on the basis of stereotypes. Most of us probably are somewhere in the middle. Individuals raised and exposed to the same cultural information all have knowledge of which stereotypes are culturally associated with which groups (Devine, 1989). This information has made it into those mental file folders in our head, even if we have tried to flag it as false and malicious. When we meet someone from Wisconsin, we mentally pull up our Wisconsin folder on the state to be better prepared for discussing the intricacies of cheese making and the Green Bay Packers. We do this unconsciously and without necessarily intending to; the association is ingrained and automatic, based on cultural learning. Patricia Devine (1989) provided an early and influential demonstration of automatic stereotype activation. She reasoned that anything that reminds White Americans of African Americans would activate the trait aggressive because it is strongly associated with the African American stereotype. To test this hypothesis, she subliminally exposed White participants to 100 words. Each word was presented so briefly (for only 80 milliseconds) that participants could not detect the words and experienced them as mere flashes of light. Depending on which condition participants were in, 80% (or 20%) of the words—some very explicit—were related to the African American stereotype (e.g., lazy, ghetto, slavery, welfare, basketball, unemployed), while the rest of the words were neutral. Then, as part of an apparently separate experiment, participants read a paragraph describing a person named Donald, who behaved in ways that could be seen as either hostile or merely assertive. Participants primed with the Black stereotype interpreted Donald's ambiguous behaviors as more hostile than did those who didn't get this prime. Even though aggressive was not primed outright, because it is part of the stereotype schema for African Americans, priming that stereotype cued people to perceive the next person they encountered as being aggressive. Importantly, this effect was the same for those who reported low and high levels of prejudice toward African Americans. However, it is important to clarify that Devine's study primed people directly with stereotypes about Blacks, not simply with the social category "Blacks" or a photo of a Black individual. Other research suggests that some people are less likely to activate stereotypic biases automatically. For example, Lepore and Brown (1997) showed that people with stronger prejudices activate a negative stereotype about Blacks when they are simply exposed to the category information (i.e., the word Blacks), whereas those who are low in prejudice don't show this activation at all. Additional research has suggested that the goal of being egalitarian can itself be implicitly activated when people encounter an outgroup and can help keep negative stereotypes from coming to mind (Moskowitz, 2010; Moskowitz & Li, 2011; Sassenberg & Moskowitz, 2005). The takeaway message seems to be that although low-prejudice individuals may be aware of culturally prevalent stereotypes about outgroups, they often do not activate those stereotypes.

Ambivalent Racism

Contemporary prejudice against African Americans in the United States is often mixed with ambivalence (Katz & Hass, 1988). The term ambivalent racism refers to racial attitudes that are influenced by two clashing sets of values: a belief in individualism, that each person should be able to make it on his or her own, and a belief in egalitarianism, that all people should be given equal opportunities. The core idea of ambivalent racism is that many Whites simultaneously hold anti-Black and pro-Black attitudes that are linked to these contrasting values. Depending on which set of values is primed, ambivalent people are likely to respond more strongly in one direction or the other. So how do you know which value people will affirm? Well, it depends on which value is currently most salient, or active (Katz & Hass, 1988). If people are thinking about values related to individualism, they tend to be more prejudiced, but if thinking about values related to egalitarianism, they tend to be less prejudiced. For example, when White participants were led to think about the Protestant ethic (a belief that emphasizes the individualistic value of hard work), they were more likely to report stronger anti-Black attitudes, but these individualistic thoughts did not influence their pro-Black attitudes (FIGURE 10.7). The pro-Black attitudes were no different from those of participants who received no priming. Thus, ambivalence remained, but the individualism prime shifted participants toward anti-Black attitudes. Conversely, when White participants were led to think about the value of humanitarianism, they were more likely to report pro-Black attitudes, but these humanitarian thoughts did not influence their anti-Black attitudes (again, as compared with the no-priming condition). Thus, again, ambivalence remained, but the humanitarian prime shifted participants toward more pro-Black attitudes.

Cognitive Measures of Implicit Bias

Cognitive measures also tell us something about people's implicit attitudes. These measures take different forms, but they all rely on the same assumption: If you like a group, then you will quickly associate that group with good stuff; if you don't like a group, you will quickly associate that group with bad stuff. To assess such implicit associations, researchers prime people with members of different groups and measure how quickly they can identify good stuff and bad stuff (Dovidio et al., 2002; Fazio et al., 1995). For example, Fazio and colleagues (1995) reasoned that if Whites experience an automatic negative reaction to Blacks, then exposure to photographs of African Americans should speed up evaluations of negative words and slow down evaluations of positive words. To test this hypothesis, they presented participants with positive words (e.g., wonderful) and negative words (e.g., annoying) and then asked them to indicate as fast as they could whether each was good or bad by pressing the appropriate button. Each word was immediately preceded by a brief presentation of a photograph of a Black person or a White person. The results revealed substantial individual differences in White participants' automatic reactions: For many White participants, being primed with Black faces significantly sped up reactions to negative words and slowed down reactions to positive words. Other White participants did not show this pattern, and some even showed the opposite. More importantly, the more closely these people seemed to associate "Black" with "bad," the less friendly they were during a later 10-minute interaction with a Black experimenter.

Since the late 1990s, the most commonly used measure of implicit attitudes has been the implicit association test, or IAT (Greenwald et al., 1998), which we introduced in chapter 3. The basic logic of this test is that if you associate group A with "bad," then it should be pretty easy to group together instances of group A and instances of bad stuff, and it should be relatively difficult to group together instances of group A and instances of good stuff. This basic paradigm can be used to assess implicit associations with any group you can think of, but it has most commonly been used for race, so let's take that example. (You can take this and other versions of the IAT yourself by visiting the Project Implicit web site, at www.projectimplicit.com.)

In the Race IAT, people are instructed to do some basic categorization tasks (FIGURE 10.8). The first two rounds are just practice getting used to the categories. For the first round, you are presented with White and Black faces one at a time, and you just need to click on one button if the face is Black and another button if the face is White. In the second round, you categorize positive ("rainbow," "present") and negative ("vomit," "cancer") words by clicking on one button if the word is positive and a different button if the word is negative. In the third round, the task becomes more complicated as you are presented with faces or words, one at a time. Deciding as quickly as possible, you must then click one button if what you see is either a Black face or a positive word but a different button if you see either a White face or a negative word. When your cognitive network links "Black" with "bad," it's relatively hard to use the same button to categorize Black faces along with positive words without either slowing down or making lots of category errors. In contrast, you should find it much easier to use the same button to indicate that you see either a Black face or a negative word, the task required in a fourth round. If you're faster with "Black" and "bad" than with "Black" and "good," you're showing associations that are predictive of implicit bias.
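To make that response-time logic concrete, here is a minimal sketch of how a difference between the two critical blocks can be turned into a single score. The response times below are made up, and the simplified formula only approximates the logic; the actual IAT uses a more elaborate procedure (the improved D-score algorithm of Greenwald et al., 2003) that adds error penalties, trial trimming, and block-wise pooling.

```python
from statistics import mean, stdev

# Hypothetical response times (in milliseconds) for one participant.
# Two critical blocks: "Black or good" share a key vs. "Black or bad" share a key.
black_or_good_rts = [820, 910, 870, 990, 845, 905]  # the harder pairing if "Black" is linked with "bad"
black_or_bad_rts = [640, 700, 665, 690, 710, 655]

# Simplified IAT-style score: difference in mean response time between the
# two critical blocks, divided by the standard deviation of all trials.
all_rts = black_or_good_rts + black_or_bad_rts
iat_like_score = (mean(black_or_good_rts) - mean(black_or_bad_rts)) / stdev(all_rts)

# A positive value means the participant was slower when "Black" shared a
# button with "good" than when it shared a button with "bad" -- the pattern
# the text describes as predictive of implicit bias.
print(f"IAT-like score: {iat_like_score:.2f}")
```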

When Do the Effects of Contact Generalize Beyond the Individual?

Does contact reduce only prejudice toward individuals whom you get to know? Or do these effects generalize to that person's group? If Frank, a Christian, develops a friendship with a Muslim roommate, Ahmed, during a stay at summer camp, will this contact generalize and reduce Frank's prejudice against other Muslims when he goes back to school? Here, too, the answer is not a simple "yes" or "no." Rather, it depends on a sequence of stages that play out over time (FIGURE 11.7) (Pettigrew, 1998; Pettigrew & Tropp, 2006). In an initial stage, as two people become friends, their sense of group boundaries melts away. Perhaps you have had this experience of talking to another person and simply forgetting that he or she is from a different group. This is decategorization at work. When sharing their love of music, Frank and Ahmed are not Christian and Muslim; they are simply two roommates and friends. Their liking for each other replaces any initial anxiety they might have felt about interacting with a member of another group. But if Frank is to generalize his positive impression of Ahmed to other Muslims, and if Ahmed is to generalize his positive impression of Frank to other Christians, those different social categories must again become salient during a second stage, after contact has been established (Brown & Hewstone, 2005). Also, Frank's overall impression of Muslims is more likely to change if he regards Ahmed as representative of the outgroup as a whole (Brown et al., 1999). If Frank views Ahmed as being quite unlike other Muslims, then his positive feelings toward his new friend might never contribute to his broader view of Muslims. But if the category differences between them become salient and each considers the other to be representative of his religious group, then both Frank and Ahmed will develop more positive attitudes toward the respective religious outgroup more broadly. You might be noticing a few rubs here. Effective contact seems to require getting to know an outgroup member as an individual, but this process of decategorization can prevent people from seeing that person as also being a representative of their group. There is a tension between focusing on people's individual characteristics and recognizing the unique vantage point of their group or cultural background. But understanding others' group identities is a key step in reducing prejudice against the group as a whole. This might be part of the reason that members of minority groups often prefer and feel more empowered by an ideology of multiculturalism, which endorses seeing the value of different cultural identities, over an ideology of being colorblind, whereby people simply pretend that group membership doesn't exist or doesn't matter (Plaut et al., 2009; Plaut et al., 2018; Vorauer & Quesnel, 2017). Another potential pitfall is that although this second stage of established contact might reduce intergroup prejudice, there is no guarantee that it will promote intergroup cooperation. For this reason, researchers have suggested that a stage of recategorizing outgroups into a unified group, or common ingroup identity, will further reduce prejudice by harnessing the biases people have in favor of their ingroups (Gaertner & Dovidio, 2000). If Frank and Ahmed see each other and other members of their respective religious groups as all being part of the same camp or the same nation, then they are all in the same overarching ingroup. 
Perhaps this is the final dash of spice needed in the recipe of contact that will not only end intergroup prejudices but also lead to peace and cooperation. Is this vision just pie in the sky? There are hopeful signs that having a common ingroup identity can effectively reduce some manifestations of prejudice. A few years ago, a school district in Delaware instituted the Green Circle program for elementary school students. Over the course of a month, first- and second-graders in this program participated in exercises that encouraged them to think of their social world—which they designated their "green circle"—as getting bigger and bigger to underscore the idea that all people belong to one family, the human family. Students who participated in the program were more likely later to want to share and play with other children who were of different genders, weights, and races than were students in the same school who had not yet gone through the program (Houlette et al., 2004). Studies suggest that adults also can become less prejudiced, more tolerant, and more open to immigration when the common humanity among members of different groups is made salient (Kunst et al., 2015; Motyl et al., 2011). Although these findings are surely encouraging, Allport (1954) was skeptical about people's ability to stay focused on the superordinate identity of humans, as opposed to more circumscribed national, regional, and family identities. For example, some theorists suggest that we are most likely to identify with groups that provide optimal distinctiveness (Brewer, 1991). Such groups are large enough to foster a sense of commonality but small enough to allow us to feel distinct from others. Geographic differences mean different languages, customs, arts, values, styles of living—all useful ways to define what feels like a shared but unique identity. Keeping salient the more abstract identity we all share is no easy chore, but superordinate goals and concerns can help.

Illusory Correlations

In some instances, stereotypes develop from nothing more than a perceptual bias known as an illusory correlation. This is a faulty perception whereby people think that two things are related when in reality they are not. More specifically, an illusory correlation occurs when a person perceives that membership in a certain social group correlates with—or goes hand in hand with—a certain type of behavior (Hamilton & Sherman, 1989; Costello & Watts, 2019). These kinds of illusory correlations occur when two things that are generally rare or distinctive co-occur in close proximity to one another. When strange or unusual things happen, our attention is drawn to them because they stand out. And when two unusual things co-occur, our mind automatically assumes a connection. For most majority-group members, minority-group members are distinctive. Also, most people, regardless of their group membership, tend to find socially undesirable behaviors distinctive. (Fortunately, most of the time, people do good things rather than bad things!) So when ingroup members see outgroup members acting negatively—for example, in news reports about Black men accused of violent crimes—two distinctive features of the situation, a minority individual and an undesirable behavior, grab their attention. This doubly distinctive perception results in believing the two attributes go together, even when the minority group is no more likely than the majority group to engage in bad behavior (Hamilton et al., 1985).
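A small worked example may help show why the correlation is "illusory." The frequencies below are illustrative assumptions, not data from the cited studies; they are chosen so that the minority group and undesirable behaviors are both rare while the proportion of undesirable behavior is identical in the two groups. The phi coefficient confirms that the true group-behavior association is zero, even though the rare group's rare behaviors are the ones that grab attention.

```python
from math import sqrt

# Hypothetical behavior counts: Group A is the majority, Group B the minority;
# undesirable behaviors are rarer than desirable ones in both groups.
counts = {
    "A (majority)": {"desirable": 18, "undesirable": 8},
    "B (minority)": {"desirable": 9, "undesirable": 4},
}

for group, tallies in counts.items():
    total = tallies["desirable"] + tallies["undesirable"]
    print(f"Group {group}: {tallies['undesirable'] / total:.0%} of behaviors are undesirable")

# Phi coefficient for the 2x2 table; a value of 0 means group membership
# and behavior type are statistically unrelated.
a, b = counts["A (majority)"]["desirable"], counts["A (majority)"]["undesirable"]
c, d = counts["B (minority)"]["desirable"], counts["B (minority)"]["undesirable"]
phi = (a * d - b * c) / sqrt((a + b) * (c + d) * (a + c) * (b + d))
print(f"Actual group-behavior association (phi): {phi:.2f}")
```

Both groups show the same rate of undesirable behavior (about 31%) and the computed association is exactly zero, yet perceivers in such studies typically overestimate how often the minority group behaved badly (Hamilton et al., 1985), because the doubly distinctive pairings stand out in memory.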

1. Stereotypes Are Cognitive Tools for Simplifying Everyday Life

People rely on stereotypes to simplify social perception. It would take a lot of effort to assess every person we interact with solely on the basis of individual characteristics and behaviors. Stereotypes allow people to draw on their beliefs about the traits that characterize typical group members to make inferences about what a given group member is like or how the person is likely to act. Imagine that you have two neighbors, one a librarian and the other a veterinarian. If you had a book title on the tip of your tongue, you would more likely consult the librarian than the vet—unless it was a book about animals! In other words, stereotyping is a cognitive shortcut that allows people to draw social inferences quickly and conserve limited cognitive resources while navigating a pretty complex social environment (Taylor, 1981). If stereotyping does in fact conserve mental resources, then people should be more likely to fall back on their stereotypes when they are stressed, tired, under time pressure, or otherwise cognitively overloaded. Many lines of research have shown that this indeed is the case (e.g., Kruglanski & Freund, 1983; Macrae et al., 1993). And if stereotypes simplify impression formation, using them should leave people with more cognitive resources left over to apply to other tasks. To test this, Neil Macrae and colleagues (Macrae, Milne, & Bodenhausen, 1994) showed participants a list of traits and asked them to form an impression of the person being described. In forming these impressions, people were quicker and more accurate if they were also given each person's occupation. It's easier to remember that Julian is creative and emotional if you also know he is an artist because artists are stereotyped as possessing those characteristics. But having these labels to hang your impression on also frees up your mind to focus on other tasks. In this study, the other task was an audio travelogue about Indonesia that participants were later tested on. Those who knew the occupations of the people they learned about while they were also listening to the audio travelogue not only remembered more about the people but also remembered more about Indonesia.

The Prejudiced Personality

Prejudice is common in most if not all known cultures. However, within a culture, some people are far more prone to prejudice, stereotyping, and discriminating against outgroups than others. What accounts for these differences? One set of answers can be derived from the causes of prejudice we have already discussed. For example, people have different direct experiences with outgroups and are exposed to different kinds of information about them. They also vary in their level of self-esteem and the lessons they learn growing up about how groups differ and what those differences mean. However, research shows that there are particular kinds of people who are especially prone to being prejudiced and that people who tend to be prejudiced against one outgroup also tend to be prejudiced against other outgroups (Meeusen et al., 2018). In response to the Nazi era, Theodor Adorno and colleagues (1950) sought to understand the roots of anti-Semitism; they found that individuals who were prejudiced against Jews were also prejudiced against other groups. Adorno and colleagues determined that these overlapping biases reflected an authoritarian personality. People with this prejudiced personality style possess a cluster of traits including uncritical acceptance of authority, preference for well-defined power arrangements in society, adherence to conventional values and moral codes, and a tendency to think in rigid, black-and-white terms. More modern researchers have refined this idea with a measure of right-wing authoritarianism (RWA) (Altemeyer, 1981, 1998; De Keersmaecker et al., 2018). Individuals high in RWA believe that the social world is inherently dangerous and unpredictable, and the best way to maintain a sense of security in both their personal and social lives is to preserve society's order, cohesion, and tradition. High RWA people dislike ethnic outgroups as well as socially deviant groups that threaten traditional norms, such as feminists, gays, and lesbians. Other contemporary personality approaches to prejudice focus on some features related to the authoritarian personality. Social dominance orientation (SDO), which was mentioned in chapter 9 (Pratto et al., 1994; Sidanius & Pratto, 1999), taps into beliefs that some people and groups are just better than others, and so society should be structured hierarchically, with some individuals and groups having higher social and economic status than others. SDO more strongly than RWA predicts dislike of disadvantaged groups that are perceived to be inferior, such as those who are physically disabled, those who are unemployed, and homemakers (Duckitt, 2006; Duckitt & Sibley, 2007; FIGURE 10.3).

Aversive Racism

Sam Gaertner and Jack Dovidio's (1986) concept of aversive racism proposes that although most Whites support principles of racial equality and do not knowingly discriminate, they may at the same time possess conflicting, often nonconscious, negative feelings and beliefs about Blacks. Although they will consciously try to behave in line with their egalitarian values, in subtler situations, when decisions are complex and biases are easily rationalized, they may fall prey to the influence of prejudice, often without awareness that they are doing so. For instance, discrimination is less likely to occur when applicants have especially strong or weak records (because the applicants' records do the talking) but is more likely to occur when applicants have mixed records. That is, discrimination occurs when a gray area allows race to play a role, but it can be justified through nonracial means (Gaertner & Dovidio, 2000). For example, when a Black applicant had higher SAT scores but lower high school grades than a White applicant, high-prejudice Whites decided that high school grades were an especially important factor in college admissions decisions. But if the Black applicant had higher grades and lower SAT scores, then the high-prejudiced Whites valued those scores instead as the best indicator of future success (Hodson et al., 2002). So prejudice is more likely to occur in situations when it can be justified by some other motive. But do all people do this, or are particular individuals most likely to act this way? To answer this question, a number of theories about prejudice suggest that it is critical to consider not just the attitudes that people are consciously willing or able to report but also those that might reside beneath their conscious awareness. Since the 1990s, there has been an explosion of interest in this concept of implicit prejudice.

Dehumanization.

Stereotypes justify negative behavior as well as negative feelings. One common way people justify negative behavior is by dehumanizing outgroup members. Dehumanization is viewing outgroup members as less than fully human. The most extreme form of dehumanization is to compare outgroup members directly with nonhuman animals. Blatant examples of this can be seen in the way that nations portray groups they intend to kill. During World War II, Nazi propaganda portrayed European Jews as disease-carrying rats, Americans portrayed the Japanese as vermin (FIGURE 10.11a), and the Japanese portrayed Americans as bloodthirsty eagles mauling innocent Japanese civilians. One of our students who served in the American military during the 1991 Persian Gulf War showed us a flyer dehumanizing Iraqi people (FIGURE 10.11b) that was circulated among the soldiers. In two studies of actual police officers, Goff and colleagues (2014) found that the more an officer implicitly associated Black people with apes, the more likely that officer was to have a record of using force on Black children more than on children of other groups. These tendencies to think about members of outgroups as nonhuman animals have likely been partly responsible for fueling many historical examples of horrible treatment of outgroups, such as slavery, bombings, and genocide (Kteily & Bruneau, 2017). One study, for example, showed that hearing about acts of terrorism by Muslims against American and British targets in 2013 made American and British participants more likely to dehumanize Muslims and support violent countermeasures such as bombing entire countries believed to be harboring terrorists (Kteily et al., 2015). As we discussed in our coverage of cognitive dissonance (chapter 6), when people act in ways that fall short of their moral standards, they often attempt to seek justifications. In times of extreme intergroup conflict, when innocent people are being killed, perpetrators of that violence—and even those standing by—often reduce the dissonance by regarding the victims as subhuman and therefore less deserving of moral consideration. Indeed, Castano and Giner-Sorolla (2006) found that when people were made to feel a sense of collective responsibility for their ingroup's mass killing of an outgroup, they viewed members of that outgroup as less human. Once the outgroup has been reduced to animals who do not deserve moral consideration, the perpetrators feel less inhibited about committing further violence (Kelman, 1976; Kteily et al., 2015; Staub, 1989; Viki et al., 2013). Indeed, in one study, people were more likely to administer a higher intensity of shock to punish people described in dehumanizing (i.e., animalistic) terms than people described in distinctively human terms (Bandura et al., 1975).

Understanding Prejudice, Stereotyping, and Discrimination

We live in families, tribes, and nations. Our groups help us survive and provide our lives with structure. They give us bases of self-worth and imbue life with meaning and purpose. But one major problem is inherent in living within groups: It separates us from other human beings who live within other groups. Prejudice is the all-too-common consequence of this distinction between us (the ingroup) and them (the outgroup). Virtually every known culture has been hostile to members of some other culture or oppressed certain segments of its society. Indeed, recorded history is riddled with the bloody consequences of a seemingly endless parade of oppression, persecution, colonization, crusades, wars, and genocides. The violent heritage of our species led a character from James Joyce's classic novel Ulysses to comment, "History ... is a nightmare from which I am trying to awake" (Joyce, 1961, p. 28).

We will explore the many reasons that history has been and continues to be such a nightmare of intergroup hatred and violence in two chapters, this one and chapter 11. In chapter 11, we will consider how prejudice, stereotyping, and discrimination affect those targeted by these biases. We will also consider ways in which we might hope someday to awaken from this nightmare to an egalitarian reality in which people treat each other fairly, regardless of their differences. In this chapter, we focus on:

The nature of prejudice

Three basic causes of prejudice

Who is prone to prejudice

Prejudice in the modern world

How stereotyping arises and affects the way people perceive others and behave toward them

Terror Management Theory

We've seen that people tend to think poorly of those who seem different. But why can't people just leave it at "Different strokes for different folks"? According to the existential perspective of terror management theory, one reason is that people must sustain faith in the validity of their own cultural worldview so that it can continue to offer psychological security in the face of our personal vulnerability and mortality. Other cultures threaten that faith: "One culture is always a potential menace to another because it is a living example that life can go on heroically within a value framework totally alien to one's own" (Becker, 1971, p. 140). Basing their work on this idea, terror management researchers have tested the hypothesis that raising the problem of mortality would make people especially positive toward others who support their worldview and especially negative to others who implicitly or explicitly challenge it (Greenberg et al., 2016). In the first study testing this notion, when reminded of their own mortality, American Christian students became more positive toward a fellow Christian student and more negative toward a Jewish student (Greenberg et al., 1990). Similarly, when reminded of death, Italians and Germans became more negative toward other cultures (Castano et al., 2002; Jonas et al., 2005) and non-atheists became more negative toward atheists (Cook et al., 2015). Hirschberger and colleagues (2005) found that reminders of mortality increased prejudice against physically disabled individuals because they reminded people of their own physical vulnerabilities. In the first of a pair of studies particularly pertinent to the ongoing tensions in the Middle East, researchers found that when reminded of their own mortality, Iranian college students expressed greater support for suicidal martyrdom against Americans (Pyszczynski et al., 2006). The second study showed that politically conservative American college students who were reminded of their mortality similarly supported preemptively bombing countries that might threaten the United States, regardless of "collateral damage" (Pyszczynski et al., 2006). And in yet another troubling study, Hayes, Schimel, and colleagues (2008) found that Christian Canadians who were reminded of their mortality were better able to avoid thoughts of their own death if they imagined Muslims dying in a plane crash.

