Thought Processes

Examples of Party Cues

1. Campbell, Converse, Miller, & Stokes (1990)
2. Layman and Carsey (2002)

Examples of Biased Assimilation

1. Lord, Ross, and Lepper (1979): capital punishment
2. Munro and Ditto (1997): prejudice against homosexuals
3. Ditto and Lopez (2002): diseases

Two ways we form Opinion

1. Online Processing
2. Memory Based Processing

Backfire Effect

After being presented with false information that fits a person's views, people who are then shown the correct information not only fail to believe it but come to believe the incorrect view even more strongly.
- Most of the time, though, the effect does not seem to occur when people receive corrective information; usually they simply say they don't believe the corrective facts and keep their views unchanged.
- Some argue that people who express incorrect views often do so in a flippant manner and don't actually believe them.
- One study found that paying people for correct answers to ideologically relevant factual questions seemed to reduce the partisan gap in misinformation. The evidence, though, is inconsistent and limited to beliefs that are not centrally emphasized in partisan discourse, as opposed to centrally emphasized ideological misinformation. Once a partisan cue is given, the difference shrinks considerably.
- Also, in response to payments people may tell the researchers what they think the researchers "want to hear", not what they actually believe.

Aggregation View

Any individual judgment contains a lot of error, but if that error is random, it averages out to zero across the masses. - This is debatable, because much of the error in political judgment isn't random; it reflects directional bias.
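A minimal simulation of this point (all numbers hypothetical): when individual errors are zero-mean noise, the average of many judgments lands near the truth, but a shared directional bias survives aggregation.

```python
import random

random.seed(1)
true_value = 50.0  # the "correct" judgment on some hypothetical 0-100 scale

# Random error: each citizen's judgment is the truth plus zero-mean noise.
unbiased = [true_value + random.gauss(0, 15) for _ in range(100_000)]

# Directional bias: every judgment is shifted the same way,
# so the error does not cancel when averaged.
biased = [true_value + 10 + random.gauss(0, 15) for _ in range(100_000)]

avg_unbiased = sum(unbiased) / len(unbiased)
avg_biased = sum(biased) / len(biased)

print(round(avg_unbiased))  # close to 50: random error averages out
print(round(avg_biased))    # close to 60: directional bias does not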

Delli Carpini and Keeter (1996)

Can democracy work if people don't know much? They answered that democracy only works for the politically informed; the uninformed make bad political judgments.

Popkin (1991)

Can democracy work if people don't know much? He said it can work because of low information rationality. Meaning people use heuristics (such as party cues) to make reasonably good judgments without too much effort.

How has constraint changed?

Constraint has increased since the 1960s (e.g., Layman and Carsey, 2002). The low levels of constraint in the 1950s and 1960s may have been a product of that political context.
- Constraint and attitude stability do exist among those high in political knowledge.
- In general, attitude instability could reflect an actual change in attitude, but it could also be the result of random error rather than attitude change:
• However, a longer elapsed time between assessments does not produce much less stability; the Time 3 attitude is not much better predicted by the Time 2 attitude than by the Time 1 attitude. This suggests random error is the accurate explanation.
• This random error could be due to non-attitudes, as Converse said, but Achen argued that people do have attitudes and that unclear survey questions produce the random error: the problem is the lack of a good measurement scale, not the lack of a genuine attitude.
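The random-error interpretation can be illustrated with a toy simulation (all parameters hypothetical): if each person has a fixed latent attitude and each survey wave adds independent measurement noise, the Time 1 and Time 2 responses predict the Time 3 response about equally well, which is the pattern described above.

```python
import random

random.seed(7)
n = 20_000
latent = [random.gauss(0, 1) for _ in range(n)]  # stable true attitudes

def wave(latent):
    # One survey wave = true attitude + independent random error.
    return [a + random.gauss(0, 1) for a in latent]

t1, t2, t3 = wave(latent), wave(latent), wave(latent)

def corr(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Both lags show roughly the same stability (about 0.5 here), because
# the instability comes from noise, not from genuine attitude change.
print(round(corr(t1, t3), 2), round(corr(t2, t3), 2))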

Converse (1964)

He said that very few people freely report political views that seem influenced by an understanding of ideology.
- There is little constraint, i.e., being consistently liberal or conservative across the issues that define liberalism and conservatism, and little attitude stability over time, meaning people weren't consistent with themselves over time.
- Argued that most citizens (low to moderate in political sophistication) have "non-attitudes": when prompted by a questioner, they express random opinions that they don't really hold.

Partisan Cues

How much do the elites' actions affect the masses? Most of the time influence runs from elites to masses, but it can go the opposite way. - Good examples of this:
1. Cohen (2003)
2. Goren, Federico, & Kittilson (2009)

Lord, Ross, and Lepper (1979)

In a study on biased assimilation, they brought pro- and anti-capital-punishment subjects into the lab. They presented each subject with one study showing that capital punishment deters murder and another study showing that it does not. They varied the methodology of the studies: "before-after" comparisons or "adjacent states" comparisons.
- Results: People perceived the study that supported their view as higher quality, regardless of methodology, and rated themselves as more convinced of their initial view.
- Conclusion: pre-existing views lead people to process information in a biased manner, and this biased processing may make people more extreme in their views.

Example of Moral Adjustment to Justify Candidate Support

McCann (1997): Clinton and Bush

Examples of Ideologically Motivated Factual Misinterpretation

Nyhan and Reifler (2010)

Depth of Opinion

People have deeply held opinions and choose the party that aligns with those opinions to the greatest extent.
- E.g.:
1. Campbell (1960)
2. Converse (1964)
- Relevance of constraint: Layman and Carsey (2002)

Lodge, Steenbergen and Brau (1995)

People read information about policy positions (e.g., abortion, death penalty) and candidate characteristics (e.g., served in the military, divorced) that might characterize a candidate and rated how desirable or undesirable each would be. They then read a fact sheet about two fictional candidates for Congress, and a random half of the participants were then made to think more about the candidates and their issue stances.
- Anywhere from 1 to 31 days later, people evaluated each candidate from very positive to very negative and were asked to freely recall as much about each candidate as they could.
- Results: people's memory was extremely poor, but their ratings of candidate liking were often consistent with the desirable and undesirable features that they couldn't remember! Subjects in the depth-of-processing condition, and subjects with shorter delays, recalled more.

Example of Partisan Bias Information Processing

Taber and Lodge (2006)

Layman and Carsey (2002)

They showed that during the early 1970s the elites of the two major parties became more polarized on social welfare and racial issues, and this was reflected in the populace. This continued into the 1990s, by which point it also included cultural issues.
- They used the ANES 1972-1974-1976 and ANES 1992-1994-1996 panel (i.e., longitudinal) studies.
- Results for the 1970s:
1. Party identification at Time 1 (the first wave) led to changes toward party-appropriate issue attitudes on social welfare and racial issues by Time 2 (the next wave).
2. Social welfare and racial issue attitudes at Time 1 led to changes toward the "correct" party by Time 2.
- Results for the 1990s replicated the above findings AND:
1. Party identification at Time 1 led to changes toward party-appropriate issue attitudes on cultural issues by Time 2.
2. Cultural issue attitudes at Time 1 led to changes toward the "correct" party by Time 2.
- They concluded that when party elites stake out distinct positions on an issue, the public engages in issue-based partisan change and, even more so, partisan-based issue change.

Nyhan and Reifler

They showed that motivated reasoning processes lead people with ideological commitments to believe certain pieces of misinformation, to the point where, when presented with the correct information, they didn't believe it and instead believed the incorrect view even more, which is called the backfire effect.
- In their first study (2005) people read Bush quotes about WMDs in Iraq. The manipulation was that half the people received a correction.
- Results: Conservatives were more likely than liberals to believe in WMDs, and the backfire effect was that conservatives became even more likely to believe this after the correction.
- In the second study (2006) people read about three different misperceptions: 1) WMDs in Iraq, 2) Bush banned stem cell research, 3) Bush tax cuts increased revenue. The manipulation was whether or not a correction appeared.
- Results: 1) On some items, conservatives responded to corrective information by believing the incorrect views less, in contrast with Study 1. 2) Liberals did not respond to corrective information by adjusting their beliefs, showing resistance to persuasion. 3) On other items, conservatives responded to corrective information with a backfire effect.
- Conclusion: motivated misperceptions can sometimes cause backfire effects when corrections appear.

Goren, Federico, & Kittilson (2009)

Participants who were Democrats or Republicans rated various personal values relevant to politics. The manipulation was that they were either told that the relevant partisan/ideological group supported the value or were not told this.
- Results: partisans endorsed the values they would be expected to endorse, even without cues. Partisan and ideological cues increased this support, but only marginally.

Example of Selectively Justifying Moral Choices

Uhlmann (2009): killing blacks vs. whites

Online Processing

When people encounter new information they can use it to update their existing "running tally" evaluations, but they don't otherwise store the information in memory.
- When asked for their opinions, they have access to these running tallies; through them, people can hold informed opinions without remembering the information that informed those opinions.
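The running-tally idea can be sketched as a simple data structure (class and variable names are illustrative, not from any source): each new piece of information nudges a stored evaluation and is then discarded.

```python
class OnlineTally:
    """Running-tally evaluation: keeps a summary score, not the evidence."""

    def __init__(self):
        self.tally = 0.0
        self.count = 0

    def update(self, valence: float) -> None:
        # Fold the new information into the running average,
        # then "forget" it (only the tally is stored).
        self.count += 1
        self.tally += (valence - self.tally) / self.count

evaluation = OnlineTally()
for news_valence in [+2.0, -1.0, +3.0]:  # hypothetical news items
    evaluation.update(news_valence)

# The opinion is available even though the items themselves were not stored.
print(evaluation.tally)  # 1.333...: the running average of the three items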

Lodge (1985)

Can democracy work if people don't know much? He posited that it can, because people keep online running tallies even as they forget the underlying factual information.

Cohen (2003)

did a study where he brought liberal Democrats and conservative Republicans into a lab.
- They read about a social welfare policy and were told either that Democrats supported it, that Republicans supported it, or nothing about who supported it. The policy itself was either liberal or conservative.
- Results: the party information, not the policy content, determined support for the policy. Liberal Democrats supported whichever policy they were told was Democratic; conservative Republicans supported whichever policy they were told was Republican.
- Participants were asked questions to make sure they had actually read the policy.
- He concluded that partisan groups define the meaning of reality and that people conform to their parties because those standards help them understand right and wrong.

Munro and Ditto (1997)

did a study where they brought in people with either high or low prejudice against homosexuals. They read both a prejudice-consistent article and a prejudice-inconsistent article and rated: 1. the quality of the studies' methodologies; 2. their level of anxiety, happiness, and prejudice against homosexuals; 3. whether they perceived their attitude about homosexuals as having changed after reading each study.
- Results: participants rated the methodology that confirmed their viewpoint as better (biased assimilation). This was accounted for by participants feeling lower anxiety and higher happiness after reading the viewpoint-consistent study.

Campbell, Converse, Miller, & Stokes (1990)

did a study which showed that party identity is a "perceptual screen" through which political information is filtered. - The screening varies based on how strongly you are affiliated.

Ditto and Lopez

found that people demand a higher standard of evidence for preference-inconsistent information than for preference-consistent information.
1. People were told about a fictitious disease called "TAA deficiency", which predicts susceptibility to pancreatic disorders. They were given a self-administered saliva test which indicated that all participants had the deficiency.
- Results: people made to think they had TAA deficiency both rated the condition as less serious and took longer to determine that the test was complete.
2. In a third study, all participants were told they had a rare enzyme condition and were told either that the condition was harmful or that it was beneficial.
- Those told it was harmful generated more reasons the test could be invalid.

Campbell (1960)

gave the classic view that people's policy preferences do not strongly influence their vote choice; rather, people vote on the basis of party identity.
- Party identity is a social attachment.
- Party attachments are the result of running-tally evaluations of the party in general, which people then adjust over time.

The Polarizing Effect

people end up becoming more extreme when faced with information from two sides; the evidence for this effect is mixed

Motivated Reasoning

reasoning that is biased toward reaching preference-consistent conclusions - some scholars argue that this often characterizes political thinking

Zaller and Feldman (1992)

said three things with regard to the Theory of the Survey Response: 1) Many people give unstable responses to political items over time. 2) Changes in question wording and order affect responses. 3) Those high in political knowledge give the most stable responses across time and across experimental conditions.
- They argue that people, especially those moderate to low in political knowledge, are conflicted about many political matters: they possess a store of often inconsistent considerations (i.e., beliefs or sentiments they hold or have been exposed to) regarding a political issue. When asked a political question, people form an opinion based on the considerations that are most accessible at the time. This fits memory-based processing: for these people, response instability is due to different considerations being accessible at different times, and question-order effects could be due to previous content bringing a particular consideration to mind.

Taber and Lodge (2006)

showed that partisans, especially high-knowledge partisans, employ various information-processing mechanisms to uphold their pre-existing views.
- People rated their attitudes on affirmative action or gun control, measured with multi-item composites; political knowledge was also measured.
- They sat in front of a computer with the opportunity to read 8 of 16 possible arguments; a label indicated the position of each argument. The computer recorded the order and viewing time for each argument. People rated the strength of each argument and recorded their thoughts while reading.
- Results: those with high knowledge rated attitude-congruent arguments as stronger and selected more attitude-congruent than incongruent arguments to read (confirmation bias). Those same people spent more time reading attitude-incongruent arguments, but this time was spent generating criticisms of those arguments ("disconfirmation bias"). People's attitudes became more polarized, especially to the extent that they displayed a disconfirmation bias!

Uhlmann (2009)

showed that people use moral principles to rationalize moral choices they make for other reasons; the idea is that moral principles are sometimes selected to rationalize judgments.
- Participants faced a situation where they could either sacrifice one person to save others (the consequentialist response) or sacrifice no one because killing is always wrong (the deontological response).
- The experimental conditions: sacrifice one black person to save 100 white people, or sacrifice one white person to save 100 black people.
- Results: conservatives were equally likely to choose either option regardless of condition, but liberals chose the consequentialist response when sacrificing a white person to save black people and the deontological response when sacrificing a black person to save white people. The liberals did not want to appear racist, so they chose whichever moral principle justified the "not racist" choice.

McCann (1997)

showed that the decision to support a candidate leads one to shift one's values in a direction consistent with the candidate's values.
- Findings: 1. Clinton supporters became more egalitarian and less traditional over the study period. 2. Bush supporters became less egalitarian and more traditional over the same period.
- This led voters to adjust their views on abortion and social welfare spending in accordance with these values.

Disconfirmation Bias

when high-knowledge partisans counterargue attitude-incongruent arguments rather than genuinely considering the other side

Confirmation Bias

when high-knowledge partisans seek out information from the side they are already on

Memory Based Processing

when people encounter new information they store some of it in memory, and when asked for their opinions they don't have ready access to an already-formed opinion. Instead they access whatever considerations they can and form an opinion on the spot
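In contrast with the running tally, memory-based processing can be sketched as sampling whichever stored considerations happen to be accessible at interview time (a toy model in the spirit of the Zaller and Feldman account above; all numbers and names are hypothetical). It naturally produces response instability, since different considerations surface on different occasions.

```python
import random

random.seed(3)

# A store of inconsistent considerations about one issue (+ = support).
considerations = [+2.0, -1.5, +0.5, -2.0, +1.0]

def survey_response(store, k=2):
    # Form an opinion on the spot from whichever k considerations
    # happen to be accessible when the question is asked.
    accessible = random.sample(store, k)
    return sum(accessible) / k

# The same person, asked the same question twice, can give
# different answers, even though the underlying store is unchanged.
print(survey_response(considerations))
print(survey_response(considerations))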

