Chapter 2 - Mindset

What does it mean exactly to seek out evidence that favors an alternative view? Two questions:

"What would things look like if the alternative view were true?" -This question helps balance our natural tendency to focus on how things would look if our initial view were true [12]. We often fail to notice that the evidence fits equally well with some alternative views, because we don't really ask what we'd expect to see if those views were true. "Which observations don't fit quite right with my first or favored view? " -An open search for evidence means paying attention to facts that stick out, that don't seem to fit quite right with our favored hypothesis. Those are the facts that we are likely to learn most from, because sometimes even a theory we are confident in will unravel if we pull on the thread that sticks out

A helpful analogy

A helpful analogy here is to consider the relationship between a map and the territory that it represents. An accurate map is one that represents a road as being in a certain place only when there actually is a road in that place. And the more accurate the map, the more closely its marks match the actual distribution of objects in the territory. So if our goal is to draw an accurate map, we can't just draw a road on it because we wish there were a road there. Likewise, when we're genuinely curious, we aren't secretly hoping to arrive at a particular belief. We want our beliefs to reflect the world, the way an accurate map reflects its territory.

Feeling that you are being far too generous to the alternative view...

And overall, in studies where people were given the opportunity to look at information on both sides of controversial issues, they were almost twice as likely to choose information that supported their pre-existing attitudes and beliefs [10]. Since this is our natural tendency, restoring balance requires a deliberate search for facts that may support alternative views, as well as problems with our own view. And that search, if it is really fair, will tend to feel like we are being far too generous to the alternative view.

As a result

As a result, it feels like we keep encountering strong evidence for our view, and weak evidence for alternatives. Naturally, this makes us even more confident.

Division of the process of reasoning into 3 stages

At the search stage, we identify a range of possible views, as well as potential evidence for them. At the evaluation stage, we assess the strength of the evidence we've identified. At the updating stage, we revise our degrees of confidence appropriately.

Accurate beliefs

At this point, it's worth taking a step back and reflecting on what exactly the goal of curiosity is. What does it mean to have accurate beliefs? The simple answer is that the more accurate our beliefs, the more closely they reflect the way things actually are. So the goal is to believe things only if they match reality. For example, to believe that the cat is on the mat only if the cat is actually on the mat—and so on for all of our other beliefs.

biased evaluation

Evaluating the strength of potential evidence in a way that's influenced by our initial view—called biased evaluation—is one of the elements of confirmation bias, and it occurs whether or not our initial belief is motivated. For example, suppose there is an election coming up and my first inclination is to think that a certain political party will win. Even if I don't care who wins, the outcome that seemed plausible to me at first is the one I'll focus on. Sources of evidence that support that view will tend to seem right, since they agree with what I already think. And things only get worse if I really want that political party to win. In that case, the belief that they will win is not only my first view but also my favored view, so I will actively seek out problems with sources of evidence that might undermine it.

The most ubiquitous and harmful of all the cognitive pitfalls:

Confirmation bias

Curiosity

Curiosity in our sense is not just a matter of being interested in a topic, but of wanting to discover the truth about it.

3 most important aspects of good reasoning

Curious. The goal is for our beliefs to reflect how things really are; this is best achieved when our confidence in a belief matches the strength of the support we have for it.

Thorough. It takes patience and effort to push past what initially seems true and perform a thorough search of alternative possibilities and potential evidence.

Open. This means evaluating evidence impartially, considering weaknesses in our initial view, and asking what we would expect to see if alternative views were true.

Keep this in mind when revising your beliefs in light of newfound information

Here again it can help to remember that often the best response to some new evidence is only a gentle revision of our degree of confidence. The language of "accepting" or "giving up" beliefs fits with a binary picture of beliefs. Instead, we can have a degree of confidence anywhere from being certain that a claim is false to being certain that it's true. Revising an old belief that we discover has little support can simply be a matter of slightly decreasing our confidence.

Search for possibilities

In a wide variety of examples, the natural tendency is to think of only one or two explanations for events when we are not prompted. As a result, we tend to be far too confident that one of those explanations is correct, simply because we have not considered the full range of explanations. When prompted to generate more alternative explanations, our probability estimates become much more accurate. (Bad weather is considered the primary cause in less than 10% of commercial jet crashes, by the way.)
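To see the mechanism with a toy example (the numbers here are invented purely for illustration): suppose five explanations are genuinely live and deserve probabilities of 0.35, 0.25, 0.20, 0.10, and 0.10. If only the first two come to mind, we implicitly split all of our confidence between them:

$$P(\text{first explanation}) \approx \frac{0.35}{0.35 + 0.25} \approx 0.58$$

so an explanation that deserves only 35% confidence ends up feeling like a strong favorite at nearly 60%.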

What does good reasoning require we do?

In contrast, good reasoning requires that we evaluate potential evidence on its own merits. This means keeping the following two things separate:

-our prior degree of confidence in a claim
-the strength of potential evidence for that claim

This is called decoupling.

Why is it easy to forget the first stage?

It can be easy to forget the first stage. Even if we're completely impartial in our evaluation of evidence for alternative views, there may be alternative views that haven't occurred to us, or evidence that we've never even considered. In that case, our reasoning will be incomplete and likely skewed despite our fair evaluation of the evidence we do identify. We need a thorough search at the outset.

Two quick examples

Learning that I have two cookies is evidence that I have more than one cookie. It may sound odd to put it this way, because the evidence in this case is absolutely conclusive—but in our broad sense, a conclusive proof still counts as evidence. At the other extreme, we can also get extremely weak evidence. For example, in a typical case, the fact that it's cloudy out is at least some evidence that it's going to rain. It might not be enough to make us think that it will probably rain. But however likely we thought rain was before learning for sure that it's cloudy out, we should think that rain is at least a bit more likely when we learn that it's cloudy out.

How would I have treated this evidence if I had the opposite belief?

Rather than imagining that the evidence came in another way, this question requires me to imagine that my beliefs had been different. This simple change of frame has a huge impact on our reaction to potential evidence, making us more aware of and receptive to evidence that supports the opposing view. Arguing the opposite point, or listing the weaknesses of the claim we believe, reduces bias.

Stanford study example

Recall the Stanford experiment in which students evaluated studies about whether capital punishment deters potential murderers. As we saw, the two sides looked at the very same material and found reasons to strengthen their own views. Even when students were instructed to be objective and unbiased in their evaluations, this did not help at all. However, a follow-up study found one intervention that completely erased their biased evaluation: when they were told to ask themselves at every step whether they "would have made the same high or low evaluations had exactly the same study produced results on the other side of the issue" [22]. The simple mental act of imagining that the study was confirming the opposite view made them look for flaws with equal intensity in both studies, rather than applying selective standards.

The bias blindspot

So we just need to remember to evaluate potential evidence on its own merits, right? Unfortunately, it turns out this doesn't help much. In various studies about how people evaluate evidence, subjects have been instructed to be "as objective and unbiased as possible", to "weigh all the evidence in a fair and impartial manner", or to think "from the vantage point of a neutral third party". None of these instructions make much difference: in fact, in some studies they made things worse. People don't actually decouple even when they are reminded to.

So what does it mean to be accurate with beliefs like these?

Suppose I'm fairly confident that there's a road through the hills, but actually there isn't one. Then I'm wrong. But I'm not as wrong as I would have been if I'd been certain that there's a road through the hills. Or consider the example of weather forecasting. One weather forecaster predicts rain tomorrow with 90% confidence, and another predicts rain tomorrow with only 60% confidence. If it doesn't rain tomorrow, there's a sense in which both are wrong. But the lack of rain counts more strongly against the accuracy of the first forecaster than it does against the second one.
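One standard way to make this graded notion of wrongness precise is a quadratic scoring rule, often called the Brier score. The text itself doesn't name a formula, so this is offered only as an illustrative assumption: let $c$ be the stated confidence in rain and let $o = 1$ if it rains and $o = 0$ if not. The penalty is

$$\text{penalty} = (c - o)^2, \qquad (0.9 - 0)^2 = 0.81 \quad \text{vs.} \quad (0.6 - 0)^2 = 0.36,$$

with lower penalties meaning better accuracy. The dry day therefore counts more heavily against the 90% forecaster, just as the passage says.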

Why does considering the opposite work?

The answer is that people don't know what "being unbiased" means in practice. We think it means feeling unbiased—and we already feel unbiased! But telling people to consider the opposite tells them how to be unbiased. Reasoning in an unbiased way is not a feeling at all; it's a set of mental activities. The mental exercise of considering the opposite might feel like a trick—but it's a trick that actually works to short-circuit the unnoticed bias in our evaluation.

How much searching is enough?

The answer should depend on the importance of the issue, and not on whether we happen to like our current answer. It can be easy to have a very low standard for adequate search when we are hoping to retain our current belief, and a much higher one when we are hoping to find evidence that can supplant it.

What is the next step?

The next step is to start consciously thinking in degrees of confidence. The trick is to let go of the need to take a side, and to start feeling ok with simply being unsure when there isn't enough support to be confident. It's also ok to just suspect that one view is correct, without the need to consolidate that suspicion into an outright belief. We can take time to gather evidence and assess the strength of that evidence. It can be somewhat liberating to realize that we don't need an outright opinion about everything. In a world full of brash opinions, we can just let the evidence take our confidence wherever it leads—and sometimes that's not very far.

Why is it so hard to alter our beliefs?

The psychologist Robert Abelson has noted that we often speak as though our beliefs are possessions: we talk about "holding", "accepting", "adopting", or "acquiring" views. Some people "lose" or "give up" a belief, while others "buy into" it. Also like possessions, we inherit some beliefs in childhood and choose others because we like them or think others will approve of them. Like one's possessions too, "one shows off one's beliefs to people one thinks will appreciate them, but not to those who are likely to be critical." Of course, if our beliefs become unfashionable, that might be a reason to give them up. But we are reluctant to change our major beliefs: "They are familiar and comfortable, and a big change would upset the whole collection" [26]. If, at some level, beliefs feel like possessions to us, then it's understandable why we're defensive when they're criticized. It also explains the phenomenon of belief perseverance, because giving up a belief feels like losing a possession.

Search for evidence

The second component of the search stage, once the available views have all been considered, is the search for potential evidence that can support them. Failure to search thoroughly and evenly for evidence is a major pitfall that hinders accuracy.

Simple solution

The simple solution—if we're considering an issue that matters—is to make the effort to think of as many alternatives as we can. (Of course, the hard part is actually noticing that we may not have considered enough alternatives.) And it's best to not only list alternatives but to linger on them, roll them around in our minds to see if they might actually be plausible.

How can this tendency be diminished?

The tendency to selectively apply a threshold for "enough evidence" can be partially diminished by thinking in terms of degrees of confidence. The goal is not to reach a point where we have enough evidence that we are permitted to believe something (if we want to), or to avoid having so much evidence that we are obligated to believe it (if we don't). The goal is simply to adjust our degree of confidence so that it matches the strength of the evidence.

2.3 Open

The third element of the right mindset for reasoning is being genuinely open. This means being open to evidence that supports alternative views. It also means being open to revising our own beliefs, whether they are initial reactions or even considered views. As we will see, research indicates that being consciously open in this way can help to overcome confirmation bias and motivated reasoning.

How do degrees of confidence help us when we are genuinely curious?

This conception of accuracy fits with our goals when we are genuinely curious. When we really want to get things right, we don't allow ourselves to feel confident in a claim unless we have sufficiently strong evidence for it. After all, the more confident we are in something, the more wrong we could be about it! This means we need higher standards of evidence to support our confident beliefs. When our goal is accuracy, we only get more confident when we get more evidence, so that our degree of confidence matches the strength of our support.

Why isn't IQ a good measure of reasoning ability?

This idea is strongly supported by cognitive psychology. In fact, the kind of intelligence measured by IQ tests doesn't help people much with certain cognitive pitfalls, such as confirmation bias. Even someone with a great deal of intelligence can simply use that intelligence as a kind of lawyer to justify their pre-existing opinions to themselves. In contrast, reasoning well requires the ability to identify which reasons for holding a belief are strong, and which are not—even when it comes to our own beliefs.

Evidence

What we mean by "evidence" for a claim is anything we come to know that supports a claim, in the sense that it should increase our degree of confidence in that claim. Keep in mind that sometimes a piece of evidence can provide just a little support for a claim, and sometimes it can prove a claim outright.

Stanford study example

What's striking about this study is the care with which the students examined the opposing evidence: they offered criticisms of sample size, selection methodology, and so on. They just happened to apply much stricter criteria to one side. Many subjects reported trying especially hard to be completely fair and give the other side the benefit of the doubt. They just couldn't overlook the obvious flaws of the research supporting the other side! Several remarked that "they never realized before just how weak the evidence was that people on the other side were relying on for their opinions" [19]. This "completely fair" process consistently led both sides to become more certain that they were right to begin with.

Considering the opposite

a technique to reduce biased evaluation of evidence, where we ask ourselves one of two questions: (i) How would I have treated this evidence if I had the opposite belief? (ii) How would I have treated this evidence if it went the opposite way?

Optional Stopping

allowing the search for evidence to end when convenient; this may skew the evidence if (perhaps unbeknownst to us) we are more likely to stop looking when the evidence collected so far supports our first or favored view. The tendency, then, is to keep looking for evidence when we want to resist accepting a belief, and to stop looking when we want to stick with our current belief. In other words, we tend to have a different threshold for what counts as "enough evidence" depending on our motivations: we require less evidence for things we want to believe than for things we do not.

The accuracy of our beliefs depends on what?

how confidently it represents things as being a certain way, and whether things actually are that way. The more confidence we have in true beliefs, the better our overall accuracy. The more confidence we have in false beliefs, the worse our overall accuracy.

updating on the evidence

revising our prior beliefs in response to new evidence, so that the confidence we have in a belief will match its degree of support.
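The text doesn't commit to a formal update rule, but Bayes' rule is the standard mathematical model of this kind of revision. Here is a minimal sketch in Python; the cloudy-sky numbers are invented purely for illustration:

```python
def update(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' rule: revise confidence in a claim after seeing evidence.

    prior -- degree of confidence in the claim before the evidence.
    p_evidence_if_true / p_evidence_if_false -- how expected the evidence
    is when the claim is true vs. when it is false (the evidence's strength).
    """
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

# Illustration only: start 30% confident in rain; suppose clouds show up
# 90% of the time before rain but 50% of the time otherwise.
posterior = update(prior=0.30, p_evidence_if_true=0.90, p_evidence_if_false=0.50)
print(f"Confidence after seeing clouds: {posterior:.2f}")  # -> 0.44
```

Note how weak evidence yields only a gentle revision, from 0.30 to about 0.44, matching the earlier point that updating is usually a matter of nudging our confidence rather than flipping a belief on or off.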

Decoupling

separating our prior degree of confidence in a claim from our assessments of the strength of a new argument or a new piece of evidence about that claim. Decoupling does not mean we set aside our prior beliefs forever! After assessing the strength of a new piece of potential evidence, we will weigh it against our previous reasons for holding the belief.
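In Bayesian terms (an illustrative framing, not the text's own), decoupling amounts to judging how expected the evidence $E$ is under the hypothesis $H$ and under its negation without letting the prior leak into those judgments; the prior re-enters only when the pieces are combined:

$$P(H \mid E) = \frac{P(H)\, P(E \mid H)}{P(H)\, P(E \mid H) + P(\neg H)\, P(E \mid \neg H)}$$

Here $P(E \mid H)$ and $P(E \mid \neg H)$ capture the evidence's strength on its own merits, while $P(H)$ carries our previous reasons for holding the belief.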

introspection illusion

the misguided assumption that our own cognitive biases are transparent to us, and as a result, that we can diagnose these biases in ourselves through introspection.

restricted search

the tendency not to seek out the full range of alternative views or the full range of evidence that favors each view. Along with biased evaluation, restricted search is an instance of confirmation bias.

possibility freeze

the tendency to only consider one or two possibilities in detail, and thereby end up too confident that they are correct. For example, suppose I am wondering whether my favored candidate will win an election in a two-party race. I might just think about the two most obvious possible outcomes, where one of the candidates straightforwardly gets the majority of votes and then wins. But there are other possibilities as well: a candidate could bow out of the race or become sick, or (in a U.S. presidential election) one candidate could win the popular vote while the other wins the electoral college, or an upstart third-party candidate could spoil the vote for one side. If I don't consider the full range of possible outcomes, I am likely to overestimate the two most obvious ones.

Degrees of Confidence

treating beliefs as coming with different levels of certainty. Just as we can be absolutely certain that x is true, we can also think that x is probably true, that x might be true, or that x is probably not true. For example, suppose we are thinking about planning a bike trip through the hills. If there is no road through the hills, we risk getting lost or being unable to complete the trip. So, we need to be pretty confident that there is a road before taking the action of planning or embarking on the trip. If we are not sufficiently confident, we can gather more evidence until we are confident enough, one way or the other.
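As a toy sketch of "gather evidence until we are confident enough, one way or the other": the thresholds and evidence strengths below are invented for illustration, and the update function is the Bayes' rule sketch from earlier.

```python
def update(prior, p_e_if_true, p_e_if_false):
    # Same Bayes' rule as in the earlier sketch.
    num = prior * p_e_if_true
    return num / (num + (1 - prior) * p_e_if_false)

PLAN_TRIP, CALL_IT_OFF = 0.80, 0.20  # confidence thresholds for acting

confidence = 0.50  # start unsure whether there is a road through the hills
# Each pair: how expected that piece of evidence is if there is / isn't a road.
for p_if_road, p_if_no_road in [(0.70, 0.40), (0.80, 0.30)]:
    confidence = update(confidence, p_if_road, p_if_no_road)
    if confidence >= PLAN_TRIP or confidence <= CALL_IT_OFF:
        break  # confident enough either way; stop gathering evidence

print(f"Confidence that there is a road: {confidence:.2f}")  # -> 0.82
```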

binary belief

treating beliefs as if they are on/off. For example, we either believe that the cat is on the mat or that the cat is not on the mat, without allowing for different degrees of confidence. This fits the map analogy: most maps don't have a way of indicating that there's probably or possibly a road through the hills.

Our defensiveness is not

under our control, since it comes from our System 1. The elephant requires training if it is actually going to have a different mindset. This involves learning how to feel differently.

Why is this?

we honestly think we're already being unbiased. Confirmation bias happens to other people, or maybe to me in other situations. More generally, those who know about a cognitive bias rarely think in the moment that they are biased, even when they are in precisely the circumstances that give rise to the bias. This effect is known as the bias blindspot.

If we encounter facts that fit our beliefs

we immediately have a good feeling about them. We expect them, and we take them to provide strong support for our beliefs.

But if we encounter facts that fit better with alternative views,

we tend to ignore them or consider them only weak support for those views.

If it feels like we're fairly evaluating potential evidence but somehow our favored beliefs always remain untouched, then

we're not really being open to alternative views. For example, if we've never changed our minds on any important issue, it's worth asking ourselves: what are the odds that we happened to be right on all of these issues, all along?

