12PHIL Thomas Nagel: "What is it like to be a bat?"
Why, on Nagel's view, does it seem impossible to give a physical, objective explanation of the subjective character of experience?
According to Nagel, it seems impossible to give a physical, objective explanation of the subjective character of experience because mental events are tied to single, specific, first-person, wholly subjective experiences, and that is precisely what an objective, physical account, by its very nature, leaves out. In standard form:
P1. Any complete theory of mind must be able to capture and explain the subjective character of experience
P2. The subjective character of experience is essentially connected with a single point of view (and, so, an objective characterisation or explanation of it is impossible)
P3. Any objective, reductionist physicalist theory of mind would, by its very nature, have to abandon (and so, exclude) the single point of view it was aiming to explain in favour of an objective point of view
_______________________________________________
C. Therefore, any objective, reductionist physicalist theory of the mind is incomplete
What is the relevance of Nagel's 'conscious aliens' example? How does this connect more broadly to the 'problem of other minds'?
According to Nagel, unlike in the case of bats, in which we might at least share various general types of experience (e.g., hunger, fear, pain, lust), in the case of conscious aliens, we may not even share these general types of experience (indeed, there is no reason to suppose that an alien would necessarily have these types of experience at all). What's more, Nagel urges us to see that this is not a problem limited to bats and aliens, but exists even between human beings - recall the 'problem of other minds.' We each have privileged access to our own mind and, try as anyone might, no one can gain access to your subjective experiences - the what it feels like to be you. Nor, if we take the problem further, could we be certain that anyone even has subjective experiences in the way that we, ourselves, do. Hence, we can never truly know what-it-is-like to be a bat, a conscious alien, or even another human being.
What are the three reasons why Nagel chooses bats as his central example?
(1) Bats have conscious experience / there is something it is like to be a bat. (2) Bats aren't too far down the phylogenetic tree and, hence, a belief that they have conscious experience won't be too controversial. (3) The activity and sensory apparatus of bats are so different from our own that the problem Nagel wishes to pose is exceptionally vivid.
What is the significance of the "blind alien" and a rainbow, lightning or clouds examples? Why does Nagel believe that objectivity and subjectivity are matters of degree?
According to Nagel, we can imagine that a blind alien might be able to give an "objective" account of a rainbow, or lightning, or clouds. Even though the concepts of these things might be linked to the human point of view and human experiences, these things ALSO possess objective features. For Nagel, objectivity and subjectivity are matters of degree. We might consider them to be poles on a continuum. He argues that in the case of something like lightning, there is no reason not to see it, as far as possible, as an objectively observable occurrence. However, in the case of experience, the connection to a particular point of view is MUCH closer.
Does Nagel conclude that materialism is necessarily false?
No. Nagel, instead, is simply aiming to show that IF a materialist hypothesis begins with a faulty or incomplete analysis of mental events (e.g., it fails to include the subjective character of experience or the what-it-is-likeness of experience), THEN we cannot conclude anything from it with confidence.
What is Nagel's 'concluding proposal'?
According to Nagel, we should temporarily put aside the question of the relationship between the mind and the brain, and aim, instead, to first achieve a more objective understanding of the mental in its own right. To achieve this, he says, we will need new concepts and a new method: an "objective phenomenology" which does not depend on taking the point of view of the subject of an experience. Our aim would be to describe the subjective character of experience in a way that could be understood by those not capable of such an experience. [Remember: Phenomenology is the study of structures of consciousness as experienced from the first-person point of view.] For example, we would need such a method to describe the experience of being a bat. But we could begin with humans; for example, how would we explain to someone blind from birth what it is like to see? We might, it is true, eventually reach a "blank wall," but Nagel thinks we should at least pursue this idea as far as we can. He thinks it wouldn't take much to go further with this than we have so far, possibly by giving objective descriptions of structural features of perception. On Nagel's view, this kind of objective phenomenology would enable questions about the physical basis of experience to make more sense.
What, on Nagel's view, is the single necessary and sufficient condition for an organism to have conscious mental states?
According to Nagel: "an organism has conscious mental states if and only if there is something that it is like to be that organism - something it is like for the organism." Nagel terms this 'what-it-is-like-ness' the "subjective character of experience."
What is 'the hard problem of consciousness' and how does Nagel understand the problem?
Coined by David Chalmers in 1995, the hard problem of consciousness is the problem of explaining the relationship between physical phenomena, such as brain processes, and experience (i.e., phenomenal consciousness, or mental states/events with phenomenal qualities or qualia). Nagel sees the hard problem of consciousness as turning on the "subjectivity" of conscious mental states. He argues that the facts about conscious states are inherently subjective—they can only be fully grasped from limited types of viewpoints. However, scientific explanation demands an objective characterisation of the facts, one that moves away from any particular point of view. Thus, the facts about consciousness elude science and so make "the mind-body problem really intractable." Nagel argues for the inherent subjectivity of the facts about consciousness by reflecting on the question of what it is like to be a bat—for the bat. It seems that no amount of objective data will provide us with this knowledge, given that we do not share its type of point of view (the point of view of a creature able to fly and echolocate). Learning all we can about the brain mechanisms, biochemistry, evolutionary history, psychophysics, and so forth, of a bat still leaves us unable to discover (or even imagine) what it's like for the bat to hunt by echolocation on a dark night. But it is still plausible that there are facts about what it's like to be a bat, facts about how things seem from the bat's perspective. And even though we may have good reason to believe that consciousness is a physical phenomenon (due to considerations of mental causation, the success of materialist science, and so on), we are left in the dark about the bat's conscious experience. This is the hard problem of consciousness on Nagel's view.
What is functionalism and why does Nagel reject it?
Functionalism reduces a thought, desire, pain, or any other type of mental state solely to how it functions (or the role it plays) within a larger system of inputs and outputs. In this way, it completely sidesteps the question of the exact substance that a mental state might consist in. Hence, on the functionalist's view, mental states are MULTIPLY REALISABLE - the same mental state can be 'realised by' different kinds of physical things. For example, the mental state of pain can be characterised in terms of its function and it is likely that many different kinds of physical properties (e.g., a human brain, a computer, a unique organ in an alien, etc.) could give rise to this mental state. The functionalist view is compatible with materialism and has proven useful in the development of artificial intelligence. On Nagel's view, robots/automata could, in principle, have a 'mental state' as the functionalist understands it, without necessarily having the subjective experience (i.e., the phenomenal / 'what it is like' / subjective character).
What might Nagel's position be in relation to Occam's razor?
Given that Nagel claims that "philosophers share the general human weakness for explanations of what is incomprehensible in terms suited for what is familiar and well understood, though entirely different" and, what's more, that this "has led to the acceptance of implausible accounts of the mental," it might be reasonable to suppose that he would not necessarily use Occam's razor, with its penchant for the simple, as the best guide to resolving the mind-body problem. Indeed, a complicated phenomenon such as consciousness may well call for an equally complicated explanation.
Why does Nagel argue that experiences may be granted some objective existence (i.e., he opts not to place experiences at the "entirely subjective" end of the objective-subjective continuum)?
How, Nagel asks, would it even make sense to speak of an "objective" character of experience, separate from the point of view of the subject who had the experience? For example, what would remain of what it is like to be a bat if we removed the bat's point of view? Yet, if experience did not have, in addition to its subjective character, SOME objective components that could be observed by outsiders, then how could we say that an alien - or even a human neurophysiologist - could study mental processes? It seems, then, argues Nagel, that experiences may be granted SOME objective existence.
How does Nagel's discussion of subjectivity vs. objectivity highlight a problem for the mind-brain reductionist (e.g., Smart)?
In other types of reduction in science, whose accuracy Nagel does not question, we move towards greater objectivity and away from a human point of view. For example, 'water' becomes 'H2O' and 'lightning' becomes 'electrical discharge.' In these cases, members of different species, for example, could refer to a common reality, leaving behind only their species point-of-view (i.e., the "reduction can succeed only if the species-specific viewpoint is omitted from what is being reduced"). However, it does not seem as though we can follow this pattern - from subjective appearance to objective reality - when it comes to experiences. It does not seem that we can get to some kind of 'underlying objective reality' by abandoning the human point of view and, instead, taking up a characterisation of our experiences that even aliens could grasp. This would seem to take us farther from - rather than closer to - the real nature of human experience. Nagel continues his criticism of contemporary philosophers of mind, whom he sees as trying to substitute an objective or behavioural account of mind for the "real thing" so that there is nothing left over that cannot be reduced. If we believe a physicalist theory of mind should be able to account for the subjective character of experience, then we should admit that there is no theory that currently does this. Even if we find it plausible, or even probable, that mental processes are physical brain processes, it remains true that there is something it is like to experience such brain processes. And the 'what it is like' remains unexplained.
How does the mind-body problem intersect with the objective/subjective distinction on Nagel's view?
In terms of the mind-body problem and his distrust of reductionism, Nagel believes he has established that facts about the subjective character of experience can be known only from a particular point of view. Therefore, Nagel says, it is hard to see how they could be observed in the physical processes of an organism. Physical facts about an organism are objective. They can be understood from many different points of view by different types of organisms. We don't have to be bats to understand the neurophysiology of a bat, for example, and it is conceivable that a non-human could have an objective understanding of human neurophysiology. This distinction between the subjective and objective point of view is NOT intended, by itself, as an argument against reductive theories of the mind (e.g., Smart's identity theory) on Nagel's view, however.
What is intentionality and why does Nagel reject it?
Intentionality refers to the idea that mental states (e.g., thoughts, desires, concepts) are ABOUT something, or stand for things that are located externally in the physical world. Intentionality thus marks a distinction between the mental and the physical: mental states have an "aboutness" but physical states do not - the latter just "are." This is why the notion of intentionality, sometimes seen as a key feature of (or even as synonymous with) consciousness, has been considered to serve as a challenge to materialism. According to Nagel, however, robots/automata could, in principle, have a "mental state" that could be described as "about" something (i.e., a 'mental state' that was about, directed towards, represented, or stood for something; the mental representation could have "contents" so to speak), and yet still not have the subjective experience (i.e., the phenomenal / 'what it is like' / subjective character).
What is the significance of Nagel's example of the "person deaf and blind from birth"?
It is impossible for us to know what it is like (from a first-person perspective) to be deaf or blind from birth. Equally, a person deaf or blind from birth would find it impossible to know what it is like (from a first-person perspective) to be a non-sensory-impaired person. Even so, Nagel argues, that does not prevent us from thinking that there IS something it is like to be that other person. Similarly, bats and aliens would struggle to know what it is like to be a human. But the alien would be mistaken in supposing that since it cannot imagine what it would be like to be us, that humans do not, therefore, have conscious experience. Likewise, we would be mistaken in thinking that because we cannot imagine what it would be like to be a bat, that bats do not, therefore, have conscious experience.
Nagel argues that consciousness also cannot be reduced to functional brain states or intentional states. What is Nagel's reason for thinking so?
Nagel argues that "...since these [functional or intentional states] could be ascribed to robots or automata that behaved like people though they experienced nothing." Nagel is referring here to the theories of functionalism and intentionality, both of which have been offered as ways of accounting for the peculiar nature of the mind and consciousness. It should be noted that Nagel doesn't deny that mental states can be given functional or intentional characterisations, rather, he denies that such characterisations exhaust their analysis. In other words, whilst functionalism and intentionality may offer a partial explanation of the mind, these reductive theories - just like Smart's identity thesis - fail to properly capture and explain a key feature of the mind/consciousness, namely, the subjective character of experience.
What is Nagel's overarching thesis?
Nagel argues that a "careful examination will show that no currently available concept of reduction is applicable to [consciousness]. Perhaps a new theoretical form can be devised for the purpose, but such a solution, if it exists, lies in the distant intellectual future."
Summary of Thomas Nagel's "What is it like to be a bat?"
Nagel argues that the extraordinary phenomenon of consciousness poses a major challenge to reductionist, materialist theories of mind. He does not deny that there might turn out to be some way of reducing consciousness to the physico-chemical processes of the brain, but he urges us to see that we are a long way from knowing how this might be done. Indeed, as Nagel points out, what remains distinctive about consciousness, no matter to what degree of detail we ever manage to map it to physical states, is its subjective, first-person quality. For any organism that has consciousness - be it human or non-human, Nagel argues, "there is something that it is like to be that organism - something it is like for that organism." For Nagel, if consciousness is, by its very nature, a SUBJECTIVE phenomenon - in a way that is so for no other phenomenon we know of in the universe - then it is surely impossible to analyse it completely within the same terms as OBJECTIVE physical phenomena. If we take the example of a bat - a creature very different from us in a host of ways - it seems impossible that we could ever offer an OBJECTIVE account of the SUBJECTIVE experience of "what it is like for a bat to be a bat." Nagel does not attempt to disprove the physicalist claim that mental states are nothing over and above physical brain states and processes. Rather, he simply argues that the two terms of this alleged identity are so different, and our grasp of what we are actually comparing is so limited, that we lack grounds to make the physicalist claim with ANY kind of confidence.
Nagel claims that the subjective character of experience - the 'what it is like' (or phenomenal consciousness as philosophers often call it) - "is not captured by any of the familiar, or recently devised" reductive theories of mind that have been put forward to date. What is his reason for thinking so?
Nagel believes that the recently devised reductive physicalist theories are "logically compatible" with the "absence" of consciousness. In other words, consciousness could be removed from the physicalist's conception of the mind (e.g., Smart's) altogether without the theory being altered in any way.
Nagel rejects behaviourism as providing any kind of adequate reductive account of consciousness. What is Nagel's reason for rejecting behaviourism?
Nagel does "not deny [as the behaviourist would] that [inner] conscious mental states and events cause behaviour" he denies "only that this kind of thing exhausts their analysis." Remember, the behaviourist denies any kind of inner, first-person experience, only acknowledging its public, objectively observable demonstrations via behaviour. For this reason, the traditional behaviourist also denies that inner mental states cause behaviour - it is this idea that Nagel is referring to.
Does Nagel seek to completely dispute the claims of physicalism, functionalism, intentionality, behaviourism, or any other kind of currently available reductive theory of the mind/consciousness?
Nagel does not aim to dispute the claims of physicalism, functionalism, intentionality, behaviourism, or any other kind of currently available reductive theory - nor is he seeking to align himself with either dualism or physicalism. Instead, he is aiming to show that whatever merit reductive theories may have as a partial account of consciousness, they do not completely account for it: "I deny only that this kind of thing exhausts their analysis." This is his overarching criticism of reductive theories. To reduce mental states to something else, one ought to analyse exactly what it is that one is reducing. If one leaves out the subjective character of experience - the what-it-is-like-ness - then the exercise is incomplete and cannot claim to prove ANYTHING. In other words, Nagel says we shouldn't find reductive theories plausible if consciousness is excluded or the theory doesn't seem able to be extended to include consciousness. Moreover, he suggests that without achieving at least some idea of what the subjective character of experience IS, we can't even fathom what a physicalist theory that CAN adequately capture and explain the subjective character of experience would look like.
What is the significance of Nagel's analogy of the person "ignorant of insect metamorphosis" who is surprised to find a butterfly in place of the caterpillar? To what extent might this analogy allow us to view the flaws in the mind-body identity thesis more charitably on Nagel's view?
Nagel grants that we may have evidence for many things we do not understand. For example, you might know nothing about caterpillars, lock one away in a safe one day, and then be very surprised to find a butterfly weeks later. You may not understand what has occurred, but you would have pretty good evidence that the caterpillar turned into the butterfly. Perhaps, says Nagel, we are in this position with respect to mind-body identity. We do have evidence that sensations are physical processes, although we don't really know how it is so. Nagel cites Donald Davidson, the author of a complex theory that argues that IF mental events have physical causes and effects, THEN they must have physical descriptions. Even so, Nagel argues, we are a long way from understanding how we might form satisfactory physical descriptions of this nature.
What is Nagel's attitude towards science?
Nagel is not anti-science. Rather, he simply argues that science should not be the authority we turn to in relation to the mind-body problem. This is because science, by its very nature, focuses on the objective. Given that a defining feature of consciousness is its radically subjective nature, it seems that science is ill-equipped to provide a complete explanation.
Nagel claims that the idea that "mental states are brain states" is an incomprehensible position - one we simply "cannot understand" - because we cannot even conceive of how it might be true. What is the possible objection Nagel considers against his claim? How does he reply?
Nagel notes that the physicalist could object that "mental states are brain states" is NOT incomprehensible. We just need to know which brain states are involved. It then comes down to some kind of identity - an "is" or an "are." Shouldn't that be simple? But, Nagel replies, the little word "is" is deceptive. In statements like "X is Y" our understanding of what is meant depends on our knowledge of X and Y, rather than on the word "is." When X and Y are quite different things, we tend to be confused by such sentences. For example, if I know nothing about physics, then the statement "matter IS energy" is very confusing. Knowing what "is" means doesn't help me. Nagel compares our present understanding of the claim that "mental states are brain states" to pre-Socratic philosophers trying to understand the claim "matter is energy." Much more than the meaning of "is" or "are" is required. Moreover, Nagel contends that we don't really know how a mental and a physical term could refer to the same thing.
Why, on Nagel's view, are examples such as water is identical to H2O, or lightning is identical to electrical discharge, not helpful in the context of the mind-body problem? Why does he think philosophers are so prone to relying on examples such as these?
Nagel thinks that, for example, the idea that lightning is identical to electrical discharge and other such examples are "unlikely... [to] shed light on the relation of mind to brain." What's more, he thinks that "philosophers share the general human weakness for explanations of what is incomprehensible in terms suited for what is familiar and well understood, though entirely different. This has led to the acceptance of implausible accounts of the mental."
Is it possible, according to Nagel, for us to understand what it is like to be a bat?
No. According to Nagel, the activities and sensory system of a bat are VERY different from ours, for a bat perceives the distance, size, shape, motion and texture of things through sonar, or echolocation. Thus, we have no reason to believe that the subjective experience of bats is anything remotely like our own. We cannot simply imagine flying around at night listening to echoes and catching bugs, since our imagination draws on our OWN experience and is, therefore, limited. We would only be imagining what it would be like for US to be bats, whereas what we want to know is what it would be like for a BAT to be a bat. As long as our "fundamental structure" remains the same, says Nagel, our experiences would not be those of a bat. Some might suggest that we could adapt our neuro-physiological structure to better resemble that of a bat. But even if we could imagine being gradually transformed into a bat, there is nothing in our present make-up that would allow us to imagine what that would be like. Whilst it's true that we share various general types of experience with bats - e.g., hunger, fear, pain, lust - we cannot know the subjective character of these experiences for a bat. Indeed, we could never really know what it is like for a bat to experience lust towards another bat.
According to Nagel, there is a factor that makes the mind-body problem extremely difficult to solve - one that is ignored by materialists like Smart - and we know very little about it. What is this factor?
On Nagel's view, consciousness is what makes the mind-body problem really intractable and materialists (like Smart) tend to ignore it altogether.
According to Nagel, there are facts about bat consciousness that human beings can never understand. How might you represent this argument in standard form?
P1. Essential to conscious experiences is the fact that there is SOMETHING IT IS LIKE to have them FOR the subject who has them
P2. Bats have conscious experience
_______________________________
C1. / P3. Therefore, there is SOMETHING IT IS LIKE to be a bat FOR a bat
P4. Humans cannot ever understand WHAT IT IS LIKE to be a bat FOR a bat
_______________________________________________
C2. Therefore, there are facts about bat consciousness that humans can never understand
From his discussion of bats and conscious aliens, Nagel confronts the idea that there may be facts that simply lie beyond the reach of human concepts. What are 'conceptual schemes' and how do they relate to Nagel's position?
The notion of conceptual schemes is philosophical territory famously analysed by Immanuel Kant. Consider that much of our conscious experience concerns raw sense data. Many philosophers have argued that raw sense data alone is incomprehensible to us and we must process it in some way: it must be organised and categorised somehow so that we can understand it. Kant used the term conceptual schemes for the mental systems that organise what we sense. He thought these frameworks - including things like time and causation - must be innate, or a part of the apparatus we are born with. Nagel's line of argument in relation to the conscious experience of bats and aliens implies a belief in the existence of facts which we will NEVER have the conceptual schemes to understand. When Nagel says there are things we will never comprehend, this isn't because he thinks we'll, say, run out of time to work them out, but rather, that there are some concepts that are simply beyond our reach (e.g., the subjective character of bat consciousness), because we are not made in a way that could ever permit that understanding. Whilst Nagel grants that there is a sense in which an individual human can say that he/she knows what the experience of another human is like, as a creature becomes less like us, the task is much more challenging. Moreover, as Nagel makes clear, he is not just raising the epistemological problem that we cannot KNOW what it is like to be a bat, but the deeper problem that we cannot even form a CONCEPT of what it is like to be a bat, let alone come to know something of a bat's subjective, conscious experience.