History of Ideas

ALIENATION. The notion of alienation is a very unusual one because it is at once an attempt to explain a widespread feeling—a very subjective, somewhat indefinable feeling—and a critique of the nature of any society that regularly produces it. This was not always so. The feeling that one is not at home in the world, the sense of estrangement from one's surroundings, oneself, and other people, appears to be as old as history; for most world religions (Buddhism, most strains of Christianity and Daoism, Sufi strands in Islam) this feeling was seen mainly as reflecting a profound insight into the truth of the human condition. Hermits, monks, and meditators often actively valued or cultivated feelings of alienation as a way to something higher. Calvinism came closer to the modern conception in seeing feelings of isolation and emptiness as a sign of humanity's fall from grace, but it was really only in the nineteenth century that the modern understanding of the term came into being. This conception was closely tied to the experience of living in a vast, impersonal, industrial city. Feelings of alienation were particularly prone to strike those who in earlier generations might have been considered likely victims of melancholia: intellectuals, artists, and youth. The effects were much the same: depression, anxiety, hopelessness, suicide.

One might distinguish two main strains in the modern alienation literature. One stressed the experience itself as an unavoidable (though possibly ameliorable) effect of the impersonal, bureaucratized nature of modern life, entailing the loss of any ability to use that experience to attain some deeper, more genuine truth about the world—since with the death of God and traditional structures of authority, most of these truths were considered definitively lost. The other, drawing on older theological traditions, saw alienation as the key to the true, hidden nature of the modern (i.e., capitalist, industrial) order itself, showing it to be an intolerable situation that could be resolved only by overthrowing that order and replacing it with something profoundly different.

The first tradition can be found in social thinkers such as Alexis de Tocqueville, Émile Durkheim, or Max Weber; novelists such as Fyodor Dostoyevsky or Franz Kafka; and philosophers such as Søren Kierkegaard or Friedrich Nietzsche. Here alienation is the darker underside of all the positive values of modernity, the experience of those sundered from all previous sources of meaning: community, hierarchy, the sacred. It is the point where individualism becomes isolation, freedom becomes rootlessness, egalitarianism becomes the destruction of all value, and rationality an iron cage. Probably the most famous formulation within this genre was Émile Durkheim's (1858-1917) notion of anomie. Observing that suicide rates tend to go up during times of both economic boom and economic collapse, Durkheim concluded that this could only be because both booms and busts threw ordinary people's expectations so completely into disarray that they ended up in a state of normlessness, unable to determine what they had a right to expect or even want from life and unable to imagine a time when they could.
This kind of analysis could lead either to a resigned pessimism, the assumption (favored by social conservatives) that public life in modern society can never really be anything but alienated, or to a liberal approach that saw alienation as a form of deviance or lack of proper integration that policymakers should ideally be able to ameliorate or even overcome.

The other tradition can be traced to Georg Wilhelm Friedrich Hegel (1770-1831), who drew heavily on theological sources. For Hegel, "alienation" was a technical term, a necessary moment in the process whereby Spirit (which for Hegel was simultaneously God, Mind, Spirit, and Human Self-Consciousness) would achieve true self-knowledge. Human history involved the same story: Mind would project itself out into the world, creating, say, Law, or Art, or Science, or Government; it would then confront its creations as something alien to it and strange; then, finally, coming to understand that these alienated forms are really aspects of itself, it would reincorporate them and come to a richer self-conception as a result. Karl Marx (1818-1883) remained true to this dialectical approach but concentrated on the material creativity of work, emphasizing that under capitalism, not only the products of one's labor but one's labor itself, one's very capacity to create—and for Marx, this is one's very humanity—becomes a commodity that can be bought and sold and hence appears to the worker as an "alien force." Insofar as Marx shares Hegel's optimism and sees this dilemma as opening the way to a new, revolutionary society, all this is much in line with the older, theological conception in which alienation, however painful, is a realization about the truth of one's relation to the world, so that understanding this becomes the key to transcending it.

Twentieth-century Marxists, though, have not been so uniformly optimistic. While Marxist regimes officially claimed to have eliminated the problem of alienation in their own societies, Western Marxism, starting with György Lukács (1885-1971) and climaxing with the Frankfurt School, forced to explain the lack of revolutionary change in industrial democracies, gradually became a prolonged meditation on the varied forms of alienation (reification, objectification, fetishism, etc.) in modern life. This emphasis set the tone for an outpouring of literature on the subject in the mid-twentieth century, not all of it Marxist. France in the 1950s and early 1960s saw the emergence of a particularly rich body of alienation theory, ranging from the Existentialism of Jean-Paul Sartre and Albert Camus, which attempted to formulate an ethics for the isolated individual, to a variety of Marxist approaches, of which the most extravagant—and influential—was developed by the Situationist International, whose members saw modern consumer society as a gigantic "spectacle," a vast apparatus composed not only of media images but of market logic, the rule of experts, and the nature of the commodity form, all combining to render individuals passive and isolated spectators of their own lives. Like many of the radical art movements from which they emerged, the Situationists were dedicated to imagining ways to revolutionize everyday life itself as a way of overcoming the "living death" of capitalist alienation.
After the failed insurrection of May 1968 in France, this literature on alienation rapidly disappeared in the face of poststructuralist critiques that argued it was impossible to talk about a human subject alienated from society or from itself, because the subject was itself an effect of discourse and hence a social construct. Over the course of the 1970s and 1980s, these critiques spread outside France, and the theme of alienation has, as a result, largely disappeared from intellectual debate in the early twenty-first century. There are two main exceptions.

First, in its radical, redemptive form, the idea of alienation has remained alive in artistic and revolutionary circles largely outside the academy. Situationism, for example, is still very much at the center of the (increasingly international) anarchist and punk scenes, both of which are largely rebellions against the meaninglessness and alienation of "mainstream" urban, industrial, or postindustrial life. These themes have suddenly returned to public attention with the rise of the "antiglobalization" movement, though they have still found almost no echo in the academy.

Second, in its more liberal, ameliorative form, the idea of alienation became ensconced in certain branches of sociology and hence reemerged in what is increasingly called "postmodern" alienation theory. When American sociologists started taking up the theme of alienation systematically in the 1950s and 1960s, they began by making it into a factor that could be quantified. Various questionnaires and techniques for tabulating an individual's degree of alienation were developed; surveys then revealed, not entirely surprisingly, that aside from students, those who scored highest for alienation were, precisely, aliens: immigrants, or members of minority groups already defined as marginal to mainstream American life. Over the course of the 1990s and the early twenty-first century, this sociological work has converged with an interest in identity and identity-based social movements to yield a new, "postmodern" body of alienation theory. On the individual level, alienation is said to occur when there is a clash between one's own self-definition and the identity assigned one by a larger society. Alienation thus becomes the subjective manner in which various forms of oppression (racism, sexism, ageism, etc.) are actually experienced and internalized by their victims. As a result, where the older revolutionary conception sees alienation as essential to the fundamentally violent, antihuman nature of "the mainstream," postmodern theories now once again see alienation as a measure of exclusion from the mainstream. On the social level, alienation in the postmodern conception is said to be caused by a surfeit rather than a lack of freedom, a notion that appears almost impossible to distinguish from what were, in the late nineteenth century, called "modern" concepts of alienation. So far, these two traditions have barely come into contact with each other—except, perhaps, in recent environmentalist ideas about "alienation from nature." How or whether they will make contact remains an open question.
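The quantification described above can be made concrete with a small sketch. What follows is a hypothetical illustration in Python, not any actual instrument: the item wordings, the five-point scale, and the simple summing rule are all invented for the example, though mid-century scales did typically aggregate agree/disagree responses into a single score.

    # Hypothetical sketch of mid-century survey tabulation: sum Likert-scale
    # responses (1 = strongly disagree ... 5 = strongly agree) into a single
    # "alienation score." Items and scoring rule are invented for illustration.

    def alienation_score(responses):
        """Return the total score; higher totals mean stronger reported alienation."""
        assert all(1 <= r <= 5 for r in responses.values()), "Likert range is 1-5"
        return sum(responses.values())

    respondent = {
        "Most people cannot really be trusted": 4,
        "My opinions count for little in society": 5,
        "Life seems to have no fixed purpose": 3,
    }

    print(alienation_score(respondent))  # 12 out of a possible 15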

The Scientific Revolution and Beyond Although it has been assumed that alchemy was inconsistent with new ideas about nature associated with the scientific revolution or that the promoters of the "new science" rejected alchemy, historians now reject simple narratives associating the scientific revolution with the decline of alchemy. During the seventeenth century, alchemy continued to be a vibrant field of natural philosophical inquiry; most prominent natural philosophers took a mixed view of it. Francis Bacon (1561-1626), for instance, condemned alchemists' tendency toward secrecy, contrasting it with the openness and cooperation that he advocated in reforming the pursuit of natural knowledge. Still, he looked to alchemy as an important source of knowledge about matter and medicine. Historians have shown that Robert Boyle (1627-1691) and Isaac Newton (1642-1727) were both deeply involved with alchemy, in theory and in practice, often in startlingly productive ways. Boyle, for instance, believed in the possibility of transmutation and worked on it for decades, seeking out the knowledge and skills of numerous adepts. Boyle's corpuscular matter theory—the work for which he is often hailed as a crucial figure in the history of chemistry—bolstered his belief in transmutation and the philosophers' stone. Scholars have shown that Isaac Newton was an avid student of alchemy as well, likely devoting more time to alchemical study and experiments in his lifetime than to physics. Although historians still have a great deal to understand about the exact purpose of Newton's alchemical studies, they clearly played a crucial role in his larger project of understanding God and nature. By the end of the seventeenth century, alchemy was associated with new medicines, natural magic, ancient wisdom, and popular recipes for making gold as well as the innovations of the scientific revolution.

Over the course of the eighteenth century, however, the alchemist's purview came to be more limited. Influenced by Antoine Lavoisier's (1743-1794) efforts to resituate chemical natural philosophy on a foundation of quantitative analysis of matter, a new kind of chemist emerged. Whereas the terms alchemy and chemistry were used synonymously until the end of the seventeenth century, in the eighteenth century scholars increasingly sought to separate the two, restricting alchemy to gold making and spiritual alchemy (activities that natural philosophers began to exclude from science), while also redefining alchemy's scientific and technical dimensions as chemistry. As a result, alchemy increasingly lost its long-standing association with science in the eighteenth century, retaining only its ancient links to mysticism and transmutation.

Alchemy continued to flourish among communities of occultists and Romantic natural philosophers in the late eighteenth, nineteenth, and twentieth centuries. Johann Wolfgang von Goethe (1749-1832), for instance, had an enduring interest in alchemy, viewing it as a secret key to the relationships between humans, God, and the cosmos. Nineteenth-century occultists picked up a theosophical thread in the writings of earlier authors such as Paracelsus, Valentin Weigel (1533-1588), Jakob Böhme (1575-1624), and Emanuel Swedenborg (1688-1772), incorporating alchemical study and images into the activities of secret societies.
Finally, in the early twentieth century, the psychologist Carl Gustav Jung identified similarities between alchemical symbols and the dreams of his patients, positing that alchemists' descriptions of transmutation were a metaphor for the development of the individual. This view, which interprets alchemy as a symbol for deeper psychological processes, has endured in the popular imagination into the twenty-first century.

Archaeology Archaeology's roots lie in the early eighteenth century, when the landed gentry in Britain and elsewhere in Europe began to acquire stone, bronze, and iron implements for display, but it was not until late in that century that serious excavations began, largely inspired by discoveries at Pompeii and Herculaneum. Two major events in the 1830s moved the fledgling discipline of archaeology to a new level. One was Jacques Boucher de Perthes's (1788-1868) discovery in 1838, near Abbeville, France, of a crude lithic (stone) technology that predated the gentlemen's displayed objects by well over 100,000 years. The second was the Danish scholar Christian Thomsen's (1788-1865) articulation of the "three-age system"—the Stone, Bronze, and Iron Ages, still a fundamental archaeological concept—which appeared in 1836 in the catalogue of the Danish National Museum's collection (Thomsen was its first curator). Because stone artifacts typically came from the lowest levels of a trench or pit, while bronze objects came from the middle levels, and iron objects were typically found closest to the surface, Thomsen realized that this stratigraphy reflected a universal temporal sequence. A generation later, in another important book, Prehistoric Times (1865), Sir John Lubbock (1834-1913), later Lord Avebury, not only coined the term prehistory, but also divided Thomsen's Stone Age into two successive stages, the Paleolithic, or Old Stone Age, and the Neolithic, or New Stone Age. Subsequent archaeologists added the term Mesolithic to refer to the transitional period at the end of the Ice Age between the Paleolithic and the Neolithic, which saw the beginnings of settled life, agriculture, and animal husbandry.

By the early twentieth century, archaeology was an established scholarly discipline. In subsequent decades, archaeologists sought to discover sequences, or stages, in the evolution of culture per se and to reconstruct the trajectory of cultural development in specific regions, such as the ancient Near East, Mexico, the American Southwest, Peru, Africa, India, Oceania, East Asia, and Southeast Asia.

In the late twentieth and early twenty-first centuries, a split occurred between processual and postprocessual archaeologists. Processual archaeology starts from the assumption that all human communities are themselves systems and need to be viewed as such. Processual archaeologists are primarily concerned with the processes whereby ancient peoples adapted to their ecosystems, and how these processes changed over time as the ecosystems changed. Postprocessual archaeology, on the other hand, focuses on reconstructing the daily lives of the people who lived in prehistoric communities: how their societies were organized, the nature of their religious beliefs and worldviews, their socioeconomic hierarchies, and other elements of culture that sociocultural anthropologists study in living communities. Postprocessualists, for the most part, see themselves as cultural anthropologists who work with artifacts rather than living informants. Out of processual and postprocessual archaeology have developed such branches of contemporary archaeology as urban archaeology, which looks at the nature of urban life in premodern cities, and industrial archaeology, which attempts to reconstruct what life was like for the majority of people in early industrial towns in the Midlands of England, parts of New England, and elsewhere in the emerging industrial regions of Europe and America in the late eighteenth and early nineteenth centuries.
A similar approach has been applied to reconstructing the lives of enslaved Africans and African-Americans on antebellum plantations in the Caribbean and the American South, as well as of African-Americans in the urban Northeast. Other trends in contemporary archaeology include a focus on the lives of women, ordinary people, and the poor, rather than the "great men" of history. Scientific developments that make it possible to recover detailed data about diet, farming systems, and other aspects of everyday life have provided the technical impetus for these new research areas. While the study of the most ancient manifestations of human culture, such as the rise of agriculture or the state, remains important to archaeologists, an increasing number have turned to historical projects in which documents, archives, and collaboration with historians complement the material remains retrieved during excavations. Such projects promise new insights into many aspects of human history, including medieval Europe, colonial Latin America, and early settlements in the United States.

Doctrines and Practices of Inner Alchemy Besides a new variety of waidan, the Cantong qi also influenced the formation of neidan (Robinet, 1989, 1995), whose earliest extant texts date from the first half of the eighth century. The authors of several neidan treatises refer to their teachings as the Way of the Golden Elixir (jindan zhi dao). Their doctrines essentially consist of a reformulation of those enunciated in the early Daoist texts, integrated with language and images drawn from the system of correlative cosmology according to the model provided by the Cantong qi. The respective functions of these two major components of the alchemical discourse are clearly distinguished in the doctrinal treatises. Their authors point out that the alchemical teachings can only be understood in the light of those of the Daode jing (which they consider to be "the origin of the Way of the Golden Elixir") and that correlative cosmology provides "images" (xiang) that serve, as stated by Li Daochun (fl. 1288-1292), "to give form to the Formless by the word, and thus manifest the authentic and absolute Dao" (Zhonghe ji, chapter 3; see Robinet, 1995, p. 75). The alchemical discourse therefore has its roots in metaphysical principles; it uses the language and images of correlative cosmology to explicate the nature of the cosmos and its ultimate unity with the absolute principle that generates and regulates it. Its final purpose, however, is to transcend the cosmic domain, so that the use of images and metaphors involves explaining their relative value and temporary function. The status attributed to doctrines and practices reflects this view. Some authors emphasize that the inner elixir is possessed by every human being and is a representation of one's own innate realized state. Liu Yiming (1737-1821) expresses this notion as follows:

Human beings receive this Golden Elixir from Heaven. . . . Golden Elixir is another name for one's fundamental nature, formed out of primeval inchoateness [huncheng, a term derived from the Daode jing]. There is no other Golden Elixir outside one's fundamental nature. Every human being has this Golden Elixir complete in oneself: it is entirely achieved in everybody. It is neither more in a sage, nor less in an ordinary person. It is the seed of Immortals and Buddhas, and the root of worthies and sages. (Wuzhen zhizhi, chapter 1)

Borrowing terms from the Cantong qi, which in turn draws them from the Daode jing, Liu Yiming calls "superior virtue" (shangde) the immediate realization of the original "celestial reality" (tianzhen), which is never affected by the change and impermanence that dominate in the cosmos, and "inferior virtue" (xiade), the performance of the alchemical process in order to "return to the Dao." He states, however, that the latter way, when it achieves fruition, "becomes a road leading to the same goal as superior virtue" (Cantong zhizhi, "Jing," chapter 2). While the neidan practices are codified in ways that differ, sometimes noticeably, from each other, the notion of "inversion" (ni) is common to all of them (Robinet, 1992). In the most common codification, the practice is framed as the reintegration of each of the primary components of being, namely essence, pneuma, and spirit (jing, qi, and shen), into the one that precedes it in the ontological hierarchy, culminating in the "reversion" (huan) to the state of nonbeing (wu) or emptiness (kong).
The typical formulation of this process is "refining essence and transmuting it into pneuma," "refining pneuma and transmuting it into spirit," and "refining spirit and returning to Emptiness." Li Daochun relates these stages to the passage of the Daode jing that states: "The Dao generates the One, the One generates the Two, the Two generate the Three, the Three generate the ten thousand things." According to this passage, the Dao first generates Oneness, which harbors the complementary principles of Yin and Yang. After Yin and Yang differentiate from each other, they rejoin and generate the "Three," which represents the One at the level of the particular entities. The "ten thousand things" are the totality of the entities produced by the continuous reiteration of this process. In Li Daochun's explication, the three stages of the neidan practice consist in reverting from the "ten thousand things" to emptiness, or the Dao. In this way, the gradual process that characterizes inner alchemy as a practice is equivalent to the instantaneous realization of the nonduality of the Absolute and the relative. Just as waidan draws many of its basic methods from pharmacology, so neidan shares a significant portion of its notions and methods with classical Chinese medicine and with other bodies of practices, such as meditation and the methods for "nourishing life" (yangsheng). What distinguishes alchemy from these related traditions is its unique view of the elixir as a material or immaterial entity that represents the original state of being and the attainment of that state.

Russell, Frege, Wittgenstein The decisive development that gave a distinctive character to analytical philosophy was that whereby the young Bertrand Russell (1872-1970), freshly converted from idealism by G. E. Moore, used his new logical theories to enhance the possibilities for philosophical analysis. For what is special about analytical philosophy is the preeminence given to logical analysis. In Russell's early work this development is manifest in his "theory of descriptions," whereby he uses his logical theory to provide an analysis of propositions in which particular things are described. Russell argued that he was thereby able to resolve long-standing metaphysical puzzles about existence and identity, and equally to show how it is possible for us to have knowledge ("by description") of things of which we have no direct experience. Indeed, as Russell became increasingly adept at developing and applying his logical theory, he came to think that its use was really the only proper way of doing philosophy. Thus in 1914 he gave some lectures that included one with the title "Logic as the Essence of Philosophy," in which he declares: "every philosophical problem, when it is subjected to the necessary analysis and purification, is found either to be not really philosophical at all, or else to be, in the sense in which we are using the word, logical" (1914, p. 33). Russell here describes his method as "the logical-analytic method of philosophy" (p. v), and he goes on to add that the first clear example of this method is provided by Gottlob Frege's (1848-1925) writings.

Russell has in mind here Frege's development in 1879 of a radically new logical theory (first-order predicate logic, as we would now call it) in his Begriffsschrift ("Concept-script"). Although Frege does not here apply his logic to philosophical debates, he does offer it as "a useful tool for the philosopher" who seeks to "break the domination of the word over the human spirit by laying bare the misconceptions that through the use of language almost unavoidably arise concerning the relations between concepts" (1879; 1970, p. 7). This contrast between the new logical "concept-script" and the apparent structure of ordinary language brings to the surface a concern with the proper understanding of language that is characteristic of analytical philosophy. The relationship between logic and ordinary language remains a contested matter, but the identification of "logical form" is one enduring strand of analytical philosophy, as in Donald Davidson's theories of action and causation.

As indicated, Russell looked back to Frege when describing his "logical-analytic method of philosophy"; but in truth Russell's philosophy also contained much more besides, in particular a problematic emphasis on the priority of the things that are presented in experience, the things that we "know by acquaintance." One of the achievements of Ludwig Wittgenstein (1889-1951), who had studied with Russell and through him made contact with Frege, was to set aside this aspect of Russell's philosophy and present a purified logical-analytic method in his Tractatus Logico-Philosophicus (1922). Wittgenstein maintains here that "Philosophy is not one of the natural sciences"; instead "Philosophy aims at the logical clarification of thoughts. Philosophy is not a body of doctrine but an activity" (4.111-4.112). There is a sharp disagreement here with Russell, whose philosophy certainly does offer "a body of doctrine" based on his theory of knowledge by acquaintance.
By contrast, Wittgenstein holds that one should be able to demonstrate to anyone who seeks to advance a philosophical proposition that in doing so they have fallen into talking nonsense (6.53).
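To illustrate the theory of descriptions at work: Russell's famous example, "The present king of France is bald," is analyzed not as ascribing baldness to some referent but as the conjunction of three claims—something is a present king of France, nothing else is, and that thing is bald. In modern first-order notation (a standard reconstruction, not Russell's own 1905 symbolism):

    \exists x \, (Kx \land \forall y \, (Ky \rightarrow y = x) \land Bx)

where Kx abbreviates "x is a present king of France" and Bx "x is bald." Since nothing satisfies Kx, the sentence comes out false rather than meaningless, which is how the analysis dissolves the puzzle of apparently referring to nonexistent things.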

Concepts The concept of animism first appeared explicitly in Victorian British anthropology in Primitive Culture (1871), by Sir Edward Burnett Tylor (later published as Religion in Primitive Culture, 1958). His writings are preceded historically by those of the Greek Lucretius (c. 96-c. 55 B.C.E.) and the Roman Marcus Tullius Cicero (106-43 B.C.E.), among many others. "The doctrine of human and other souls" or "the doctrine of spiritual beings" constitutes the essence of Tylor's theory. The doctrine of souls is based on the foundational doctrine of "psychic unity," which affirms that all people, everywhere, for all time (or at least the past fifty thousand years or so), have the same capacity to comprehend all phenomena in the known, observed, and imagined universe by use of their own cultural symbols and languages. Tylor regards Spiritualism as a modern cult that lacks the panhuman motivations of animism. The idea of animism is that in all cultural systems people experience phenomena—such as dreams, visions, sudden insights, out-of-body experiences, near-death experiences, and trances—that simultaneously conjoin perceptions of being "elsewhere" with the knowledge of being "here." Some thinkers explain this experience through a belief in the human soul, which they envision as distinct from but inextricably attached to the body until death do they part, so that animistic belief in the soul becomes part of every cultural system.

Robert Ranulph Marett (1866-1943), Tylor's successor at Oxford, added the concept of animatism to that of animism, extending the idea of an animating spirit similar to the soul to include many different forces in nature and culture (The Threshold of Religion, 1909). Such force is what makes a tree grow from a seed, the rain fall, or the sun shine—that which brings fertility and fecundity to the earth. Loss of such force results in death. People are in awe of such forces as manifest in volcanoes and earthquakes and especially in inert corpses. Out of the observation of, and awe at, force in nature comes the universality of the sacral basis for religious experience, which Marett argued was prior to animism. Animism and animatism are often not clearly distinguished, as many of Marett's ideas have been blended through time in philosophical and religious literature with those of Tylor and many others.

The Canelos Quichua native people of Amazonian Ecuador illustrate concepts of animism and animatism. Souls and spirits are ubiquitous, and even spirits have souls. Those who interact intensively with the souls are the male shamans and the female potters, both of whom influence the conceptual system of one another through mutual symbol revelation. For example, when a shaman in trance dimly "sees" an approaching colorful, noisy spirit, a woman quietly, from the darkened recesses of the room, clarifies his emerging vision and names the actual spirit. Human souls are acquired through both mother and father. Spirit essences are hierarchized into four essential tiers, easily represented as spheres encompassing one another. Sungui, the master spirit of the rain forest and hydrosphere, is the apotheosis of androgynous power. This male and female spirit takes many corporeal forms, the most prominent being the giant anaconda. This spiritual superpower must be controlled or it will overwhelm and inundate the world; Amasanga, master spirit of the rain forest, controls the power of Sungui. The corporeal representative of this androgynous being is the great black jaguar. In turn, rain forest dynamics are controlled by Nungüi, a strictly feminine spirit, master of garden soil and pottery clay, whose corporeality is manifest in the deadly black coral snake with a mouth too small to bite humans. The inner sphere is the human household, wherein the souls and spirits come together in a special system of human knowledge, vision, and imagery. Power flows downward through the spheres, and control of power is exercised upward from inner to outer spheres.

[Figure: African man beside animist figure (sculpted from mud and bone and covered with feathers), Burkina Faso, c. 1979. Many indigenous cultures believe in spiritual beings, a concept first defined as animism by Sir Edward Burnett Tylor in 1871. Tylor proposed that animism derived from an attempt to explain phenomena such as trances, dreams, and visions.]

ALCHEMY. This entry includes two subentries: China; Europe and the Middle East. CHINA Chinese alchemy is based on doctrinal principles, first set out in the founding texts of Daoism, concerning the relation between the domains of the Absolute and the relative, or the Dao and the "ten thousand things" (wanwu). Its teachings and practices focus on the idea of the elixir, frequently referred to as the Golden Elixir (jindan), the Reverted Elixir (huandan), or the Medicine (yao). Lexical analysis shows that the semantic field of the term dan (elixir) evolves from a root meaning of "essence"; its connotations include the reality, principle, or true nature of an entity, or its most basic and significant element or property. The purport of alchemy as a doctrine is to illustrate the nature of this underlying "authentic principle" and to explicate its relation to change and multiplicity. In the associated practices, compounding the elixir has two primary meanings. In the first sense, the elixir is obtained by heating its ingredients in a crucible. This practice, as well as the branch of alchemy associated with it, is known as waidan, or "external alchemy" (literally, "outer elixir"). In the second sense, the ingredients of the elixir are the primary components of the cosmos and the human being, and the entire process takes place within the practitioner. This second form of practice (which incorporates some aspects of Daoist meditation methods and of physiological techniques of self-cultivation), as well as the corresponding branch of the alchemical tradition, is known as neidan, or "inner alchemy" (literally, "inner elixir"). The Chinese alchemical tradition therefore has three main aspects, namely a doctrinal level and two paradigmatic forms of practice, respectively based on the refining of an "outer" or an "inner" elixir.

The Elixir in External Alchemy Although the first allusions to alchemy in China date from the second century B.C.E., the combination of doctrines and practices involving the compounding of an elixir, which is necessary to define alchemy as such and to distinguish it from proto-chemistry, is not clearly attested in extant sources until the third century C.E. The first identifiable tradition, known as Taiqing (Great Clarity; Pregadio, 2005), developed from that time in Jiangnan, the region south of the lower Yangzi River that was also crucial for the history of Daoism during the Six Dynasties (third to sixth centuries). The Taiqing scriptures consist of descriptions of methods for compounding elixirs and of the benefits gained from their performance, and they contain virtually no statements regarding their doctrinal foundations. The emphasis given to certain aspects of the practice and the terminology used in those descriptions, however, show that the central act of the alchemical process consists in causing matter to revert to its state of "essence" (jing), or prima materia. The main role in this task is played by the crucible, whose function is to provide a medium equivalent to the inchoate state (hundun) prior to the formation of the cosmos. In that medium, under the action of fire, the ingredients of the elixir are transmuted, or "reverted" (huan), to their original state. A seventh-century commentary to one of the Taiqing scriptures equates this refined matter with the "essence" that, as stated in the Daode jing (Scripture of the Way and Its Virtue), gives birth to the world of multiplicity: "Indistinct! Vague! But within it there is something. Dark! Obscure! But within it there is an essence." In the Taiqing texts, compounding the elixir constitutes the central part of a larger process consisting of several stages, each of which is marked by the performance of rites and ceremonies. Receiving the scriptures and the oral instructions, building the laboratory, kindling the fire, and ingesting the elixir all require offering pledges to one's master and to the gods, observing rules on seclusion and purification, performing ceremonies to delimit and protect the ritual area, and making invocations to the highest deities. Ingesting the elixir is said to confer transcendence and admission into the celestial bureaucracy. Additionally, the elixir grants healing from illness and protection from demons, spirits, and several other disturbances. To provide these supplementary benefits, the elixir need not be ingested and may simply be kept in one's hand or carried at one's belt as a powerful apotropaic talisman. The methods of the Taiqing texts are characterized by the use of a large number of ingredients. Sources attached to later waidan traditions instead describe different varieties of a single exemplary method, consisting of the refining of mercury (Yin) from cinnabar (Yang), its addition to sulfur (Yang), and its further refining. This process, typically repeated seven or nine times, yields an elixir deemed to embody the qualities of pure Yang (chunyang)—that is, the state of oneness before its differentiation into Yin and Yang.

America as the home of natural man. In 1580 the French philosopher Michel de Montaigne (1533-1592) inaugurated a pathbreaking new way of thinking about the Indians. A skeptic and a keen observer of human diversity, Montaigne argued that "each man calls barbarism whatever is not his own practice; for indeed it seems we have no other test of truth and reason than the example and pattern of the opinions and customs of the country we live in." Unlike the Spanish, Montaigne doubts the standards of his own place and time. In his famous essay "Of Cannibals" (Essays) he describes Indian society as the best society that ever was, real or imagined, because the Indians are "still very close to their original naturalness" and thus live in a "state of purity" according to "les loix naturelles" (the natural laws). He claims their society, held together with "little artifice and human solder," is as pure and natural as a society can be. He acknowledges that these Indians fight and eat their captives, but he says they do so not for economic gain but as a kind of aristocratic struggle for mastery. He describes their warfare as "wholly noble" and "as excusable and beautiful as this human disease can be." This is the origin of the image of the noble savage. Montaigne knows, however, that his account of the Indians' tranquility and bliss is fictitious. He concedes the barbarous horror of some of their actions, writing, "I am not sorry that we notice the barbarous horror of [their] acts, but I am heartily sorry that, judging their faults rightly, we should be so blind to our own." Here Montaigne reveals his true intention in describing the Indians: he uses them as an image with which to expose the horrors and cruelty of his own world. This use of the Indians as a countercultural marker was to become the norm. While Montaigne's account of the Indians is in the end neither anthropologically accurate nor fully desirable, he is the first to misrepresent the Indians in a positive fashion. After Montaigne, no major philosopher in Europe doubted the Indians' naturalness. To the contrary, the Indians came to represent natural man par excellence. From Montaigne until the end of the Enlightenment, every major philosopher agreed with John Locke's (1632-1704) famous statement that "in the beginning all the world was America" (Second Treatise of Government). America represented Europe's past.

In ending one debate, however, Montaigne began a new one. While every major thinker agreed that the Indians represented mankind's natural state, debate arose over the interpretation of that natural state: was it a brutishness to overcome or an innocence to recapture? Among these philosophers the debate evolved in a single direction. Thomas Hobbes (1588-1679) first argued that mankind's natural state is a horrible state of war to be avoided at all costs. Locke and Charles-Louis de Secondat, baron de Montesquieu (1689-1755), countered that the state of nature is pacific but undesirable. Jean-Jacques Rousseau (1712-1778), François Marie Arouet de Voltaire (1694-1778), and Denis Diderot (1713-1784) later praised the Indians as naturally good and happy, in contrast to European artificiality and corruption. These varied representations, it should be noted, do not correspond to any changes in Indian societies, nor do they respond to new information about the Indians. In truth, the available evidence was barely consulted by any of the great thinkers. Rather, these philosophers clearly used their descriptions of the Indians in support of their own ends. As dissatisfaction with Europe increased, so did praise of the Indians as an alternative, more desirable, and more natural way of living. In sum, contemporaneous representations of the American Indians really reflect Europe's own debates, not the reality of America. They have left the legacies of brutishness and of the noble savage, which remain in the twenty-first century. But there is another legacy of these debates. In using the Indians of America to promote their own visions of freedom and legitimate institutions, the philosophers set in motion a train of thought and action that would lead to revolution. The first of these revolutions took place in America and led to the founding of the United States.

Sociocultural Anthropology Sociocultural anthropology, the subfield concerned with culture per se, especially in its many contemporary ethnographic manifestations, commands the attention of the majority of professional anthropologists. Although there are some excellent examples of ethnographic description in antiquity—Herodotus's (484?-425 B.C.E.) account of the ancient Scythians in Book 4 of his History (c. 440 B.C.E.) and Cornelius Tacitus's (55?-after 117 C.E.) detailed account of ancient Germanic-speaking culture, Germania (c. 98 C.E.)—sociocultural anthropology, like the other principal subfields of anthropology, began to take shape in the early nineteenth century and is closely linked to colonialism. As Europeans (or people of European heritage) expanded into India, Southeast Asia, Africa, and Oceania, as well as across North America, they found themselves confronting—and eventually dominating—what they called "primitive" cultures. These encounters led to two fundamental questions that dominate anthropology: (1) Why do cultures differ? and (2) Why have some cultures remained technologically "simple," while others have developed more complex, technologically sophisticated societies?

Accounting for the differences found among cultures is problematic. For example, in some regions humans live in small-scale societies with very basic technologies and low population densities—what early anthropologists, influenced by colonialism and scientific racism, called "primitive"—while in other places, such as the Valley of Mexico, people began millennia ago to develop complex, energy-intensive agricultural technologies that enabled them to congregate in great numbers, to build enormous cities and finance the construction of elaborate buildings and works of art, and, in general, to develop what are called "civilizations." This second question has especially been the province of archaeologists, but it underlies cultural anthropology as well. At its best, cultural anthropology has steadfastly argued for the cultural significance of the small-scale and environmentally wiser "primitive" societies. At its worst, it has functioned as the "handmaiden of imperialism," either overtly, as when British anthropologists worked for the colonial enterprise in Africa, or indirectly, as a purveyor of "exotica" that reinforced the prejudices of urbanites and racial elites regarding the "savagery" of foreigners or of native populations and minorities closer to home.

It is thus no accident that the first great theoretical paradigm in sociocultural anthropology was unilineal evolutionism, the idea that all cultures can be ranked along a grand scale that culminated, of course, with nineteenth-century European and American industrial civilization, the "best of all possible worlds." Darwin's The Origin of Species, in particular its concept of "natural selection," reinforced this approach to the assessment of cultural differences. Unilineal evolutionism was predicated on two fundamental axioms: (1) the idea of progress, that the direction of cultural evolution is everywhere from "primitive" to "civilized," and (2) the idea of psychic unity, that all human beings, irrespective of their environment or specific history, will necessarily think the same thoughts and, therefore, progress through the same series of evolutionary stages.
Sir Edward Burnett Tylor, who was the first anthropologist to define the concept of culture, was a major contributor to unilineal evolutionism, suggesting a three-stage model for the evolution of religion: animism (a belief that all phenomena are "animated" by unique spirit beings), polytheism, and monotheism, which, he held, is a prime characteristic of advanced civilizations. The most influential unilineal evolutionist was Lewis Henry Morgan (1818-1881), a successful American lawyer who practiced anthropology as an avocation. In Ancient Society (1877), Morgan posited a three-stage model for the evolution of culture: savagery, barbarism, and civilization. The first two stages, which he labeled, collectively, societas (society), as opposed to civitas (civilization), were each subdivided into three successive substages: lower, middle, and upper. His prime criterion for assigning cultures to one or another of these stages was the character and complexity of their technology. Because they lacked the bow and arrow, a prime technological criterion, Morgan assigned the ancient Hawaiians to "Middle Savagery," despite the fact that they practiced agriculture and had a highly complex social organization. Although Morgan's emphasis on material culture—tools, weapons, and other artifacts—had a significant influence on a later school of sociocultural anthropology, cultural materialism, he also pioneered the study of kinship systems.

By the 1890s, however, the unilineal evolutionists' rigid adherence to paradigms based on incomplete and questionable ethnographic data (largely collected by missionaries, traders, colonial administrators, and so forth) was called into question by a new generation of anthropologists who had spent time in the field. (Most of the unilineal evolutionists were "armchair scholars," although both Tylor and Morgan did have some field experience in their youth, the former in Mexico and the latter among the Seneca, an Iroquois tribe that lived near his home in upstate New York.) In the United States, the chief critic of what was then called the comparative method in anthropology was Franz Boas, the most influential American anthropologist of his generation. A rigorous, scientifically trained German-born scholar who began his career in physics and geography, Boas switched to anthropology and did extensive fieldwork among the Baffin Island (Canada) Eskimo (or Inuit), as well as the Native Americans of British Columbia. He and other critics of the unilineal approach, many of whom were his students, such as Alfred Louis Kroeber (1876-1960) and Robert Lowie (1883-1957), also called into question unilinealism's fundamental axioms, seriously questioning whether "progress" was in fact universal or lineal and whether it was possible to rank all human cultures according to a single evolutionary scheme. Finally, the Boasians attacked the concept of "psychic unity," suggesting that all cultures are inherently different from one another and that they should be assessed on their own merits and not comparatively. This approach, which stressed empirical field research over "armchair" theorizing, came to be known as historical particularism; it emphasized cultural relativism and diffusion rather than rigid evolutionary sequences. Anthropologists were enjoined to reconstruct the culture-history of particular tribes and societies, but not the evolution of culture per se. Emphasis was also placed on what has been called "salvage ethnography," gathering ethnographic data before the simpler cultures of the world were overwhelmed by Western culture.
Boas left another important legacy: his work as a public intellectual who used his scholarly knowledge to educate the American public about racial equality. For Boas, who had experienced anti-Semitism in his native Germany, this kind of work on the part of intellectuals was crucial if America was to realize its democratic ideals. He inspired many of his students, including Ruth Benedict (1887-1948) and Margaret Mead (1901-1978), to their own forms of public work by his example. He also influenced several important African-American and Native American intellectuals who left anthropology for other pursuits, most notably the novelist Zora Neale Hurston (1891-1960), as well as the Brazilian sociologist Gilberto Freyre (1900-1987), who credits Boas's influence in the preface to his controversial books on race in Brazilian culture.

In Britain, the empirical reaction to Morgan, Tylor, and their colleagues took a different turn. Most early-twentieth-century British anthropologists, such as Bronislaw K. Malinowski (1884-1942), a Polish scholar who immigrated to England to complete his studies, and A. R. Radcliffe-Brown (1881-1955), were largely ahistorical; that is, they advocated a structural-functional, rather than historical, approach to the study of cultures. Their emphasis was primarily, if not in some cases wholly, on the here and now, on the social organization of living human communities and how the elements thereof were functionally interrelated to form integrated wholes. Malinowski, who spent four years (1915-1918) studying the culture of the Trobriand Islanders (near New Guinea), also focused on how social institutions function to serve basic human needs, such as shelter, reproduction, and nourishment. Radcliffe-Brown, who did fieldwork in the Andaman Islands, South Africa, and Australia, drew liberally on the ideas of the French sociologist Émile Durkheim (1858-1917) in his attempts to discover what he called the "social laws" and "structural principles" that govern social organization everywhere. Between them, Boas, Malinowski, and Radcliffe-Brown trained or influenced at least two generations of sociocultural anthropologists on both sides of the Atlantic, from the early 1900s to the threshold of World War II, and, in Radcliffe-Brown's case, for a decade afterward as well.

However, beginning in the late 1930s, a reaction to the essentially antitheoretical stance of the Boasians began to take shape, based on the assumption that all cultures are necessarily adapted to the ecological circumstances in which they exist. The leading advocate of this "cultural ecology" was Julian H. Steward (1902-1972), whose book Theory of Culture Change (1955) had a major impact on the discipline. Other scholars, such as Leslie A. White (1900-1975) and the British archaeologist V. Gordon Childe (1892-1957), drawing on the Marxist assumption that the "means of production" is everywhere crucial in determining the nature of society, emphasized the primacy of material culture. (Indeed, White consistently described himself as a disciple of Lewis Henry Morgan, whose work had, in turn, influenced Marx's collaborator Friedrich Engels [1820-1895].)
These early "neo-evolutionists" of the mid-twentieth century all acknowledged a debt to Marx, but the new materialism in anthropology soon split into two camps: one that emphasized historical materialism, political economy, and the study of imperialism and inequality, and another that repudiated Marx and focused on questions of ecological adaptation and evolution. The former camp includes Eric Wolf, Sidney Mintz, Eleanor Leacock, and John Murra. Mintz's study of the history of sugar and Wolf's timely comparative project Peasant Wars of the Twentieth Century (1969), published in response to U.S. involvement in Vietnam, remain two classic studies from this school. Among the anti-Marxists, the best-known is the late Marvin R. Harris, whose provocative books for the general public argue that apparently "irrational" religious behavior, such as the Hindu refusal to eat beef or Jewish dietary law, can be attributed to biological needs unknown to practitioners.

Neo-evolutionism was not the only post-Boasian development in sociocultural anthropology. Also in the late 1930s, a number of American anthropologists, among them Margaret Mead, Ruth Benedict, and Ralph Linton, drew selectively on Freud and other early twentieth-century psychologists and developed the "culture and personality" school, which emphasized the interface between individual personalities and the cultures they share, as well as the "infant disciplines"—weaning and toilet training—and their effects on both the formation of individual personality structures and the nature of particular cultures. Early and harsh toilet training was held to produce "anal" personalities and authoritarian cultures, whereas relaxed attitudes toward sphincter control and related processes were held to produce relaxed social systems. At the start of the twenty-first century this school has few proponents, but psychological anthropology remains a recognized branch of sociocultural anthropology.

In the 1950s, linguistics began to influence an increasing number of anthropologists. If the cultural ecologists and materialists had come to conceive of culture as essentially an adaptive system, their linguistically oriented colleagues were concerned with cognitive systems and shared symbols, with how people attach meaning to the world around them. Initially known as "ethnoscience," this approach has come to be called "cognitive anthropology." Closely related to it are two other approaches also concerned with meaning. One of them, closely identified with the eminent French anthropologist Claude Lévi-Strauss, is structuralism. Extremely influential outside of anthropology, especially in France, this school focused on underlying structures of thought based on binary oppositions, like the binary code used by computers. From simple pairs such as hot/cold or up/down, cultures construct elaborate systems of myth and meaning that shape everything from cooking to kinship, as well as providing answers to questions about life and death. When initially published, Lévi-Strauss's partially autobiographical Tristes tropiques (1955; The Sad Tropics), in which he recounts his sojourns among the tribes of Amazonian Brazil and his subsequent flight from Nazi-occupied France, was perhaps more influential than his dense and difficult works of structural analysis. Later, however, it was criticized for its portrayal of Native Americans as seemingly "outside of history." In America, a different school of anthropology, "symbolic anthropology," would ultimately prove more influential than structuralism.
This approach, which draws on the same linguistic and cognitive models as structuralism, emphasizes emotion and affect in addition to cognition, and looks back to traditional anthropological studies of magico-religious belief systems, especially Durkheim's work. Among the more important contributors to symbolic anthropology have been the British scholars Victor W. Turner and Mary Douglas, and the American anthropologist Clifford Geertz. Turner's study of the multivalent symbolic meanings of the "milk-tree" in the life of the Ndembu of northwestern Zambia—its milklike sap stands for everything from semen to mothers' milk—remains a classic. In it, he borrows from Arnold van Gennep's (1873-1957) classic work, Les rites de passage (1909; The Rites of Passage), the concept of the "transition" stage in a rite of passage, which Turner rechristened "liminality." Douglas studied the symbolic opposition between what she calls "purity and danger," as exemplified in her brilliant analysis of the food taboos of the Old Testament, which, she argues, reflect a fear of anomalous animals, like the pig, which parts the hoof but does not chew the cud. Geertz, who has done extensive fieldwork in Indonesia and Morocco, is famous for his in-depth analysis of the symbolism of cockfighting in Bali, as well as for the concept of "webs of significance," the idea that all human beings are necessarily bound together by intricate symbolic "webs" in terms of which they collectively confront external reality.

In the last two decades, sociocultural anthropology has seen the emergence of the "post-isms": postcolonialism, which examines the impact of neoliberal capitalism on recently decolonized states; poststructuralism, which critiques the work of Lévi-Strauss and other classic structuralists; and, most importantly, postmodernism, a manifestation of a broader intellectual movement in architecture, literature, cinema, and the arts predicated in fair measure on the theories of the French scholars Jacques Derrida and Michel Foucault. Postmodernists question the validity of externally imposed orders, as well as linear analysis and "essentialist" interpretations, and assert that anthropologists should "deconstruct" the cultures they are attempting to understand. Moreover, postmodern ethnographies often focus as much on the ethnographers as they do on the communities they have studied, since any cultural account must necessarily include the impact of the investigator on the investigated, and vice versa. This element in postmodernism has been criticized a great deal, especially by materialists such as Sidney Mintz. Among the more prominent postmodernist anthropologists are Stephen A. Tyler, Vincent Crapanzano, and James A. Boon.

In recent years, sociocultural anthropologists—from a variety of perspectives—have been concerned with globalization; with transnational communities, such as the African, Indian, and Chinese diasporas; and with borderlands, whose inhabitants freely share culture traits that are otherwise quite different and seemingly contradictory and integrate them into new, "hybrid" cultures. There has also been increased concern with feminism, especially what has been labeled "third-wave feminism," and with the heretofore often neglected roles women play in shaping cultural norms, as well as the inequality that persists almost everywhere between the sexes.
Finally, gay and lesbian, as well as transsexual, studies form a significant element of contemporary sociocultural anthropology, leading to a major reassessment of the concept of "gender" and the extent to which it is socially constructed rather than innate. This brief overview of the history and current state of the discipline of anthropology, primarily in the United States, has necessarily omitted mention of many specific developments and schools of thought, for example, the Kulturkreis, or "culture-circle," school, centered on Father Wilhelm Schmidt (1868-1954), that took shape in Vienna in the early years of the last century; the impact of Sir James G. Frazer's (1854-1941) The Golden Bough (1890), which seduced Malinowski into completing his studies in England; and the single-diffusionist ideas of G. Elliot Smith (1871-1937). A great many important contributors, to say nothing of specific topical and regional specialties, such as urban anthropology, esthetic anthropology, East Asian anthropology, African anthropology, and so on, have been slighted. Nevertheless, this discussion provides a general description of what anthropology, in its several major dimensions, is about, how it got that way, and the overwhelming importance of the concept of culture to the discipline.

Sociocultural Anthropology Sociocultural anthropology, the subfield concerned with culture per se, especially in its many contemporary ethnographic manifestations, commands the attention of the majority of ANTHROPOLOGY 76 New Dictionary of the History of Ideas 69544_DHI_A_001-194.qxd 10/12/04 3:56 PM Page 76 professional anthropologists. Although there are some excellent examples of ethnographic description in antiquity—Herodotus's (484?-425 B.C.E.) account of the ancient Scythians in Book 4 of his History (c. 440 B.C.E.) and Cornelius Tacitus's (55? B.C.E.-after 117 C.E.) detailed account of ancient Germanicspeaking culture, Germania (c. 98 C.E.)—like the other principal subfields of anthropology, sociocultural anthropology began to take shape in the early nineteenth century and is closely linked to colonialism. As Europeans (or people of European heritage) expanded into India, Southeast Asia, Africa, and Oceania, as well as across North America, they found themselves confronting—and eventually dominating—what they called "primitive" cultures. These encounters led to two fundamental questions that dominate anthropology: (1) Why do cultures differ? and (2) Why are some cultures technologically "simple" societies, while others developed more complex, technologically sophisticated societies? Accounting for the differences found among cultures is problematic. For example, while, in some regions, humans live in small-scale societies with very basic technologies and low population densities—what early anthropologists, influenced by colonialism and scientific racism, called "primitive"—in other places, such as the Valley of Mexico, people began millennia ago to develop complex, energy-intensive agricultural technologies that enabled them to congregate in great numbers, build enormous cities and finance the construction of elaborate buildings and works of art, and, in general, to develop what are called "civilizations." This second question has especially been the province of archaeologists, but it underlies cultural anthropology as well. At its best, cultural anthropology has steadfastly argued for the value of the small-scale and the more environmentally wise "primitive" as culturally significant. At its worst, it has functioned as the "handmaiden of imperialism," either overtly, as when British anthropologists worked for the colonial enterprise in Africa, or indirectly, as purveyors of "exotica" that reinforce the prejudices of urbanites and racial elites regarding the "savagery" of foreigners or of native populations and minorities closer to home. It is thus no accident that the first great theoretical paradigm in sociocultural anthropology was unilineal evolutionism, the idea that all cultures can be ranked along a grand scale that culminated, of course, with nineteenth-century European and American industrial civilization, the "best of all possible worlds." Darwin's The Origin of Species reinforced this approach to the assessment of cultural differences, in particular the concept of "natural selection." Unilineal evolutionism was predicated on two fundamental axioms: (1) the idea of progress, that the direction of cultural evolution is everywhere from "primitive" to "civilized" and (2) the idea of psychic unity, that all human beings, irrespective of their environment or specific history, will necessarily think the same thoughts and, therefore, progress through the same series of evolutionary stages. 
Sir Edward Burnett Tylor, who was the first anthropologist to define the concept of culture, was a major contributor to unilineal evolutionism, suggesting a three-stage model for the evolution of religion: animism (a belief that all phenomena are "animated" by unique spirit beings), polytheism, and monotheism, which, he held, is a prime characteristic of advanced civilizations. The most influential unilineal evolutionist was Lewis Henry Morgan (1818-1881), a successful American lawyer who practiced anthropology as an avocation. In Ancient Society (1877), Morgan posited a three-stage model for the evolution of culture: savagery, barbarism, and civilization. The first two stages, which he labeled, collectively, societas (society), as opposed to civitas (civilization), were each subdivided into three successive substages: lower, middle, and upper. His prime criterion for assigning cultures to one or another of these stages was the character and complexity of their technology. Because they lacked the bow and arrow, a prime technological criterion, Morgan assigned the ancient Hawaiians to "Middle Savagery," despite the fact that they practiced agriculture and had a highly complex social organization. Although Morgan's emphasis on material culture—tools, weapons, and other artifacts—had a significant influence on a later school of sociocultural anthropology, cultural materialism, he also pioneered the study of kinship systems. By the 1890s, however, the unilineal evolutionists' rigid adherence to paradigms based on incomplete and questionable ethnographic data (largely collected by missionaries, traders, colonial administrators, and so forth) was called into question by a new generation of anthropologists who had spent time in the field. (Most of the unilineal evolutionists were "armchair scholars," although both Tylor and Morgan did have some field experience in their youth, the former in Mexico and the latter among the Seneca, an Iroquois tribe that lived near his home in upstate New York.) In the United States, the chief critic of what was then called the comparative method in anthropology was Franz Boas, the most influential American anthropologist. A rigorous, scientifically trained German-born scholar, he later switched to anthropology and did extensive fieldwork among the Baffin Island (Canada) Eskimo (or Inuit), as well as the Native Americans of British Columbia. He and other critics of the unilineal approach, many of whom were his students, such as Alfred Louis Kroeber (1876-1960) and Robert Lowie (1883-1957), also called into question unilinealism's fundamental axioms, seriously questioning whether "progress" was in fact universal or lineal and whether it was possible to rank all human cultures according to a single evolutionary scheme. Finally, the Boasians attacked the concept of "psychic unity," suggesting that all cultures are inherently different from one another and that they should be assessed on their own merits and not comparatively. This approach, which stressed empirical field research over "armchair" theorizing, came to be known as historical particularism, and emphasized cultural relativism and diffusion rather than rigid evolutionary sequences. Anthropologists were enjoined to reconstruct the culture-history of particular tribes and societies, but not the evolution of culture per se. Emphasis was also placed on what has been called "salvage ethnography," gathering ethnographic data before the simpler cultures of the world were overwhelmed by Western culture. 
Boas left another important legacy: his work as a public intellectual who used his scholarly knowledge to educate the American public about racial equality. For Boas, who had experienced anti-Semitism in his native Germany, this kind of ANTHROPOLOGY New Dictionary of the History of Ideas 77 69544_DHI_A_001-194.qxd 10/12/04 3:56 PM Page 77 work on the part of intellectuals was crucial if America was to realize its democratic ideals. He inspired many of his students, including Ruth Benedict (1887-1948) and Margaret Mead (1901-1978), to their own forms of public work by his example. He also influenced several important African and Native American intellectuals who left anthropology for other pursuits, most notably the novelist Zora Neale Hurston (1891-1960), as well as the Brazilian sociologist Gilberto Freyre (1900-1987), who credits Boas's influence in the preface to his controversial books on race in Brazilian culture. In Britain, the empirical reaction to Morgan, Tylor, and their colleagues took a different turn. Most early-twentiethcentury British anthropologists, such as Bronislaw K. Malinowski (1884-1942), a Polish scholar who immigrated to England to complete his studies, and A. R. Radcliffe-Brown (1881-1955), were largely ahistorical; that is, they advocated a structural-functional, rather than historical, approach to the study of cultures. Their emphasis was primarily, if not in some cases wholly, on the here and now, on the social organization of living human communities and how the elements thereof were functionally interrelated to form integrated wholes. Malinowski, who spent four years (1915-1918) studying the culture of the Trobriand Islanders (near New Guinea), also focused on how social institutions function to serve basic human needs, such as shelter, reproduction, and nourishment. Radcliffe-Brown, who did field work in the Andaman Islands, South Africa, and Australia, drew liberally on the ideas of the French sociologist Émile Durkheim (1858-1917) in his attempts to discover what he called the "social laws" and "structural principles" that govern social organization everywhere. Among them, Boas, Malinowski, and Radcliffe-Brown trained or influenced at least two generations of sociocultural anthropologists on both sides of the Atlantic, from the early 1900s to the threshold of World War II, and, in Radcliffe-Brown's case, for a decade afterward as well. However, beginning in the late 1930s, a reaction to the essentially antitheoretical stance of the Boasians began to take shape, based on the assumption that all cultures are necessarily adapted to the ecological circumstances in which they exist. The leading advocate of "cultural ecology" was Julian H. Steward (1902-1972), whose book Theory of Culture Change (1955) had a major impact on the discipline. Other scholars, such as Leslie A. White (1900-1975) and British archaeologist V. Gordon Childe (1892-1957), drawing on the Marxist assumption that the "means of production" is everywhere crucial in determining the nature of society, emphasized the primacy of material culture. (Indeed, White consistently described himself as a disciple of Lewis Henry Morgan, whose work had, in turn, influenced Marx's collaborator Friedrich Engels [1820-1895].) 
These early "neo-evolutionists" of the mid-twentieth century all acknowledged a debt to Marx, but the new materialism in anthropology soon split into two camps: one that emphasized historical materialism, political economy, and the study of imperialism and inequality, and another that repudiated Marx and focused on questions of ecological adaptation and evolution. The former include Eric Wolf, Sidney Mintz, Eleanor Leacock, and John Murra. Mintz's study of the history of sugar and Wolf's timely comparative project on Peasant Wars of the Twentieth Century (1999), published in response to U.S. involvement in Vietnam, remain two classic studies from this school. Among the anti-Marxists, the best-known is the late Marvin R. Harris, whose provocative books for the general public argue that apparently "irrational" religious behavior, such as the Hindu refusal to eat meat or Jewish dietary law, can be attributed to biological needs unknown to practitioners. Neo-evolutionism was not the only post-Boasian development in sociocultural anthropology. Also in the late 1930s, a number of American anthropologists, among them Margaret Mead, Ruth Benedict, and Ralph Linton, drew selectively on Freud and other early twentieth-century psychologists and developed the "culture and personality" school, which emphasized the interface among individual personalities and the cultures they share, as well as the "infant disciplines"— weaning and toilet training—and their effects on both the formation of individual personality structures and the nature of particular cultures. Early and harsh toilet training was held to produce "anal" personalities and authoritarian cultures, whereas relaxed attitudes toward sphincter control and related processes produce relaxed social systems. At the start of the twenty-first century this school has few proponents, but psychological anthropology remains a recognized branch of sociocultural anthropology. In the 1950s, linguistics began to influence an increasing number of anthropologists. If the cultural ecologists and materialists had come to conceive of culture as essentially an adaptive system, their linguistically oriented colleagues were concerned with cognitive systems and shared symbols, with how people attach meaning to the world around them. Initially known as "ethnoscience," this approach has come to be called "cognitive anthropology." Closely related to it are two other approaches also concerned with meaning. One of them, closely identified with the eminent French anthropologist Claude Lévi-Strauss, is structuralism. Extremely influential outside of anthropology, especially in France, this school focused on underlying structures of thought based on binary oppositions, like the binary mathematical code used by computers. From simple pairs such as hot/cold or up/down, cultures construct elaborate systems of myth and meaning that shape everything from cooking to kinship, as well as providing answers to questions about life and death. When initially published, his partially autobiographical Tristes tropiques (1955; The sad tropics) in which he recounts his flight from Nazi Germany to find refuge among the tribes of Amazonian Brazil, was perhaps more influential than his dense and difficult works of structural analysis. Later, however, it was criticized for its portrayal of Native Americans as seemingly "outside of history." In America, a different school of anthropology, "symbolic anthropology," would ultimately prove more influential than structuralism. 
This approach, which draws on the same linguistic and cognitive models as structuralism, emphasizes emotion and affect in addition to cognition, and looks back to traditional anthropological studies of magico-religious belief systems, especially Durkheim's work. Among the more important contributors to symbolic anthropology have been ANTHROPOLOGY 78 New Dictionary of the History of Ideas 69544_DHI_A_001-194.qxd 10/12/04 3:56 PM Page 78 the British scholars Victor W. Turner and Mary Douglas, and the American anthropologist Clifford Geertz. Turner's study of the multivalent symbolic meanings of the "milk-tree" in the life of the Ndembu of northwestern Zambia—its milklike sap stands for everything from semen to mothers' milk—remains a classic. In it, he borrows from Arnold van Gennep's (1873- 1957) classic work, Les rites de passage (1909; The Rites of Passage), the concept of the "transition" stage in a rite of passage, which Turner rechristened "liminality." Douglas studied the symbolic opposition between what she calls "purity and danger," as exemplified in her brilliant analysis of the food taboos in the Old Testament, which, she argues, reflect a fear of anomalous animals, like the pig, which is neither a browser nor a ruminant. Geertz, who has done extensive fieldwork in Indonesia and Morocco, is famous for his in-depth analysis of the symbolism of cockfighting in Bali, as well as for the concept of "webs of significance," the idea that all human beings are necessarily bound together by intricate symbolic "webs" in terms of which they collectively confront external reality. In the last two decades, sociocultural anthropology has seen the emergence of the "post-isms": postcolonialism, which examines the impact of neoliberal capitalism on recently decolonized states; poststructuralism, which critiques the work of Lévi-Strauss and other classic structuralists; and, most importantly, postmodernism, a manifestation of a broader intellectual movement in architecture, literature, cinema, and the arts predicated in fair measure on the theories of French scholars Jacques Derrida and Michel Foucault. Postmodernists question the validity of externally imposed orders, as well as linear analysis and "essentialist" interpretations, and assert that anthropologists should "deconstruct" the cultures they are attempting to understand. Moreover, postmodern ethnographies often focus as much on the ethnographers as they do the communities they have studied, as any cultural account must necessarily include the impact of the investigator on the investigated, and vice versa. This element in postmodernism has been criticized a great deal, especially by materialists such as Sidney Mintz. Among the more prominent postmodernist anthropologists are Stephen A. Tyler, Vincent Crapanzano, and James A. Boon. In recent years, sociocultural anthropologists—from a variety of perspectives—have been concerned with globalization, transnational communities, such as the African, Indian, and Chinese diasporas, and borderlands, in which the inhabitants freely share culture traits that are otherwise, for the most part, extremely different and seemingly contradictory and integrate them into new, "hybrid" cultures. There has also been increased concern with feminism, especially what has been labeled "thirdwave feminism," and with the heretofore often neglected roles women play in shaping cultural norms, as well as the inequality that persists almost everywhere between the sexes. 
Finally, gay and lesbian, as well as transsexual, studies form a significant element of contemporary sociocultural anthropology, leading to a major reassessment of the concept of "gender" and the extent to which it is socially constructed rather than innate. This brief overview of the history and current state of the discipline of anthropology, primarily in the United States, has necessarily omitted mention of many specific developments and schools of thought, for example, the Kulturkreis, or "culture-circle" school, centered on Father Wilhelm Schmidt (1868-1954), that took shape in Vienna in the early years of the last century; the impact of Sir James G. Frazer's (1854-1941) The Golden Bough (1890), which seduced Malinowski into completing his studies in England; and the single-diffusionist ideas of G. Elliot Smith (1871-1937). A great many important contributors, to say nothing of specific topical and regional specialties, such as urban anthropology, esthetic anthropology, East Asian anthropology, African anthropology, and so on, have been slighted. Nevertheless, this discussion provides a general description of what anthropology, in its several major dimensions, is about, how it got that way, and the overwhelming importance of the concept of culture to the discipline.

The Nineteenth Century: From Algebra to Algebras Lagrange's algebraic ambitions inspired some new algebras from the late eighteenth century onward. The names used below are modern. Firstly, in differential operators, the process of differentiating a function in the calculus was symbolized by D, with the converse operation of integration taken as 1/D, with 1 denoting the identity operation; similarly, finite differencing was symbolized by Δ, with summation taken as 1/Δ. Much success followed, especially in solving differential and difference equations, though the workings of the method remained mysterious. One earnest practitioner from the 1840s was George Boole (1815-1864), who then imitated it to form another algebra, today called Boolean algebra, with which to found logic. Secondly, in functional equations, the "object" was the function f itself ("sine of," say) rather than its values. In this context F.-J. Servois (1767-1847) individuated two properties in 1814: "commutative" (fg = gf) and "distributive" (f(g + h) = fg + fh); they were to be important also in several other algebras. As part of his effort to extend Lagrange's algebraization of applied mathematics, Hamilton introduced another new algebra in 1843. He enlarged complex numbers into quaternions q with four units 1, i, j, and k: q := a + ib + jc + kd, where i² = j² = k² = ijk = −1, with ij = k and ji = −k and similar properties. He also individuated the property of associativity (his word), where i(jk) = (ij)k. At that time the German Hermann Grassmann (1809-1877) published Ausdehnungslehre (1844), a very general algebra for expressing relationships between geometrical magnitudes. It was capable of several other readings also; for example, later his brother Robert adapted it to rediscover parts of Boolean algebra. Reception of the Grassmanns was much slower than for Hamilton; but by the 1880s their theories were gaining much attention, with quaternions extended to, for example, the eight-unit "octaves," and boasting a supporting "International Association." However, the American J. W. Gibbs (1839-1903) was decomposing quaternions into separate theories of vector algebra and of vector analysis, and this revision came to prevail among mathematicians and physicists. Another collection of algebras developed to refine means of handling systems of linear equations. The first step (1840s) was to introduce determinants, especially to express the formulae for the roots of systems of linear equations. The more profound move of inventing matrices as a manner of expressing and manipulating systems themselves dates from the 1860s. The Englishmen J. J. Sylvester (1814-1897) and Arthur Cayley (1821-1895) played important roles in developing matrices (Sylvester's word). An important inspiration was their study of quantics, homogeneous polynomials of some degree in any finite number of variables: the task was to find algebraic expressions that preserved their form under linear transformation of those variables. They and other figures also contributed to the important theory of the "latent roots and vectors" (Sylvester again) of matrices. Determinants and matrices together are known today as linear algebra; the analysis of quantics is part of invariant theory. On polynomial equations, Lagrange's study of properties of functions of their roots led especially from the 1840s to a theory of substitution groups with Cauchy and others, where the operation of replacing one root by another one was treated as a new algebra. 
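Hamilton's defining relations above are enough to determine the entire multiplication table of the quaternions, a point a short computational sketch can make concrete. The 4-tuple encoding and the helper name qmul below are illustrative conventions of this sketch, not anything from Hamilton or from this entry:

```python
# A minimal sketch: a quaternion a + ib + jc + kd encoded as the tuple
# (a, b, c, d). The product below is forced by Hamilton's relations
# i² = j² = k² = ijk = −1, from which ij = k, ji = −k, and so on.

def qmul(p, q):
    a1, b1, c1, d1 = p
    a2, b2, c2, d2 = q
    return (a1*a2 - b1*b2 - c1*c2 - d1*d2,
            a1*b2 + b1*a2 + c1*d2 - d1*c2,
            a1*c2 - b1*d2 + c1*a2 + d1*b2,
            a1*d2 + b1*c2 - c1*b2 + d1*a2)

i, j, k = (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)

assert qmul(i, i) == (-1, 0, 0, 0)                      # i² = −1
assert qmul(i, j) == k and qmul(j, i) == (0, 0, 0, -1)  # ij = k, but ji = −k
assert qmul(i, qmul(j, k)) == qmul(qmul(i, j), k)       # associative nonetheless
```

Running the sketch confirms that commutativity is the property Hamilton had to give up, while Servois's distributivity and Hamilton's own associativity survive.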
Abel's even younger French contemporary Évariste Galois (1811-1832) found some remarkable properties of substitutions around 1830. This theory of substitutions gradually generalized to group theory. In its abstract form, as pioneered by the German Richard Dedekind (1831-1916) in the 1850s, the theory was based upon a given collection of laws obeyed by objects that were not specified: substitutions provided one interpretation, but many others were found, such as their application to projective and (non-)Euclidean geometries. The steady accumulation of these applications increased the importance of group theory. Other algebras also appeared; for example, one to express the basic properties of probability theory. In analysis the Norwegian Sophus Lie (1842-1899) developed in the 1880s a theory of "infinitesimal transformations" as linear differential operators on functions, and formed it as an algebra that is now named after him, including a group version; it has become an important subject in its own right.
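The abstract view Dedekind pioneered (laws obeyed by otherwise unspecified objects) can be illustrated with the original interpretation: substitutions on the roots of an equation. A minimal sketch, with illustrative names of its own, checking the group laws for all substitutions of three roots:

```python
from itertools import permutations

# A substitution on three roots, written as a tuple: s[i] is where root i is sent.
def compose(s, t):
    """The substitution 'apply t, then s'."""
    return tuple(s[t[i]] for i in range(len(t)))

identity = (0, 1, 2)
group = list(permutations(range(3)))  # all 3! = 6 substitutions

# The group laws hold: closure, identity, inverses (associativity is
# inherited from composition of mappings)...
assert all(compose(s, t) in group for s in group for t in group)
assert all(compose(s, identity) == s == compose(identity, s) for s in group)
assert all(any(compose(s, t) == identity for t in group) for s in group)
# ...but Servois's commutativity fails in general:
s, t = (1, 0, 2), (0, 2, 1)
assert compose(s, t) != compose(t, s)
```

The same assertions could be run unchanged against rotations, matrices, or Lie's transformations; that indifference to the nature of the objects is what made the abstract theory so portable.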

Anti-Imperialism When the Haitian sugar economy collapsed with the slave revolt at the end of the eighteenth century, much of this production shifted to the neighboring island of Cuba. As a result, while other colonial economies stagnated, leading to elite discontent with European rule, the Cuban economy took off, undercutting any impetus for a serious anticolonial movement. The island thus remained a Spanish colony until the end of the nineteenth century. José Martí (1853-1895) perhaps best represents Cuban anticolonial movements. Born to peninsular parents (his father was a Spanish official), he was a teenage rebel who was exiled to Spain for his political activities and later worked in the United States as a journalist. He was killed in battle on 19 May 1895, when he returned to the island to join the anticolonial struggle. Much of Martí's ideology emerged out of the context of nineteenth-century liberalism, but his contact with radical movements in the United States also imbued his anticolonialism with aspects of social revolution. Rather than seeking merely to exchange one elite for another, as had happened when colonialism ended in most other American republics, he wanted true social change. He was an anti-imperialist and a revolutionary nationalist who worked against economic dependency as well as for political independence. Martí, like the Venezuelan independence leader Simón Bolívar (1783-1830) before him and the Argentine-born guerrilla leader Ernesto "Che" Guevara (1928-1967) after him, called for a unified America to confront the common problems left by a legacy of European colonization. After Martí's death, with Cuba on the verge of gaining its independence in 1898, the United States intervened in order to control the economic wealth of the colony for its own benefit and to prevent the establishment of another black republic on the Haitian model. Disguising its efforts as altruism, the U.S. Senate passed the Teller Amendment, which declared that the United States would not recolonize the island. Although this legislation thwarted the imperial intent of the United States to annex the island, the 1901 Platt Amendment declared "that the government of Cuba consents that the United States may exercise the right to intervene for the preservation of Cuban independence, the maintenance of a government adequate for the protection of life, property and individual liberty" (Bevans, pp. 116-117). This led to a unique colonial situation, in which Cuba had a civilian government but not one that could be called a democracy. The island became an extension of Miami, and U.S. intervention promoted and perpetuated corruption, violence, and economic stagnation. This set the stage for the successful 1959 Cuban Revolution, which freed the country from economic colonization, much as independence in 1898 had freed it from Spain's political colonization. After the triumph of the revolution, Cuba became a global leader in postcolonial anti-imperialist struggles. Although the Teller Amendment prohibited the annexation of Cuba to the United States, the legislation stood mute on Spain's few remaining colonial possessions in the Caribbean. Most importantly, this led the United States to occupy the island of Puerto Rico, a territory it continues to hold in the twenty-first century. 
In fact, after Namibia was freed from South African control in 1990, Puerto Rico became the sole remaining item on the agenda of the United Nations' decolonization committee, although anticolonial struggles continue elsewhere, notably in French Polynesia. For the United States, Puerto Rico remains an unresolved and seemingly irresolvable colonial question. In the early twenty-first century the island is an Estado Libre Asociado (literally, Associated Free State, but defined by the United States as a commonwealth), which means that it is an unincorporated territory that belongs to, but is not part of, the United States. This leaves Puerto Rico subject to the whims of the United States, and its residents with few legal avenues through which to address offenses committed against them. As an example of the colonial relationship, residents of the island were made U.S. citizens during World War I so that they could be drafted to fight in Europe, but even in the early twenty-first century they do not have the right to political representation in Washington. However, the economic advantages of their status, including the ability to migrate freely to the United States to work, create a situation in which only a small percentage of Puerto Ricans favor independence for the island; resentment at the island's colonial status is nonetheless widespread and deeply felt. Anticolonial sentiments in Puerto Rico flourished during the second half of the twentieth century, and in part gained a focus around political campaigns to halt U.S. naval bombing practice at Vieques Island. In 1941, with World War II on the horizon, the United States military acquired most of the land at Vieques as an extension of the Roosevelt Roads Naval Station in order to develop a base like Pearl Harbor for its Atlantic fleet. Noise from bombs and low-flying airplanes engaged in practice maneuvers disturbed inhabitants and disrupted the fishing economy. The later use of napalm, depleted uranium, and other experimental weapons left the area heavily contaminated. The imperialist nature of the military's occupation of Vieques quickly gave rise to popular sentiments against the navy's presence and calls for it to leave. Finally, on 19 April 1999, two off-target bombs destroyed an observation post, killing David Sanes Rodríguez, a local civilian employee. This triggered a massive civil disobedience campaign that finally forced the navy to leave Vieques on 1 May 2003. Independence leaders such as Pedro Albizu Campos and Rubén Berríos Martínez provided leadership to the campaigns, seeing Vieques as an important part of an anticolonial and anti-imperialist struggle. Their slogan became "Today Vieques, tomorrow Puerto Rico."

Colonial Origins of the Idea The earliest traces of "anticolonialism" can be found in the documents compiled by scholar-officials working within the various colonial administrations. Specifically, political officers who accompanied the initial military campaigns of conquest and later those within the civil service were among the first to interpret and write about the wide range of responses to colonial operations in the region. Many of these accounts speak of anticolonial resistance as brief interludes or disturbances, mere interruptions to the social order established by the authorities. Within official reports, gazetteers, manuals, and censuses, administrators organized, defined, and made sense of these outbreaks, thereby creating the very categories and perspectives under which "resistance" and "anticolonialism" would eventually be considered. Throughout the region, officials identified key cultural markers such as protective tattooing, charms, and astronomical symbols as part of the "traditional" uniform of resistance, which combined superstitious beliefs and religion in order to appeal to the masses who participated in these movements. Other features included the rebuilding of royal palaces and religious edifices in mountain strongholds that were said to represent cosmological and spiritual power. Case studies demonstrate these similarities in the early minlaung (prince) movements of Burma (1885-1890s), the "save-the-emperor" movements of northern Vietnam (1885-1896), and the Java War (1825-1830). Characteristics of anticolonial resistance were first identified, labeled, and codified by officials whose job was as much to affirm colonial policies as to document and interpret the societies they were charged with administering. More importantly, colonial officials were interested in establishing the causal factors for these disturbances and wrote their reports accordingly, influencing scholars who would later use these sources, their approaches, and their descriptions for their own studies. Reports often stated that these brief instances of violence resulted from irrationality, superstition, gullibility, false prophets, religious fanaticism, and other inherent cultural traits that predictably would endure if not for colonial intervention. It was no surprise that the initial pockets of resistance that faced the Dutch in Java, the British in Lower Burma, and the French in Vietnam would be considered akin both in character and origin to the anticolonial rebellions of the early twentieth century, though the circumstances were considerably different. Thus, officials were charged with finding and naming examples of what was "anticolonial" in Southeast Asia partly in hopes of establishing the difference between traditional Asia and modern Europe. In this manner, the idea of anticolonialism began to take shape along a binary framing that placed Southeast Asians and Europeans at opposite ends, structuring the way in which protest, resistance, and revolt would be studied in the years to come.

Darwin, Spencer, and Evolution Charles Darwin did not use the term "altruism," preferring to use older terms with which he was familiar from his reading of moral philosophy in the 1830s and 1840s, such as "benevolence," "sympathy," and "moral sense" (see Darwin; Richards). In his Descent of Man (1871), Darwin famously developed a group-selection explanation for the apparent self-sacrificing behavior of neuter insects. According to this view, communities of insects that happen to contain self-sacrificers benefit in the struggle for existence at the expense of communities made up of more selfish individuals with which they are in competition. As a result, contrary to the popular caricature of Darwinian nature as dominated by selfishness and competition, Darwin actually argued that benevolence and cooperation are entirely natural—that they are deeply embedded in our biology. The problem of how to account for altruistic behavior, especially in insects, continued to puzzle biologists (see Lustig) and became a central topic in the new discipline of "sociobiology" founded by the entomologist E. O. Wilson in the 1970s. In the English-speaking world of the later nineteenth century, however, it was Herbert Spencer (1820-1903) rather than Charles Darwin (1809-1882) who was celebrated as the leading exponent of the philosophy of evolution. Spencer was also one of the writers most responsible for the spread of the language of altruism (and sociology) from the 1870s onward (see Dixon, 2004). Spencer acknowledged that he had borrowed these terms from Comte. In his Principles of Psychology (second edition of 1870-1872) and Data of Ethics (1879), he developed his theory of how altruistic instincts could evolve and be inherited and how they would increase as social evolution progressed. He denied, however, that by doing so he endorsed Comte's views on philosophy, science, or religion. Indeed, although Spencer agreed with Comte that altruism would increase as societies evolved further, his vision of the ideal future society was in many ways the opposite of the Comtean vision. Whereas Comte envisaged a hierarchical and, in effect, totalitarian society in which individuals sacrificed personal freedom in the interests of order and progress, Spencer hoped for a society in which individual freedoms (and responsibilities) were maximized (see Richards). Spencer's hope was that people would increasingly act in altruistic ways spontaneously and voluntarily, without state intervention. Although Spencer had a very elevated reputation and a wide sphere of influence in Britain and America in the 1860s and 1870s, the scientific rejection of his belief in the heritability of acquired moral and intellectual characteristics, along with the rise of a political consensus in favor of some kind of state provision of welfare, rendered much of his thought untenable by the early twentieth century.

Victorian Agnosticism Herbert Spencer's First Principles (1862) laid the groundwork for the hugely ambitious, multivolume Synthetic Philosophy, finally completed in 1896, which articulated Spencer's vision of how philosophy, biology, sociology, ethics, religion, and society itself needed to be reconceptualized and transformed in the light of the doctrine of evolution (see Peel). The first part of the First Principles, entitled "The Unknowable," was considered the Bible of agnosticism for the rest of the Victorian period. Spencer argued that science and religion could be reconciled if they recognized that both, ultimately, were concerned with realities whose foundations were beyond the grasp of human knowledge. However, while science could get on with measuring, analyzing, and interpreting observable phenomena, nothing was left for theologians but total silence in the face of the unknowable. There was no role for revelation in Spencer's proposed scientific and agnostic religion, and Mansel's conservative critics saw in Spencer's system exactly the conclusions they had feared would follow from Mansel's teachings on the impotence of human reason in the theological realm. Although Spencer was later generally considered to be the leading representative of agnosticism, the terms agnostic and agnosticism did not themselves come into use until about ten years after the publication of the First Principles. The terms gained currency through their use by Spencer but also by the theologian and journalist R. H. Hutton, the editor of the Spectator in the 1870s, and the lapsed Anglican minister Leslie Stephen, who, after leaving the Church of England, wrote An Agnostic's Apology (1876). Although he made some use of the term in his writings from the 1870s onward, it was only in 1889 that Thomas Huxley revealed himself as the inventor of the terms agnostic and agnosticism and explained how and why he had come to coin them (Lightman, 2002). One of Huxley's earlier essays that gained him much attention (and much criticism) was entitled "On the Physical Basis of Life" (reprinted in Collected Essays, vol. 1). This essay, based on a lecture delivered in Edinburgh in 1868, just a year before he coined the term agnostic, is one of the most helpful illustrations of the essence of Huxley's agnosticism. Although the essay was criticized for espousing a materialistic view of life (the idea that all living things are made up of the same substance—"protoplasm"), in fact it defended a nescient or radically empiricist understanding of science as producing nothing more than a set of symbols with which to describe and organize observable phenomena. Huxley rejected materialism on the grounds that it was impossible for empirical science to determine anything at all about the nature of any putative substance or substances underlying the phenomena or of any supposed laws or causes. "In itself," Huxley said, "it is of little moment whether we express the phænomena of matter in terms of spirit; or the phænomena of spirit in terms of matter: matter may be regarded as a form of thought, thought may be regarded as a property of matter—each statement has a certain relative truth" (1893-1894, vol. 1, p. 164). (The materialistic terminology was to be preferred, however, for the pragmatic reason that it connected with other areas of scientific investigation, which were expressed in the same terms, and for the reason that spiritualistic terminology was entirely barren.) 
Huxley denied that this was a "new philosophy" and especially that it was the invention of the positivist Auguste Comte (1798-1857), as some supposed. Comte, he said, lacked entirely "the vigour of thought and the exquisite clearness of style" of the true author of this philosophy, "the man whom I make bold to term the most acute thinker of the eighteenth century—even though that century produced Kant" (1893-1894, vol. 1, p. 158). The man Huxley had in mind, of course, was Hume. The closing pages of "On the Physical Basis of Life," then, show several important things about Huxley's agnosticism. They show that Huxley felt the need for a new label—agnostic—not in order to distance himself from Christianity (everyone already knew he was an opponent of theological orthodoxy) but primarily in order to repudiate the labels materialist, atheist, and positivist. They also show that Huxley considered Hume to be at least as important as Kant, if not more important, in the historical pedigree of agnosticism. And finally, they show that agnosticism involved admitting ignorance about the fundamental nature of the physical universe as well as about the existence and attributes of the divine.

Historical and Intellectual Context African-American studies academic units came into existence as a result of great political pressure on European institutions of higher learning in a demand for space for the African voice and experience in the late 1960s, during the Black Power movement. No longer satisfied to be culturally disenfranchised and to feel alienated from the classroom (Asante, 1990), African-American students and community activists brought to the fore of the discussion the question of educational relevance for black people, arguing for a culturally inclusive and sensitive curriculum apt to produce scholars in tune with and committed to the betterment of their communities (Karenga). One of the major characteristics of black studies, therefore, has been a dual concern for academic matters and the life conditions of African-Americans, with African-American studies scholars expected to be scholar-activists. However, if the political mandate of African-American scholars is clear, their intellectual mission has, on the other hand, been clouded with conceptual confusion since the very beginning. Particularly vexing has been the issue of the relationship between African-American studies and the Western academe, with much debate over the status of African-American studies as a full-fledged independent discipline, or rather a field of studies, devoted to the Black experience and yet operating within the confines of Western intellectual thought. At the core of this issue, however, lies the question of Eurocentrism, with the degree to which one seems willing to challenge European intellectual hegemony determining one's position in the debate over the intellectual status of African-American studies. Eurocentrism is understood as the interpretation of all reality from the Western perspective, especially as it emerged during the European Age of the Enlightenment. That perspective developed both internally, with the development of a metaparadigm specific and relevant to Europe; and externally, in opposition to "others," especially African people. Hence there are at least four assumptions of that European metaparadigm that have played a major and negative role as far as black people are concerned: (1) all human beings evolve along the same line; (2) the European experience is universal; (3) Europeans are superior; and (4) "others" are defined by their experiences with Europeans. In other words, the European metaparadigm rests among other things on the belief in the superiority and universality of the European experience. Indeed, in that linear and evolutionary schema of thought, the West claims that when it talks about itself, it is also ipso facto talking about all human beings. The history of all women, men, and children in the world supposedly naturally coincides with that of Europeans. The latter are thus implicitly or explicitly held to be the universal norm by which African intellectual, cultural, and social "progress" will be evaluated. However, if all human beings share a common essence, it is also obvious that they have not all reached the same stage of development. Indeed, it is rather clear, from reading European writers, that Europe precedes the rest of humankind, and time after time it is suggested that Africans must emulate Europeans in order to put an end to their inferior condition. 
The expected outcome of such emulation has been a process of mental conversion (Mudimbe, 1988), predicated upon the belief that only through a careful imitation of Europeans would Africans improve their lot. While the ontological reduction of the colonized had been well understood as a necessary part of colonialism, the implications of the conversion process, on the other hand, had not been fully appreciated. This may be precisely because the early African critiques of European colonialism (e.g., Frantz Fanon) still functioned within a fundamentally European conceptual framework, such as Marxism. Hence what was challenged was not Western modernity per se, but its abusive practices. Europe's tacit advancement of its own culture as some "no-man's cultural land"—its implicit claims to cultural neutrality and universality—was rarely questioned, for it was not construed as problematic. Such an approach, which was to be expected during those early days, would not allow one to understand the colonization process as the systematic imposition of the European worldview, grounded in a specific time and place yet parading as universal, on people whose cultural and historical experiences were quite different.

Neocolonialism By the 1820s, most of Latin America had gained political independence from its colonial masters. With Iberian mercantile restrictions gone, northern European (and particularly British) capital flooded the region. As critics have noted, a legacy of colonization was a blocking of moves toward industrialization, which would have represented little gain for colonial powers. This trend continued with the British (and later the United States) extracting raw materials from and importing finished goods into the region. The infrastructure, such as the railroad systems, was designed to transport products from mines and plantations to seaports rather than to integrate a country. The economic benefits of this trade accrued to foreign powers, with wages and living standards remaining depressed as resources were drained away from the domestic economy. Neocolonialism also led to cultural shifts. For example, predominantly Catholic Latin American countries implemented freedom of religion in order to encourage foreign investment from Protestant powers. Despite formal independence, external economic forces determined many of the domestic policies in Latin America. This irony has come to be known as neocolonialism. Nineteenth-century examples of neocolonialism include the export of Peruvian guano and Chilean nitrates, which fueled an agricultural boom in Europe. Neocolonialism, and Latin America's subsequent falling behind relative to economic growth in the northern industrial economies, was neither inevitable nor the only possible option. In The Poverty of Progress, E. Bradford Burns points to Paraguay as a viable example of autonomous economic development. The country's leaders eliminated large estates and emphasized domestic food production, and they restricted foreign penetration of the economy. Rapid economic development without foreign involvement alarmed the elitist governments in the neighboring countries of Argentina, Brazil, and Uruguay, which feared the model Paraguay offered to the poor in their own countries. Their opposition led to the War of the Triple Alliance (1864-1870), which devastated Paraguay and destroyed this alternative model to neocolonialism. The concept of formally independent countries that remained economically dependent on outside powers was first articulated in Marxist circles in the 1920s, though the term neocolonialism was not introduced until the 1960s. It has always been closely associated with anti-imperialism, as was demonstrated at the 1966 Tricontinental Conference in Havana, Cuba, which linked anticolonial struggles in Asia, Africa, and Latin America. Although U.S. neocolonial control is largely a twentieth-century phenomenon, it is rooted in the 1823 Monroe Doctrine, which declared Latin America to be part of the U.S. imperial sphere of influence.

Social Psychology, Sociobiology, and Altruism since the 1960s Scientific research into altruism has markedly increased since the 1960s. During the 1970s, "helping behavior" and the problem of the "unresponsive bystander" were among the most popular topics in social psychology (see Howard and Piliavin; Latané and Darley; Wispé). Later, C. Daniel Batson stimulated considerable discussion among social psychologists with a series of experiments designed to establish that some helping behavior is genuinely altruistic in motivation, explaining it as the product of empathy (see Batson). Others have preferred more egoistic hypotheses, such as the theory that helping behavior is undertaken in order to alleviate the helper's own distress at the suffering of the person to be helped.

In the field of evolutionary biology, 1975 saw the publication of E. O. Wilson's controversial Sociobiology, which set out to explain all social phenomena in terms of underlying biological mechanisms. The following year Richard Dawkins's highly successful popular science book The Selfish Gene was published, drawing on mathematical models developed by William D. Hamilton to explain altruistic behaviors in terms of their benefits to genetically related individuals. Central to both books was the puzzle of how self-sacrificing individuals could ever have been successful in the merciless struggle for existence: in short, how could Darwinian evolution produce altruism? Dawkins's straightforward answer was that it could not. According to Dawkins, human beings and other animals are blind robots programmed by their "selfish genes," and any actions that on the surface seem to be examples of "altruism" are in fact driven by the interests of the genes. The existence of apparently altruistic impulses could thus be explained by the fact that an individual who acts in the interests of close relatives (who share many of the same genes) increases the chances that copies of the individual's genes will persist into the next generation. Since there is no genuine altruism in nature, Dawkins concluded, the most we can do is to try to teach our children altruism in the hope that they can rebel against their genetic inheritance.

Scientific, philosophical, and theological critiques of Dawkins's ideas have been abundant. Some have argued that the idea that genes can have "interests" or be described as "selfish" is misleadingly anthropomorphic. Dawkins has replied that these are only metaphors, but ones that help to communicate the fact that the real business of evolution goes on at the genetic level. Others have questioned whether it has really been established that selection operates exclusively, or even primarily, at the genetic level rather than at the level of individuals, groups, or species (see Sober and Wilson). And many commentators have found the view of human nature implicit in The Selfish Gene unacceptably cynical, fatalistic, and pessimistic.

Since the 1990s, although academic discussions have moved on from the agenda set by sociobiology and The Selfish Gene, the topic of "altruism" has continued to attract a great deal of attention from a wide range of disciplines, including theology, philosophy, evolutionary biology, economics, social psychology, and sociology (see Batson; Mansbridge; Monroe; and, for a particularly helpful collection, Post et al.).
The same central questions about what science, religion, and philosophy each have to contribute to an understanding of human altruism, and about their ethical and political implications, continue to be vigorously debated.
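The kin-selection models referred to above reduce to a compact inequality now generally known as Hamilton's rule. The statement below is the standard textbook formulation, added here for clarity; it is not quoted from this entry or its sources. An altruistic behavior is favored by selection when

\[ r\,b > c \]

where \(r\) is the coefficient of genetic relatedness between the altruist and the beneficiary, \(b\) is the reproductive benefit to the beneficiary, and \(c\) is the reproductive cost to the altruist. On this account, for example, aiding a full sibling (\(r = 1/2\)) is favored whenever the sibling's gain is more than twice the helper's loss.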

Critical Reflections According to its representations, America has moved from representing Europe's past to representing Europe's future, and from the epitome of nature to the epitome of technology, polar opposite views. Four points might be noted, however, that raise questions about the validity of these representations. First, descriptions of America have been fantastical from the beginning; they are inaccurate and often intentionally so. Second, although twentieth-century thinkers blame the United States for the technologization of the world, the technological attitude long predates the founding of the United States. Columbus and the conquistadores neither saw the New World for what it was nor had any desire to do so. Rather, they sought to exploit resources and people, and this is the essence of the technological attitude, the attitude that some claim began only with the United States. Third, twentieth-century thinkers miss the mark in blaming America for problems that have to do with modernity itself. Because the United States was created from scratch by colonists with minimal feudal baggage, it emerged as perhaps the purest embodiment of modern values. But there are multinational corporations in Europe and elsewhere around the world, and most people, wherever they live, desire the standard of living and freedom that the United States—and many modern countries—have. So while there is a certain justification for seeing the United States as embodying modernity, it is not modernity's sole embodiment. Fourth, there is a fundamental continuity in the views about America. The Indians have been described, on the one hand, as naïve, innocent, childlike, and simple, and on the other as brutish, vulgar, shallow, stupid, and lacking spirituality. These are essentially the same charges that Europe and the world leveled at the United States throughout the nineteenth and twentieth centuries. The United States might be all of these things, although probably not more than most countries and possibly less so than many. But the fact that ways of life as opposite as those of the Indians and the United States are described in fundamentally the same terms indicates a problem in the substantive nature of the representations.

As an epilogue, it is worth noting briefly a postmodern view of America. Postmodern thinkers reject the idea of any humanly knowable truth and choose instead to play with images, which they claim are all we are left with. The French postmodern thinker Jean Baudrillard has done this with the United States. In a book entitled America (1986; English translation 1988), Baudrillard writes, contradictorily, "For me there is no truth of America" and "I knew all about this nuclear form, this future catastrophe when I was still in Paris, of course." He also mixes all of the main images of America, describing the United States both as "the original version of Modernity" and as "the only remaining primitive society." For him, America is the "Primitive society of the future." He combines five hundred years of images of America in a clever fashion.

Algeria Provided a certain flexibility is adopted, it is possible to identify the major templates of anticolonial resistance, which vary according to the nature of the colonizing process. The Algerian case is probably the most extreme because of the extent of the devastation caused by the colonization process over a period of some 130 years.

In the months after the conquest of the city of Algiers in July 1830, the French military began to encourage the settlement of French colons in the city's rural hinterland. At the time, Algeria was, if only nominally, an Ottoman province and had no developed political structures. Local leaders in the west of the country turned first to the Moroccan sultan, but the French warned him not to interfere. The leaders then turned to the Sufi orders, the only bodies with an organizational structure, and Muhi al-Din, the leader of the Qadiriyya order, and his shrewd and energetic son 'Abd al-Qadir were asked to lead a tribal jihad against the French. Between 1832 and 1844 'Abd al-Qadir managed to keep the French at bay with an army of about ten thousand. Initially, he achieved this by making agreements with the French that recognized his authority over certain parts of the country, but by the 1840s the French had decided on a policy of total subjugation, and 'Abd al-Qadir, whose Moroccan allies were defeated at Isly in 1844, eventually surrendered in 1847. By this time the European population had reached over 100,000, living mostly in the larger towns. In the 1840s the French had also begun a policy of wholesale land confiscation and appropriation, and a number of local risings took place in protest. The settlers had influential allies in Paris, and throughout the nineteenth century the indigenous population faced the gradual erosion of most of its rights. The last major act of resistance until the war of 1954 to 1962 was the rebellion in Kabylia in 1870 to 1871, led by Muhammad al-Muqrani. For a while al-Muqrani's army of some 200,000 controlled much of eastern Algeria, but it was no match for the better-equipped French troops. After the defeat of the rebellion (al-Muqrani was killed in battle in May 1871), the local communities involved were fined heavily and lost most of their tribal lands.

The Algerian national movement was slow to develop in the twentieth century. The tribal aristocracy had been defeated, and no former indigenous governing class or emerging business bourgeoisie existed (as they did in, for example, Morocco, Tunisia, Syria, and Lebanon). Some Algerians felt that France had brought them into the modern world and wanted to become more French—that is, to enjoy the same rights as the French in Algeria without having to give up their Islamic identity. This tendency, generally called assimilationist, was represented by Ferhat Abbas, who sought to become a member of the French Chamber of Deputies. The first strictly nationalist movement, the Étoile Nord-Africaine (later the Parti du Peuple Algérien), which initially had links to the French Communist Party, was founded by Messali Hadj in 1926 and recruited among Algerian workers in France. Yet another tendency was represented by 'Abd al-Hamid Ibn Badis (1889-1940), who sought to reform Algerian popular Islam through the Association of 'Ulama', asserting the Muslim nature of Algeria. From the 1930s onward, rapid urbanization fueled Algerian resistance to France.
By the end of World War II there was some hope among moderates in both France and Algeria that compromises could be worked out to deflect violent nationalism, but the Algerian European community's dogged insistence on maintaining its privileges meant that these hopes soon evaporated. Ferhat Abbas's movement soon became insignificant, and Ibn Badis's death meant that the Association of 'Ulama' lacked influence, leaving Messali Hadj dominating the field, with supporters among Algerian workers in France as well as in Algeria. However, his organization was regarded as too moderate, and a splinter group, the Organisation Secrète, seceded from it in the mid-1940s. Its members included such major revolutionary figures as Ahmed Ben Bella, Hocine Aït Ahmed, Mourad Didouche, Mohammed Boudiaf, and Belkacem Krim. This group subsequently launched the Algerian Revolution, or war of national liberation, on 1 November 1954. The war lasted until 1962, when Algeria became independent; over those eight years, between 1 million and 1.5 million Algerians and some 27,000 French were killed. The war proved intensely divisive, especially as more Algerian Muslims fought as soldiers or harkis on the French side than in the Algerian army.

Linguistics Linguists study the primary medium by which culture is transmitted: language. The discipline of linguistics—at first called philology—dates from approximately the same period as biological anthropology and archaeology, the late eighteenth century. Sir William Jones (1746-1794), a jurist and student of Asian languages assigned to the British East India Company's outpost at modern-day Calcutta, is generally credited with founding the discipline. In 1786, in the course of a speech to the Bengal Asiatic Society, of which he was the founder and president, Jones outlined for the first time the family-tree model of linguistic relationships, focusing on what would soon be called the Indo-European language family. Within a generation, comparative philology (now called historical, or diachronic, linguistics) was an established discipline. Scholars such as Jacob Grimm (1785-1863), Franz Bopp (1791-1867), and August Schleicher (1821-1868) had reconstructed what appeared to be the Proto-Indo-European lexicon. Eventually, other language families, such as Sino-Tibetan (Chinese, Tibetan, Thai, Burmese, etc.) and Hamito-Semitic (Ancient Egyptian, Hebrew, Babylonian, Arabic, and other Near Eastern languages), also began to be studied from this perspective. Franz Boas (1858-1942), in addition to being a pioneer sociocultural anthropologist, was among the first to apply the comparative method to the study of Native American languages.

In the early decades of the twentieth century, thanks primarily to the efforts of a brilliant Swiss linguist, Ferdinand de Saussure (1857-1913), a new structural approach to the study of language emerged, one that emphasized synchronic studies rather than the historical focus that had dominated the previous century. De Saussure made a basic distinction between what he called la langue, the basic rules that govern the grammar of a given language, and la parole, the specific speech patterns that occur at any given instant. The linguist's job is to elicit the nature of la langue by recording and analyzing examples of la parole. This approach soon led to two concepts that still dominate anthropological linguistics: the phoneme and the morpheme. A phoneme is a minimal sound feature of a language that signals a difference in meaning; a morpheme is an ordered arrangement of such speech sounds that carries an indivisible meaning. Thus the sounds represented by the English letters d, o, and g are phonemes, while the word dog is a morpheme. Combining the same phonemes in reverse order produces a wholly different morpheme, god. Structural linguists are also concerned with syntax, the arrangement of morphemes into phrases and sentences, and semantics, how meanings are structured by morphemes and by their forms, position, and function in sentences. Grammar is the entirety of a language's phonological, morphological, syntactic, and semantic rules that enable humans to communicate and transmit culture.

In the course of the last few decades, linguists have debated the extent to which there are universal, innate features that form the fundamental structure of all human languages. The U.S. linguist Noam Chomsky has argued in favor of this proposition. In Syntactic Structures (1957), Chomsky suggested that all human beings have the innate ability to generate every possible sentence in their language. This approach to the study of language is called transformational-generative grammar (TG). However, not all linguists accept this model.
A great many hold that, like culture, language is infinitely variable and that there are no proven universal features. The relationship between language and culture has also been a major concern among linguists, especially anthropological linguists. Two pioneers in the study of this relationship were Edward Sapir (1884-1939) and his student Benjamin Lee Whorf (1897-1941), who suggested that there is an intrinsic connection between the fundamental features of a culture and the structure of its language. For example, as Whorf pointed out, the Hopi Indian language does not mark verb tense, a feature that Whorf said is reflected in the absence of a linear time concept in Hopi culture: all events are intrinsically linked to one another, and life simply unfolds. Although by no means universally accepted by contemporary anthropologists—some critics object that the approach is tautological and that there is no evidence to support the priority of language over culture—the Sapir-Whorf Hypothesis continues to influence anthropological thinking.

After the publication of books such as Sapir's Language (1921) and Leonard Bloomfield's (1887-1949) book of the same title (1933), linguistics developed into a separate discipline dedicated to the scientific study of language; after the publication of Chomsky's Syntactic Structures, it also developed connections to the related fields of cognitive science and cognitive psychology, as well as to some aspects of computer science (artificial intelligence, machine translation). The development of TG grammar produced an explosion of research in both synchronic and diachronic linguistics that continues to extend our understanding of language and mind and of how we communicate.

Anthropological linguistics exists as a separate but related discipline that emphasizes the relationship between language and culture but adopts a more holistic and, often, humanistic approach than, for example, cognitive psychology. Anthropological linguists study a variety of language practices, ranging from the relationship between language and music within specific cultures to children's use of language in play. A major focus that distinguishes linguistic anthropology from other branches of linguistics is its attention to questions of politics, power, and social inequality as these aspects of culture affect language. The study of language ideologies emphasizes the different statuses of particular language practices, in contexts ranging from a bank officer turning down a loan applicant, to political speeches, to bilingual and bicultural settings (for example, the study of "Spanglish," forms of language developed by Americans who speak both Spanish and English), to the controversies over the varieties of English spoken by African Americans.
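Two structural ideas in this entry, the phoneme/morpheme distinction and Chomsky's notion of a grammar that can generate unboundedly many sentences, can be made concrete with a small computational sketch. The snippet below is purely illustrative: the letter-based "phonemes," the vocabulary, and the rewrite rules are invented for the example, and a random expansion of a context-free grammar is only a loose stand-in for the transformational-generative formalism itself.

```python
import random

# Toy illustration of the phoneme/morpheme distinction and of a
# generative grammar. Everything here is invented for the example:
# letters stand in for phonemes, and the rules below are not a real
# fragment of English grammar.

# A morpheme modeled as an ordered sequence of phonemes.
dog = ("d", "o", "g")
god = tuple(reversed(dog))   # the same phonemes in reverse order
assert dog != god            # order alone distinguishes the two morphemes

# A minimal phrase-structure grammar: a finite set of rewrite rules
# that, through recursion in NP, can generate an unbounded set of
# sentences (a loose stand-in for the generative idea, not Chomsky's
# transformational formalism).
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"], ["the", "N", "that", "VP"]],  # recursive rule
    "VP": [["V", "NP"], ["V"]],
    "N":  [["dog"], ["god"], ["linguist"]],
    "V":  [["sees"], ["sleeps"]],
}

def generate(symbol="S"):
    """Expand a symbol into a list of words by random rule choice."""
    if symbol not in GRAMMAR:  # terminal: an actual word
        return [symbol]
    expansion = random.choice(GRAMMAR[symbol])
    return [word for part in expansion for word in generate(part)]

print(" ".join(generate()))  # e.g. "the linguist that sleeps sees the dog"
```

The recursive NP rule is the point of the sketch: a finite rule set nonetheless describes infinitely many well-formed strings, which is the core of the claim about speakers' generative capacity.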

Ordinary Language Philosophy Although there was much disagreement among the logical empiricists, their position constituted an immensely influential antimetaphysical paradigm for mid-twentieth-century philosophers, especially after the rise of the Nazis had led to the emigration of the group's leading philosophers from Central Europe to the United States. While traditional philosophers complained, quite rightly, that the antimetaphysical rhetoric of the position concealed its own metaphysical assumptions, two other lines of criticism were especially important for the subsequent development of analytical philosophy. In Britain, especially after 1945, the logical empiricists' emphasis on logical analysis was felt to be excessively restrictive. Defenders of "ordinary language philosophy" such as J. L. Austin and Peter Strawson argued that formal logic does not adequately capture the complex conceptual structures of our thought and language, and thus that a much more heterogeneous and informal approach to conceptual analysis is required. This work led to the development of a variety of approaches to the study of language, especially speech act theory, which treats speech as a kind of action and therefore conceives of its meaning in the light of the things speakers do by means of their speech acts (for example, making a promise or naming a child). At much the same time Wittgenstein's later Philosophical Investigations (1953) was published, with a similar emphasis on the need to understand our ordinary "language-games" instead of relying on formal logic to capture the structure of thought. One of the most challenging features of Wittgenstein's later investigations was his critical discussion of psychological concepts, and this, together with other work, has helped to direct recent analytical philosophers at least as much to the philosophy of mind as to the philosophy of language.

Quine The other main criticism of logical empiricism came from the American philosopher Willard Van Orman Quine (1908-2000), who argued that the logical empiricists had been mistaken in regarding logic as "analytic"—that is, true by definition. Quine argued that logic is of the same type as other beliefs: it is an element of the web of belief through which we make sense of our experience as experience of an objective world. Hence logic is not analytic, since it concerns the world, and it is not a priori, since it is revisable in the light of experience. Quine's arguments remain disputed, but his work has certainly helped to encourage philosophers to address broader disputes in the natural sciences and other areas; there is no enclosed domain for a priori logical and conceptual analysis.

Some critics, most notably Richard Rorty, argue that it follows that there is now nothing worth calling "analytical philosophy." But these claims are exaggerated. First, although Quine was a critic of the analyticity of logic, he was a distinguished logician and used logical analysis throughout his philosophy; his practice shows that analytical philosophy does not depend on the analyticity of logic. Second, although Quine's arguments call into question the "linguistic" conception of the a priori as analyticity, it is widely accepted that some distinction between the a priori and the empirical has to be made if we are to be able to reason coherently; and as long as that distinction is in place, analytical philosophers can draw on it to characterize the significance of their conclusions.
Analytical philosophy today, therefore, continues the tradition inaugurated by Russell and Wittgenstein at the beginning of the twentieth century. It is not "a body of doctrine" but a "method," typically "logical-analytic" though often informal, of using reasoning to capture and criticize conceptual structures. As such, one finds it regularly employed across the whole spectrum of contemporary philosophical debate, by feminists and political philosophers as much as by metaphysicians and epistemologists.

Peasant Studies and the Idea of Anticolonialism With the shift toward an "autonomous" reading of anticolonialism came a connected interest in peasant society and consciousness. Pathbreaking works, such as James C. Scott's The Moral Economy of the Peasant, applied models for studying peasants to the anticolonial movements of the 1930s (the Saya San Rebellion in British Burma and the Nghe-Tinh Uprising in French Vietnam) in order to understand not just how "Southeast Asians" might have articulated and understood revolt but also in what specific ways peasants would have expressed and made sense of the new colonial order. The work of Scott and others suggested that the economic conditions of the 1930s directly challenged the peasantry's locally defined threshold for subsistence, resulting in the widespread rebellions and resistance that occurred throughout the region. Peasant studies also tended to concentrate on economic causal factors, leading scholars to suggest possible connections between anticolonial rhetoric and the new communist influences that were slowly becoming a part of these and later nationalist movements. Yet peasant studies also fed an emerging interest in "everyday" forms of resistance and "avoidance" protest, focusing on how peasants and communities may have expressed anticolonial sentiment on a daily basis, as opposed to the larger and less frequent rebellions that officials and scholars had grown accustomed to studying. Anticolonial behavior could be expressed through sabotage, flight, foot-dragging, and other forms of self-preservation and protest directed against authority and/or the colonial state. In a fundamental way, the influence of peasant studies upon the idea of anticolonialism challenged for the first time some of the categories and foci of colonial officials by momentarily shifting attention away from the major rebellions and revolts to the everyday behavior of Southeast Asians. The breadth of scholarship generated by this focus continues to influence the field in the early twenty-first century, by which time the focus on the peasantry had broadened to include minority groups, women, and ethnic communities involved in challenges to the state and its apparatus.

Postcolonial Studies and Anticolonialism Scholars in the early twenty-first century have returned to the idea of anticolonialism, armed with new perspectives and interdisciplinary approaches. As colonialism continues to challenge scholars, many in the academy have been inspired by suggestions that "knowledge" and "power" are closely connected, inspiration that has resulted in studies attempting to show how "knowledge" about Southeast Asia reveals something about the contexts in which it was produced. Invariably, attention has returned to those early colonial official-scholars who first began collecting, cataloguing, inventorying, and labeling what they considered Southeast Asia to be. Following Edward Said's critique of Orientalist knowledge production, scholars have demonstrated not only that this construction of Southeast Asian culture by colonial administrators represented European images of the "Orient" but also that it embodied an underlying "power" to say what was and what was not "Southeast Asia."
Applying these approaches to the study of resistance and protest, scholars have made clear that the very categories that define resistance, the rebel, the criminal, and anticolonialism itself were produced in particular contexts, and that these categories reveal as much about the colonizer as about perceptions of anticolonialism. Research directed at prisons, anticolonial legislation and law, and criminality has become a focus of study, demonstrating how colonial administrations defined Southeast Asian anticolonialism to fit, serve, and respond to the needs of counterinsurgency policies and the maintenance of colonial order. Where the study of anticolonialism once shed light on forms of Southeast Asian culture, it is now redirected toward the forms of colonial knowledge and counterinsurgency.

First reactions to the United States. The United States was formed in a rebellion against England in 1776. Its revolution was the first successful modern revolution in that it was inspired and justified (at least in part) by philosophical doctrine. The United States' Declaration of Independence invokes philosophy when it argues that "all men are created equal" and endowed with "inalienable rights" such as the rights to "life, liberty, and the pursuit of happiness." Government exists only to secure these rights, and any government that does not secure them is deemed illegitimate. The founders of the United States wrote a Constitution to secure these rights, based on limited government and the separation of public and private spheres. At a time when no country on earth was based on the consent of the governed, the success of American democracy proved to the modern world that democratic and representative government could exist.

The relationship between the Old and New Worlds (and the two images of America) is intertwined and reciprocal. The American Revolution marked the first major step in the collapse of the European empires founded after Columbus discovered the New World. The revolution was inspired in part by European philosophical doctrines based on natural rights, which had themselves been partly inspired by the original inhabitants of America. Ironically, the political experiment in the name of natural rights then helped destroy the "natural" people who had helped inspire the United States' philosophical forefathers. The American Revolution in turn helped inspire the French Revolutionaries and other lovers of liberty throughout the world. The complex nature of this relationship is seen in the following passage from the essay "On the Influence of the American Revolution on Europe" by the French philosopher Marie-Jean Caritat, marquis de Condorcet (1743-1794), in which he quotes Voltaire:

"The human race had lost its rights. Montesquieu found them and restored them to us" (Voltaire). It is not enough, however, that these rights be written in the philosophers' works and engraved in the heart of virtuous men. It is also necessary that the ignorant or feeble man be able to read them in the example of a great people. America has given us this example. Its Declaration of Independence is a simple and sublime exposition of these rights, so sacred and so long forgotten. Among no nation have they been so well known, or preserved in such perfect integrity.

The reciprocal relationship is evident: it moves from Montesquieu and Voltaire, who had been partially inspired by America's original inhabitants, to the Declaration of Independence, and then back to Condorcet, who authored France's Constitution of 1793. Condorcet's praise of America was typical of the Enlightenment philosophes. The immediate reaction to the American Revolution among Enlightenment thinkers was one of enthusiastic praise. In his popular pamphlet "Observations on the Importance of the American Revolution and the Means of Making It a Benefit to the World," Richard Price (1723-1791) writes, "I see the revolution in favor of universal liberty which has taken place in America; a revolution which opens a new prospect in human affairs, and begins a new era in the history of mankind." Given the unprecedented liberties guaranteed in America, Price is hopeful, nay certain, that liberty will soon spread throughout the world, if unchecked by tyrannical governments.
He says the revolution will "raise the species higher" and compares its effect to "opening a new sense." Indeed, he goes so far as to suggest that "next to the introduction of Christianity among mankind, the American revolution may prove the most important step in the progressive course of human improvement." So many hopes has he pinned on America that "perhaps there never existed a people on whose wisdom and virtue more depended; or to whom a station of more importance in the plan of Providence has been assigned." Similarly, Anne-Robert-Jacques Turgot (1727-1781), whose brief stint as finance minister in France marked the last serious attempt at reform before the French Revolution, says in a "Letter to Price" that America is "the hope of the world" and should "become a model to it."

The Enlightenment thinkers did not think America was perfect; slavery was its greatest flaw. They understood the difficulties of eradicating this execrable institution and argued that America would be judged by the manner in which it eliminated slavery as circumstances allowed. The great strengths of America, however, more than outweighed its imperfections. Enlightenment leaders praised the numerous liberties in the United States, including freedom of the press, speech, conscience, and religion. Moreover, America was seen as an inspiration for the world. As Condorcet writes, it is an example "so useful to all the nations who can contemplate it"; "it teaches them that these rights are everywhere the same"; "the example of a free people submitting peacefully to military, as to civil, laws will doubtless have the power to cure us." Europe developed these Enlightenment ideas but, because of its powerfully entrenched institutions, could not act on them. The Enlightenment philosophes, however, thought that the example of America would inspire the deeds that their words could not. In fact, they were right: the American Revolution inspired the French Revolutionaries in 1789, and it has continued to inspire revolutionaries throughout the world.

The Latin Middle Ages
In the twelfth century, Europeans such as Gerard of Cremona, Daniel of Morley, and Robert of Ketton began to translate more than seventy Arabic texts into Latin, introducing Europeans for the first time to the alchemical corpus and the Arabic term al-kimiya, which became the Latin alquimia, alkimia, alchimia, or alchemia. By the end of the century, translations from Morienus, the Corpus Gabirianum, and al-Razi, as well as a host of other spurious alchemical texts, acquainted Europeans with the central figures, techniques, and ideas of the Greek and Arabic alchemical traditions. A vibrant European technical literature emerged, dealing with alchemical techniques for dye-making, distilling, metallurgy, mineralogy, and, of course, the transmutation of metals. Alongside this practical literature, European alchemists writing in Latin also continued to develop the theoretical foundations of their art. Around 1250 the prominent philosopher Albertus Magnus (c. 1200-1280) legitimated scholarly interest in alchemy when he praised alchemists' ability to imitate nature in his De mineralibus (On minerals). The late-thirteenth-century Summa perfectionis magisterii of Pseudo-Geber (or "the Latin Geber," likely the Italian Franciscan Paul of Taranto) replaced the mercury-sulfur theory with the "mercury alone" theory, which held that mercury was the fundamental component of metals, while sulfur was a pollutant. Drawing on Aristotle, medieval natural philosophy, and medical theory, Pseudo-Geber also articulated a corpuscular theory of matter, positing that all matter is composed of small particles.

In the thirteenth century a debate emerged around alchemy's legitimacy and its relationship to other fields of knowledge. The central issue was whether alchemy, as a form of human art or technology, was capable of successfully imitating or even surpassing nature. Responding initially to ibn Sina's famous rejection of transmutation (often given greater authority by its erroneous attribution to Aristotle), scholars such as Roger Bacon (c. 1220-1292) and Paul of Taranto forcefully argued that true alchemists could indeed use their knowledge of metals to effect real transmutations. In the late thirteenth and fourteenth centuries Thomas Aquinas, Giles of Rome (Aegidius Romanus), and the inquisitor Nicholas Eymeric formulated theological objections to alchemy, arguing that the power to transmute species was reserved for God alone. This religious critique reached its peak around 1317 with Pope John XXII's condemnation of those alchemists who "promise that which they do not produce." Although this papal bull's primary target was alchemical counterfeiting of metals, it sanctioned the increasingly vociferous backlash against alchemists' claims of power over nature. Still, alchemy continued to flourish in the fourteenth century, as evidenced by the popularity of the Franciscan Johannes de Rupescissa's treatise on the quintessence, or inner essence, of all matter, and by a spate of alchemical texts spuriously propagated in the fourteenth century under the name of the thirteenth-century Catalan physician Ramon Lull.

Afrocentricity as the African-American Studies Metaparadigm
The implications of Afrocentricity for African-American studies have been considerable. Indeed, Asante argues that only when African-American studies scholars center themselves mentally and intellectually in the African cultural and historical experience will genuine African-American studies come into existence. Until then, Asante maintains, Eurocentric studies of African people and phenomena will continue to parade as African-American studies, with the latter existing only as a subfield of European studies. First, Afrocentricity insists, it must be realized that any idea, concept, or theory, no matter how "neutral" it claims to be, is nonetheless the product of a particular cultural and historical matrix. As such, it carries specific cultural assumptions, often of a metaphysical nature. Hence to embrace a European theory or idea is not as innocent an academic exercise as it may seem. In fact, it is Afrocentricity's contention that unless African scholars are willing to reexamine the process of their own intellectual conversion, which takes place under the guise of "formal education," they will continue to be the easy prey of European intellectual hegemony. What is suggested, instead, is that African intellectuals must consciously and systematically relocate themselves in their own cultural and historical matrix, from which they must draw the criteria by which they evaluate the African experience. Their work must be informed by "centrism," that is, "the groundedness of observation and behavior in one's own historical experiences" (Asante, 1990, p. 12). Africology is the discipline to which those who study African people and phenomena from an Afrocentric perspective belong.

Thus it can be said that Afrocentricity emerged as a new paradigm to challenge the Eurocentric paradigm responsible for the intellectual disenfranchisement of African people and for rendering them invisible, even to themselves in many cases. In that respect, Afrocentricity presents itself as the African-American studies metaparadigm. As such, it includes three major aspects: cognitive, structural, and functional. The cognitive aspect involves the metaphysical foundations (such as the organizing principle and set of presuppositions outlined above), a methodology, methods, concepts, and theories. The structural aspect refers to the existence of an Afrocentric intellectual community, such as is found at Temple University. Finally, the functional aspect of the Afrocentric paradigm refers to its ability to activate African people's consciousness and to bring them closer to freedom, the ultimate goal of Afrocentricity. Hence Asante concludes that what can be called the discipline of African-American studies itself is intimately linked to the development of Afrocentricity and the establishment in the late 1980s of the Temple doctoral program, the first Ph.D. program in African-American studies in the United States. The Temple Ph.D. program in Africology was immediately successful, as hundreds of national and international applicants sought admission in order to be a part of the Afrocentric epistemological watershed. Although the program has suffered serious setbacks since its inception, there can be little doubt about its influence on African-American studies.
Over four hundred dissertations employing the Afrocentric paradigm have been defended, at Temple and at other institutions. Indeed, the Temple Ph.D. program opened the path for the creation of other African-American studies Ph.D. programs in the United States in subsequent years.

The Dynamics of Ambiguity
Open systems, driven far from (thermodynamic) equilibrium by intense fluxes of resources—such as matter, energy, and information—exceeding certain critical thresholds, undergo dynamic instabilities resulting in the emergence of spatial, temporal, or functional order. These instabilities exhibit a critical region where the transformation has not yet occurred and yet, at the same time, has already occurred. This region hosts ambiguity, an ambiguity that can be captured at the critical state marking the onset of convective motions in an initially still fluid heated from below (think, for example, of the critical state in the formation of the Giant's Causeway, the hexagonal volcanic rocks of Northern Ireland), at the start of a chromatic chemical clock during the Belousov-Zhabotinsky autocatalytic reactions, or at the emergence of a synchronized, ordered applause from stochastic clapping when the audience in an auditorium, driven by enthusiasm, demands an encore from the soloist. Dynamic instabilities occur under special critical conditions in nature and in society. They also occur during perception, not seldom but continuously and systematically. Their outcome, at the critical state of the perceptive process, is the emergence of visual thinking.

[Figure: Roman copy of Discobolus by Myron, created 2nd century B.C.E. The concept of arrested motion, as seen in Myron's sculpture of a discus thrower, illustrates the substitution of opposite qualities that can sometimes characterize ambiguity. © DAGLI ORTI/CORBIS]

Vivid examples of ambiguity in the mind can be experienced while looking at an ambiguous structure such as Fragment of Psychoplastic Structure (1963, collection of the author). This figure may be conceived as a visual metaphor for a diatomic hydrogen molecule formed by two identical atoms. It helps to visualize both the two lower-energy modes of being (the so-called stationary states) of this molecule and its resonant behavior during a spectroscopic observation of it. At first this figure, by construction, could be envisaged superficially as a two-dimensional structure exhibiting a center of symmetry. Keep looking at it as passively as possible. Its central region around the center of symmetry could be described in two ways: (1) as belonging 50 percent to the modulus at left and 50 percent to the modulus at right and, paradoxically, (2) as belonging neither to the modulus at left nor to the modulus at right. These two descriptions, though quite acceptable if considered separately, are incompatible if attributed to the same reality simultaneously, as they should be in this case. Indeed, we react instinctively to the absurdity of the situation and hasten to remove the ambiguity built into the figure by letting the two-dimensional figure invade three-dimensional space and assigning its central region to the right- or to the left-hand cubic modulus. Thereafter, visual thinking cannot get stuck in either of these positions: soon an endless sequence of approximately periodic perceptive alternations of right/left/right prospects sets in. As anticipated, the process of perception, leading to the dynamics of visual thinking, turns out to resemble closely the process of measurement of a homonuclear diatomic molecule according to quantum mechanics. Both processes share ambiguity.
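The two-state formalism behind this analogy can be sketched in a few lines (a minimal illustration assuming the textbook treatment of a symmetric double well; the kets |L⟩ and |R⟩, denoting the electron localized on the left- or right-hand nucleus, are introduced here only for illustration and are not the author's notation):

% Stationary states as the symmetric/antisymmetric superpositions of the
% localized states (standard two-state approximation, assumed here):
\[
  |\psi_{\pm}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|L\rangle \pm |R\rangle\bigr),
  \qquad E_{+} < E_{-}.
\]
% A state prepared "at left" is not stationary; it oscillates between the
% two localized prospects, mirroring the right/left perceptive alternation:
\[
  |\psi(t)\rangle = \tfrac{1}{\sqrt{2}}
    \bigl(e^{-iE_{+}t/\hbar}\,|\psi_{+}\rangle + e^{-iE_{-}t/\hbar}\,|\psi_{-}\rangle\bigr),
  \qquad
  P_{R}(t) = \sin^{2}\!\Bigl(\frac{(E_{-}-E_{+})\,t}{2\hbar}\Bigr).
\]

On this reading, the endless right/left alternation of the figure plays the role of the resonant oscillation, and the two cubic readings play the role of the localized states that a measurement picks out.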

The Philosophical Sources of Agnosticism
The term agnosticism, as it is used in common parlance, normally refers to a neutral or undecided position on the question of the existence of God. It is shorthand for a rejection of religious faith on the one hand and of outright atheism on the other. The philosophical sources and Victorian expositions of agnosticism, however, reveal it to signify a much broader set of arguments about the limits of human knowledge, whether religious or scientific. Bernard Lightman's definitive study, The Origins of Agnosticism (1987), places particular emphasis on the concept's Kantian origins. It is true that Kantian views about the limits of speculative reason, the relativity of knowledge, and the active role of the categories of the mind in constituting that knowledge formed an important part of agnosticism. Lightman argues convincingly for the influence of two writers in particular—William Hamilton (1788-1856) and Henry Longueville Mansel (1820-1871)—on later Victorian agnostics. Hamilton was a Scottish metaphysician who, as well as seeing himself as a defender of the Scottish "common sense" philosophy of Thomas Reid (1710-1796) and Dugald Stewart (1753-1828), was probably the most important expositor of Kantian philosophy in Britain in the first half of the nineteenth century. Mansel drew heavily on Hamilton's particular version of Kantianism in his controversial 1858 Bampton Lectures, entitled The Limits of Religious Thought. In these lectures, Mansel argued that speculative reason on its own led to all sorts of contradictions if allowed free rein in the area of theology. His conclusion was that only relative knowledge was possible and that the absolute (or the unconditioned, to use Hamilton's term) was not knowable through the faculties of sense and reason. Mansel therefore held that in the realm of theology, final authority must rest with revelation rather than reason. While Mansel believed that he had used Kant's philosophy constructively—to demonstrate the necessity of revelation and the authority of the Bible—critics from all sides felt that his arguments constituted, in effect, a complete capitulation in the face of rationalism and modern science and a retreat into an extreme form of fideism.

The idea that Kantian philosophy was at the heart of agnosticism needs to be qualified in a couple of ways (as Lightman himself acknowledges). First, Hamilton and Mansel were far from being simply followers of Kant. They tried to make use of his ideas for their own polemical purposes and certainly did not agree with or reproduce his entire system. The attempt to use philosophy to undermine reason in the realm of theology and establish the necessity and authority of revelation is certainly not "Kantian" in the sense of being a teaching of Kant. Second, a recognition of the influence of Kant on Victorian agnostics should not obscure the very important contributions of David Hume (1711-1776), to whom Kant himself famously acknowledged an important debt, and of other philosophers in the Scottish tradition. These included Reid and Stewart, in whose footsteps Hamilton was following, as well as Hamilton's principal philosophical antagonist, the empiricist John Stuart Mill (1806-1873).
The agnostic philosophy of Thomas Huxley, for instance, was based on a teaching central to the Scottish school, namely that "mind" and "matter" were merely shorthand terms for unknown realities that underlie the world of experience (which is the only domain in which we can have knowledge).

Ambiguity as a Permanent Cultural Value
In conclusion, complex concepts of quantum physics and the structure of matter are intimately connected with optical illusions, paradoxes, and ambiguity, features usually attributed to the world of art rather than to science. Both art and science are produced, emotionally and rationally, by our thinking. And our thinking proceeds chaotically, on the jagged watershed of a permanent cultural value: ambiguity.

In the 1920s occasional voices were raised against the project of a homogeneous America, and by the 1930s an ideology of "cultural pluralism" gained currency. With the rise of fascism in Europe, concerns over ethnic, racial, and religious tensions in the United States, the ongoing social and political incorporation of second-generation Americans, the scholarly discovery of persistent ethnicity in the cities, and anthropological refutations of racialist doctrines, ideas of America as encompassing a potentially harmonious diversity consistent with assimilation took hold, and they were reinforced during the mass mobilization for World War II. Cultural diversity would be tolerated within the context of shared national ideas and sentiments that ensured civic harmony and cooperation. Cultural pluralism in this form represented a powerful reaffirmation of American ideology as a basis of national identity. Pluralism in the 1950s was "predicated on consensus around the American value system despite seeming to place a premium verbally on diversity" (Gleason, p. 62). Subsequently, American universalistic, egalitarian, and individualistic civic ideals appeared to triumph in the passage of the Civil Rights Act of 1964 and the Immigration Reform Act and Voting Rights Act of 1965. The triumph, in the view of some, was short-lived.

Americanization after 1965. The emergence and legitimacy of Black Power and other ethnic nationalisms in the mid-1960s, anti-Vietnam War critiques and mass protests, and the adoption of policies encouraging ethnic identification, recognition, and rights were seen by some to have replaced civic nationalism with a strong version of cultural pluralism, later termed multiculturalism. In the process, assimilation joined the already discredited term Americanization as a term of opprobrium. However, reactions against multiculturalism have occasioned calls for the revival of a civic ideology of American identity, and some have attempted to revive a modern ideal of Americanization. Significantly, academic and journalistic critics of multiculturalism rarely claim to seek a return to the demands for homogeneity characteristic of the Americanization movement period, nor do they urge an end to ethnicity. John Higham advocates "pluralistic integration," in which individual rights and needs for group solidarity are balanced, as are universalistic principles and particularistic needs. David Hollinger propounds a model of "postethnic cosmopolitanism," which prefers voluntary to prescribed affiliations, appreciates multiple identities and communities of broad scope, and accepts the formation of new groups as part of the normal life of a democratic society. Peter Salins commends an "Assimilation, American Style" that requires citizens to accept English as the national language, take pride in American identity, believe in America's liberal and democratic egalitarian principles, and live by a Protestant ethic of self-reliance, hard work, and moral rectitude, but that does not demand cultural homogeneity. Even John Miller, who protests "The Unmaking of Americans" and the undermining of an earlier assimilation ethic by multiculturalism, argues not that racial and ethnic identities should be suppressed, but only that their expression remain confined to the private sphere. In its 1997 recommendations, the United States Commission on Immigration Reform recommended "taking back" the word Americanization, since it is "our word" that was "stolen" by racists and xenophobes in the 1920s.

The Commission defined Americanization in ways that are consistent with the ideal of civic nationality. "Americanization," the Commission wrote, "is the process of integration by which immigrants become part of our communities and by which our communities and the nation learn from and adapt to their presence," and is "the cultivation of a shared commitment to the American values of liberty, democracy and equal opportunity" (p. 26). "The United States," the Commission continued,

is a nation founded on the proposition that each individual is born with certain rights and that the purpose of government is to secure these rights. The United States admits immigrants as individuals. . . . As long as the United States continues to emphasize the rights of individuals over those of groups, we need not fear that the diversity brought by immigrants will lead to ethnic division or disunity. (pp. 28-29)

Whether or not subsequent government action is as attentive as the Commission tried to be to the "cosmopolitan" elements in defining American identity is debatable, but government policies certainly evidence an ongoing commitment to Americanization. In 2001 Congress replaced the Bilingual Education Act with the English Language Acquisition Act, which included replacing the United States Department of Education's Office of Bilingual Education and Minority Language Affairs with an Office of English Language Acquisition. In 2003 Congress established the Office of Citizenship in the United States Department of Homeland Security. The Office of Citizenship is meant to work to revive and emphasize "the common civic identity and shared values that are essential to citizenship," according to a government fact sheet. And, despite apparent commitments to multiculturalism in the public schools, actual formal and informal practices, particularly those emphasizing the rapid acquisition of English, suggest that schools continue to regard "Americanization" as a priority, even if they do not use that term. What remains absent from the schools is the civics-education component of Americanization that predominated during the 1910s.

AFROCENTRICITY. Afrocentricity is a theory that emerged in the early 1980s in the United States within the academic context of African-American studies. Afrocentricity was articulated by Molefi Kete Asante, a professor of African-American studies at Temple University and creator of the first Ph.D. program in African-American studies in the nation, in three major essays published between 1980 and 1990. Like most theories, Afrocentricity has come to be associated with different thrusts, some of which may even be contradictory or incompatible with the original definition of Afrocentricity. However, at its core, Afrocentricity is a theory concerned with African epistemological relevance, also referred to as centeredness or location. The ultimate goal of Afrocentricity is the liberation of African people from the grips of Eurocentrism. The primary and indispensable mechanism to achieve this goal is the fostering of African intellectual agency.

AFROPESSIMISM. Afropessimism refers to the perception of sub-Saharan Africa as a region too riddled with problems for good governance and economic development. The term gained currency in the 1980s, when many Africanists in Western creditor countries believed that there was no hope for consolidating democracy and achieving sustainable economic development in the region. The earliest use in print of the word was in a 1988 article from the Xinhua News Agency in which Michel Aurillac, France's minister of cooperation, criticized the prevailing pessimism in the West about Africa's economic development and cautioned against what he referred to as an "Afro-pessimism" on the part of some creditors.

ALGEBRAS. The word algebra refers to a theory, usually mathematical, that is dominated by the use of words (often abbreviated), signs, and symbols to represent the objects under study (such as numbers), the means of their combination (such as addition), and the relationships between them (such as inequalities or equations). An algebra cannot be characterized solely as the determination of unknowns, for then most mathematics would be algebra. For a long time the only known algebra, which was and is widely taught at school, represented numbers and/or geometrical magnitudes and was principally concerned with solving polynomial equations; this might be called "common algebra." But other algebras were developed, especially during the nineteenth century. The discussion below uses a distinction between three modes of algebraic mathematics made in 1837 by the great nineteenth-century Irish algebraist W. R. Hamilton (1805-1865): (1) in the "practical" mode, an algebra provides only a useful set of abbreviations or signs for quantities and operations; (2) in the "theoretical" mode, the algebra furnishes the epistemological basis for the theory involved, which may belong to another branch of mathematics (for example, mechanics); (3) in the "philological" mode, the algebra furnishes in some essential way the formal language of the theory. Lack of space prevents much discussion of the motivations and applications of algebras; the most important were geometries, the differential and integral calculus, and algebraic number theory.

Not Distant Origins?
Several branches of mathematics must have primeval, unknown origins: for example, arithmetic, geometry, trigonometry, and mechanics. But algebra is not one of them. While Mesopotamian and other ancient cultures show evidence of methods of determining numerical quantities, the means they required involved only arithmetical calculations; no symbolism is evident, or needed. Concerning the Greeks, the Elements of Euclid (fourth century B.C.E.), a discourse on plane and solid geometry with some arithmetic, was often regarded as "geometric algebra," that is, as theories thought out in algebraic terms. While the Elements can easily be so rendered, historians have discredited this reading. For one reason among many, in algebra one takes the square on length a to be a times a, but Euclid worked with geometrical magnitudes such as lines and never multiplied them together. The only extant Greek case of algebraization is the number theory of Diophantus of Alexandria (fl. c. 250 C.E.), who did use symbols for unknowns and for the means of their combination; however, others did not take up his system. A similar judgment applies to ancient Chinese ways of solving systems of linear equations: while their brilliant collection of rules can be rendered in terms of the modern manipulation of matrices, they did not create matrix theory.
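That last claim can be made concrete with a modern rendering (a sketch only: the use of Python, the function name, and the explicit determinant are modern conveniences, not features of the ancient rules, which were stated as step-by-step manipulations of arrays of counting rods):

# A modern rendering of elimination for two linear equations in two
# unknowns: cross-multiply and subtract to remove one unknown, then
# substitute back. All names here are illustrative, not historical.
from fractions import Fraction

def eliminate(a, b, c, d, p, q):
    """Solve  a*x + b*y = p  and  c*x + d*y = q  (assumes a unique solution)."""
    # Scale the second equation by a and the first by c, then subtract:
    #   (a*d - c*b) * y = a*q - c*p
    det = Fraction(a * d - c * b)
    y = Fraction(a * q - c * p) / det
    # Back-substitute y into the first equation:
    x = (Fraction(p) - b * y) / Fraction(a)
    return x, y

# Example: 3x + 2y = 12 and x + y = 5
print(eliminate(3, 2, 1, 1, 12, 5))  # -> (Fraction(2, 1), Fraction(3, 1)), i.e., x = 2, y = 3

Rendered this way, the rules coincide with Gaussian elimination on a small matrix, which is precisely the sense in which they "can be rendered in terms of the modern manipulation of matrices" without their authors having possessed matrix theory.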

AMERICA. America is one of the greatest political-philosophical symbols in world history. It is equal in importance to Athens representing philosophy, Jerusalem representing biblical religion, Rome representing both its pagan and Catholic manifestations, and Mecca representing the home of Islam. But what is meant by America? When people refer to it, are they signifying the precise measurements of the landmass that incorporates the territory from Canada's Ellesmere Island above the magnetic pole in the north to Tierra del Fuego off the tip of Argentina in the south? Do they want to call attention to the area that in the year 2000 was home to forty-five countries and territories with 900 million people, where dozens of languages are spoken, and where can be found people of almost every ethnic origin, religion, and social and economic class? It is unlikely that they are referring to these basic facts. Facts and figures do not begin to touch what America represents symbolically. Throughout its history, America has stood for two different, almost opposite, things. First, it stands for natural man, the Indians, who are said to represent the world's beginning. Second, it stands for the United States, the great political experiment based on natural rights, which has evoked inspiration and fear and envy. It inspires such strong feelings because the United States is often perceived as the world's future. America thus represents both the world's origins and its endpoint. This essay attempts to shed light on the "idea" of America by tracing its genealogy from America's discovery by Western man to the twenty-first century.

AMERICANIZATION, U.S. Americanization refers to processes of "becoming American," and to organized efforts to encourage the transformation of immigrants into "Americans." The term was in informal use in the United States in the mid-nineteenth century, but it is most prominently associated with the movement of that name during the 1910s and early 1920s. The term is often used interchangeably with assimilation. The "problem" of Americanization arises because American national identity must be constructed in the absence of primordial ethnic mythology, and in the face of exceptional diversity. There is general recognition that the United States is a "civic nation," rather than an "ethnic nation," in which devotion to "founding principles" is the source of national identity and community. The creedal nature of American identity carries the implication that anyone may "become American" by committing himself or herself to the nation's founding principles, and to their expression in distinctively American symbols and ways of living. However, the propositional nature of American identity carries with it the question of who is capable of the necessary understanding of, and commitment to, American principles, and to the ways of living that they are taken to imply. That seed of doubt has led Americans to scrutinize cultural differences, ethnic consociation, and race as potential indicators of the lack of qualification for trusted membership in the polity, and to insist on outward demonstrations of Americanization by those considered for membership.
American National Identity and Ideologies of Americanization The definition of American identity in ideological terms was elaborated in the early postindependence period. While the extent to which a new American people would emerge from the fusion of diverse strands of Europeans, as Michel-Guillaume-Jean de Crèvecoeur's (1735-1813) famous "Letters from an American Farmer" rhapsodized, was questionable, what was firmly established was the association of American identity with individual "transformation."

ANALYTICAL PHILOSOPHY. It was only in the 1960s that the phrase "analytical philosophy" came into frequent use as a way of describing the kind of philosophy characteristic of much English-language philosophy of the twentieth century. But occasional references to "analytical" (or "analytic") philosophy as a new kind of philosophy can be found much earlier, where it is primarily used to introduce a contrast with "speculative philosophy." The thought here is that whereas traditional philosophers have attempted by means of speculative arguments to provide knowledge of a kind that is not otherwise possible, "analytic" philosophers aim to use methods of philosophical analysis to deepen the understanding of things that are already known—for example, concerning the past or concerning mathematics. In doing so analytic philosophers seek to clarify the significance of essentially uncontentious historical or mathematical truths and to explain the possibility of our knowledge of them. This program does not require that analytic philosophers deny the possibility of speculative philosophy; but many did so, most famously those associated with the Vienna Circle such as Rudolf Carnap (1891-1970), who held that "all statements whatever that assert something are of an empirical nature and belong to factual science" and went on to claim that, for philosophy, "What remains is not statements, nor a theory, nor a system, but only a method: the method of logical analysis" (1932; 1959, p. 77). Methods of philosophical analysis are in fact as old as philosophy, as in Socrates' dialectic. The method was especially prominent in the theory of ideas characteristic of seventeenth- and eighteenth-century philosophy, which involved the analysis of complex ideas into simple ones. One of Immanuel Kant's (1724-1804) insights was to recognize the priority of complete judgments over ideas, or concepts, and this led him to hold that analytic methods of inquiry were subordinate to the elucidation of synthetic unities, such as the unity of consciousness. Kant's successors in the tradition of German idealism took this subordination much further as they sought to articulate the internal relations that hold together ever more encompassing "organic wholes" such as the state and the universe. For them, analysis was only ever a preliminary stage of inquiry, a kind of falsification to be transcended once a relevant organic whole and its relationships had been identified.
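A classic illustration of what such "logical analysis" looked like in practice (a standard textbook example rather than one drawn from this entry) is Bertrand Russell's 1905 analysis of definite descriptions, on which the apparently simple subject-predicate sentence "The present king of France is bald" is rewritten so that the puzzling phrase "the present king of France" disappears. Writing K for "is a present king of France" and B for "is bald":

\[
\exists x \,\bigl( K(x) \wedge \forall y\,(K(y) \rightarrow y = x) \wedge B(x) \bigr)
\]

That is: something is a present king of France, nothing else is, and it is bald. The sentence comes out meaningful but simply false, and no mysterious nonexistent referent needs to be posited.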

ANIMISM. Animism has had a long and important history in anthropology and outside it, as an intellectual concept with important implications not only for the study of religion, but also for the political struggles of indigenous peoples around the world. The anthropological study of animism has been a two-edged sword for indigenous people. It has brought their religious concepts, and thus their rich intellectual and spiritual lives, to the attention of the world, demonstrating the intrinsic value of their cultures. But to the extent that the apparent contrast between monotheistic and animistic religions has been exaggerated and used to create an artificial hierarchy of religious thought, it has also been used against them, to denigrate their beliefs and their intellectual capacities, and thus to deny them full equality with their colonizers.

ANTHROPOLOGY. As an academic discipline, anthropology is somewhat less than two centuries old, but speculations, if not rigorous scientific theories, about where we human beings came from and how to account for the physical and cultural differences that distinguish our communities and nations from one another probably began during prehistory. In the United States (but not in most other academic settings, for example, in Europe or Asia), the discipline is conventionally divided into four main subfields: biological (or physical) anthropology, archaeology, linguistics, and sociocultural anthropology. The history and current state of each subfield will be discussed in this entry, as well as how they have influenced one another during the last two-hundred-odd years. Although they will be described separately, the four subfields form the logos of anthropos, the broad science that studies the human species. The concept that unites these four subfields is culture. The earliest systematic formulation of the anthropological concept of culture was articulated by Sir Edward Burnett Tylor (1832-1917) in the first sentence of his pioneering book, Primitive Culture (1871): "Culture, or Civilization, taken in its wide ethnographic sense, is that complex whole which includes knowledge, belief, art, morals, law, custom, and any other capabilities and habits acquired by man as a member of society." This entry will use an updated version of Tylor's definition put forth by Daniel G. Bates and Elliot M. Fratkin: "Culture, broadly defined, is a system of shared beliefs, values, customs, and material objects that members of a society use to cope with their world and with one another, and that are transmitted from generation to generation through learning" (1999, p. 5). The work of biological anthropologists seeks—among other things—to discover how, when, and why our remote ancestors evolved the physiological capacity for culture; archaeologists attempt to trace the evolution of culture and seek to reconstruct the nature of prehistoric (as well as historic and contemporary) cultures from the material objects they left behind; linguists describe the principal symbolic system—language—through which cultural learning occurs; and sociocultural anthropologists are concerned with the nature of culture per se and the myriad factors that shaped (and continue to shape) its contemporary manifestations.

African Rebirth Still, the conditions that merit pessimism for the future of Africa are not manufactured by Afropessimists; such conditions are empirically verifiable. Since the end of the Cold War, important African leaders such as Presidents Thabo Mbeki of South Africa, Olusegun Obasanjo of Nigeria, and Maître Abdoulaye Wade of Senegal have come to recognize this and have resolved to do something about it. As a result, there has been an honest effort on the continent to address the important issues of good political, economic, and corporate governance and the professionalization of the army in order to diminish chances for destabilizing military coups. These efforts led the Organization of African Unity (OAU) to found a new institution, the New Partnership for Africa's Development (NEPAD), charged with the responsibility to provide a vision and strategic framework for Africa's renewal. NEPAD attempts to provide distinctively African interventions on the issues of relative underdevelopment and marginalization. NEPAD's formation and other historic African transformative actions have been referred to as an African renaissance.

African-American Conservatism Black conservatism has a long history and includes a large number of prominent African-Americans. Nevertheless, it has never become a salient ideology for a significant proportion of African-Americans. Jupiter Hammon (1711-c. 1800) during the slave period, William Hooper Council (1848-1909) during the first nadir, George Schuyler (1895-1977) during the 1940s and 1950s, as well as A. G. Gaston (1892-1996) during the 1960s, represented conservative ideologies. However, none of these individuals shared the virulent antiblack positions of contemporary black conservatives. Black neoconservatives differ from previous generations of black conservatives in that they are often alienated from the black community and divorced from its institutional networks, especially political organizations. Black conservatism articulates better with its mainstream equivalent than do the other black ideologies. Conservative approaches are premised on white American cultural values, particularly notions of individualism, which advocate colorblind individual incorporation into the U.S. political economy, polity, and mainstream white civil society. According to psychologist William Cross, Jr., race and the history and continuing practice of racial oppression are either denied or viewed as not salient by black neoconservatives in constructing their identities. Twenty-first-century black neoconservatives such as Thelma Duggin, Clarence Thomas, Gwen Daye Richardson, and Ward Connerly share the assumptions, interpretations, and goals of the leading white neoconservatives. Black neoconservatives blame the dislocations endemic to poverty on welfare and government subsidies. They claim the Great Society programs, instituted in the 1960s by President Lyndon Johnson, created a "culture of dependency" among the black poor. According to this perspective, instead of providing a social safety net, the Great Society created dependent personalities and an antiachievement culture in the black community. Perhaps more fundamental than the adoption of European-American cultural values is their reduction of African-American culture to a composite of pathological behaviors. Conservatives, including autonomic conservatives, generally reject the African cultural survival thesis.

Agnosticism in the Twentieth Century The scientific and religious creed of agnosticism died with Leslie Stephen in 1904. However, the philosophical and theological questions on which it was based, especially about the relationship between the observable and the unobservable, persisted into the twentieth century (although not generally under the banner of agnosticism). The logical positivism of the earlier twentieth century, along with more recent antirealist philosophies of science (such as Bas van Fraassen's "constructive empiricism" as developed in his 1980 book, The Scientific Image), has contained some of the radically empiricist elements of agnosticism as endorsed by Huxley (and derived from Hume and Mill). These philosophers have insisted that all true knowledge must be grounded in experience and that since we cannot have direct experience of unobservable substances, entities, laws, or causes, we must treat them as, at best, useful fictions that serve as shorthand for empirical generalizations. Logical positivists dismissed all "metaphysical" discourse, which claimed to describe underlying realities, as meaningless. In this they agreed both with Comtean positivists and with agnostics. In the realm of religion and theology, the problems that were central to the agnostics—especially the difficulty of reconciling religion and morality with a scientific worldview—continued to occupy religious thinkers (see Dixon). Some, such as Thomas Huxley's grandson Julian Sorell Huxley (1887-1975), put forward "evolutionary humanism" as a scientific religion based on reason and morality but without revelation. Others took a similar approach while remaining within the Christian tradition. Don Cupitt, for instance, in books such as Taking Leave of God (1980) and The Sea of Faith (1984), adopted a "nonrealist" metaphysics and articulated a post-theological version of the Christian religion. For Cupitt, himself a minister in the Church of England, the claims of Christian theology should not be taken to refer to unseen supernatural realities, such as a personal God, but to be expressions of human values and aspirations. So scientists seeking to give expression to a religious impulse while retaining their intellectual integrity, along with theologians looking for an interpretation of the gospel that will resonate in a secular and scientific world, have continued the religious project that the Victorian agnostics had begun.

Americanization in the nineteenth century. Nineteenth-century Americans expected life in the United States to transform European newcomers into culturally compatible neighbors. While not directing specific "Americanization" efforts toward immigrants, American communities placed faith, in particular, in the common schools to be "culture factories" in which to inculcate principles of republican virtue, and to cultivate American habits and identities. A general pattern of acceptance of diversity and confidence in the workings of America's natural "melting pot" obtained until the 1890s. The 1890s represent a crucial turning point that intensified the salience of ethnicity as an element of national identity, gave rise to the "Americanization movement," and, ultimately, resulted in long-lasting restrictions on immigration. A massive influx of new immigrants, primarily from southern and eastern Europe, combined with the perception of the frontier having closed, accelerated industrialization, rural emigration, recurring economic distress, perceptions of urban disorder and disorganization, labor conflict, and radical political agitation diminished Americans' faith in the naturally absorptive powers of American life and in a laissez-faire approach to immigrant absorption. So, too, did the development of a distinctively racialist ideology that identified Anglo-Saxon descent with authentic American identity and placed the new immigrants into inferior classifications.
Americanization in the first quarter of the twentieth century. The resulting effort to "Americanize" immigrant newcomers was part of the Progressive movement's broader efforts to construct a modern and cohesive social order, and also part of a new purifying national effort to cultivate patriotism among all Americans. As World War I approached, the priorities of immigrant adjustment would yield to the priority of coercively assuring loyalty through insistence on naturalization, quick acquisition and sole use of English, and adherence to "American" cultural norms. Well before the official birth of the "Americanization movement" in 1915, educators began to grapple with what they determined were the needs of the increasing number of foreign-born adults and their children. Settlement houses and other agencies

Analytical and Continental Philosophy Throughout much of the twentieth century analytical philosophy was very different from the approach to philosophy characteristic of "continental" philosophers such as Edmund Husserl, Martin Heidegger, Jean-Paul Sartre, and Maurice Merleau-Ponty. One reason for this was simply the continental philosophers' ignorance of modern logic, which excluded them from any serious understanding of analytical philosophy. Conversely, analytical philosophers, by and large, remained uncomprehending of the phenomenological project of recovering the basic structures of intentionality. By the end of the twentieth century, however, with translations of all the main works involved into the relevant languages, a much greater degree of mutual comprehension had been achieved. As a result, while continental philosophers such as Jacques Derrida have sought to appropriate analytical techniques such as speech-act analysis, analytical philosophers have turned their attention to the theme of intentionality, though sometimes with conclusions far removed from those of continental philosophers. Thus the situation is now one of dialogue despite profound disagreements.

Black Feminism Like modern black radicalism, black feminism is of comparatively recent origin, although its roots reach back into the nineteenth century. The earliest foreshadowings of this ideology are found in the slave-era speeches and writings of Maria Stewart (1803-1879), Sojourner Truth (c. 1797-1883), and Frances Ellen Watkins (1825-1911). The ideas that would ultimately become the ideology of black feminism developed to a large extent during the first nadir through the speeches, writing, and practices of Ida Wells-Barnett, Mary Church Terrell (1863-1954), and Anna Julia Cooper (1859-1964). While black women activists, especially women associated with the U.S. Communist Party, such as Esther Cooper Jackson and Claudia Jones, engaged in practices and advocated philosophies that amounted to black feminism, the theory was not formally visible until the 1960s. According to Linda Burnham, contemporary black feminism, with its emphasis on the simultaneity of overlapping oppressions—race, class, gender, sexuality—is a product of the transition from the civil rights to the black power movement. She locates the origin of this iteration of black feminism in the Student Nonviolent Coordinating Committee's (SNCC) Black Women's Liberation Committee and later the Third World Women's Alliance, in both of which Frances Beal (b. 1940) played a leading role. This stream of black feminism has become a core component of contemporary black radicalism. It can be sharply contrasted with another stream of black feminism, womanism, as articulated by Clenora Hudson-Weems, which is more correctly viewed as a tributary of conservative black nationalism.

Conclusion In the early twenty-first century, those made uneasy by America's increasing ethnic and linguistic diversity called, through such efforts as attempting to legislate an "official" status for English, for a kind of revived Americanization movement. While a majority supported proposals to make English the "official" language, acceptance of diversity nevertheless appeared embedded in expressed attitudes, and public-opinion studies strongly supported the conclusion that Americans' ". . . preference for an inclusive nationalism coexists with the widespread acceptance of pluralism in cultural practices" (Citrin et al., p. 266). A "cosmopolitan liberal" view of American identity and polity appeared to predominate over either a multiculturalist or nativist view, and so we would expect coercive Americanization crusades to remain a thing of the past.

In the popular imagination, African-American political thought has been reduced to two ideological streams, black nationalism and integrationism. Harold Cruse, author of the influential but flawed The Crisis of the Negro Intellectual (1967), crystallized this binary framework into a Manichean perspective that characterized African-American history as primarily a conflict between proponents of these two ideologies. Cruse did not invent this conceptualization—August Meier had previously asserted it—but he made it the dominant interpretative schema in black political philosophy.

AMBIGUITY. By instinct humans yearn for reassurance and certainties and dream of an orderly universe where the reasoning process corresponds to external reality. This attitude is reflected in the assumption, authoritatively legitimized by Aristotle (384-322 B.C.E.), that no responsible statement can exhibit internal contradictions. In his Categories, Aristotle states that the essential character of a substance seems to be its ability to host opposites. At any instant, however, one can assign either a quality or its opposite to a substance. According to Aristotle, "Nobody can be simultaneously sick and healthy. Similarly, nothing is at the same time white and black. No object exists simultaneously hosting opposites." No alternatives exist besides these two; any third possibility is excluded: tertium non datur.
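In the modern notation of propositional logic (a rendering that, it should be stressed, postdates Aristotle by more than two millennia), the two principles invoked here can be written for an arbitrary proposition $p$ as

\[
\neg (p \wedge \neg p) \quad \text{(noncontradiction: no subject hosts a quality and its opposite at once)}
\]

\[
p \vee \neg p \quad \text{(excluded middle: tertium non datur)}
\]

The first rules out simultaneous opposites; the second rules out any third possibility between a quality and its opposite.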

MIDDLE EAST Between the early nineteenth century and the outbreak of World War I, much of the area between Morocco and what is now Turkey came under different forms of European colonial rule. Thus France began the conquest of Algeria in 1830, took over Tunisia in 1881, and (in partnership with Spain) took over Morocco in 1912. Britain occupied Egypt in 1882, formalizing the occupation by the declaration of a protectorate in 1914, and Italy began its conquest of Libya in 1911.

Spiritual anarchism and anarcho-syndicalism. Parallel to the terrorist acts committed at this time, the Christian pacifist Leo Tolstoy (1828-1910) was developing an antiauthoritarian current of thinking that, in its broadest sense, can be regarded as belonging to the anarchist tradition. Tolstoy promoted a form of religious anarchism that was based on the "law of supreme love" as defined by his personal (anti-doctrinal) reading of the Scriptures. Though he did not see himself as an anarchist, he nevertheless believed that in order for men and women to live in a morally coherent world it was necessary to destroy the state and its institutions. Because of his rejection of the use of force and violence, Tolstoy and his followers advocated civil disobedience, or nonviolent resistance, as a means of achieving the stateless and communally based society they envisioned. It was also around this time that anarchist doctrine experienced another significant metamorphosis. From the late 1890s until the 1930s, anarchist activity was increasingly centered in working-class cultural and economic organizations, and the tactics and strategy of the movement were grounded in the theory of revolutionary syndicalism. While not wholly abandoning the use of violence, the anarcho-syndicalists believed that, against the organized forces of big government and monopoly capitalism, the revolutionary élan of the workers could be most effectively channeled through trade union organizations. Using tactics such as the general strike, which was meant to paralyze the economy by linking shutdowns in different industries, the anarcho-syndicalists believed that it would be possible to create the general conditions for a complete collapse of capitalism and the state. Anarcho-syndicalism became an important force in the labor movements in parts of Latin America (Mexico, Argentina) and in European countries such as Italy, France, and Spain. Its greatest impact was felt in Spain. During the Second Republic (1931-1936) and continuing through the civil war period (1936-1939), anarcho-syndicalism developed into a powerful mass movement. At its peak the anarcho-syndicalist organizations known as the CNT-FAI (National Confederation of Workers and Federation of Iberian Anarchists) counted more than 1.5 million adherents. Their influence over the course of events during the civil war was most dramatically illustrated by the fact that they set up and ran thousands of industrial and agricultural collectives throughout the Republican zone. The triumph of Franco's Nationalist forces in 1939, followed by the outbreak of another world war that same year, sounded the death knell for anarcho-syndicalism not only in Spain but in other Western European countries as well. It deserves mention here that, by the time World War II began, anarchism's reach extended across the globe. Besides taking root in the Americas, the doctrine had penetrated parts of East Asia and even the Indian subcontinent. In both China and Japan, for example, Western anarchist ideas influenced leading social thinkers such as Mao Zedong and labor organizers who were seeking to establish socialism in those countries. However, the emergence of authoritarian and totalitarian regimes of both the right and left in the 1930s and late 1940s effectively quashed the libertarian tendencies that had been developing up to then.
It would take another forty years before anarchist ideas would be resurrected by tiny protest groups (mostly in Japan) that wanted to express their cultural and intellectual dissatisfaction with the status quo.
However, the emergence of authoritarian and totalitarian regimes of both the right and left in the 1930s and late 1940s effectively quashed the libertarian tendencies that had been developing up to then. It would take another forty years before anarchist ideas would be resurrected by tiny protest groups (mostly in Japan) that wanted to express their cultural and intellectual dissatisfaction with the status quo.

Spiritual anarchism and anarcho-syndicalism. Parallel to the terrorist acts committed at this time, the Christian pacifist Leo Tolstoy (1828-1910) was developing an antiauthoritarian current of thinking that, in its broadest sense, can be regarded as belonging to the anarchist tradition. Tolstoy promoted a form of religious anarchism that was based on the "law of supreme love" as defined by his personal (anti-doctrinal) reading of the Scriptures. Though he did not see himself as an anarchist, he nevertheless believed that in order for men and women to live in a morally coherent world it was necessary to destroy the state and its institutions. Because of his rejection of the use of force and violence, Tolstoy and his followers advocated civil disobedience, or nonviolent resistance, as a means of achieving the stateless and communally based society they envisioned. It was also around this time that anarchist doctrine experienced another significant metamorphosis. From the late 1890s until the 1930s, anarchist activity was increasingly centered in working-class cultural and economic organizations, and the tactics and strategy of the movement were grounded in the theory of revolutionary syndicalism. While not wholly abandoning the use of violence, the anarcho-syndicalists believed that, against the organized forces of big government and monopoly capitalism, the revolutionary élan of the workers could be most effectively channeled through trade union organizations. Using tactics such as the general strike, which was meant to paralyze the economy by linking shutdowns in different industries, the anarcho-syndicalists believed that it would be possible to create the general conditions for a complete collapse of capitalism and the state. Anarcho-syndicalism became an important force in the labor movements of parts of Latin America (Mexico, Argentina) and of European countries such as Italy, France, and Spain. Its greatest impact was felt in Spain. During the Second Republic (1931-1936) and continuing through the civil war period (1936-1939), anarcho-syndicalism developed into a powerful mass movement. At its peak the allied anarcho-syndicalist organizations known as the CNT-FAI (National Confederation of Workers and Federation of Iberian Anarchists) counted more than 1.5 million adherents. Their influence over the course of events during the civil war was most dramatically illustrated by the fact that they set up and ran thousands of industrial and agricultural collectives throughout the Republican zone. The triumph of Franco's Nationalist forces in 1939, followed by the outbreak of another global war that same year, sounded the death knell for anarcho-syndicalism not only in Spain but in other Western European countries as well. It deserves mention here that, by the time World War II began, anarchism's reach extended across the globe. Besides taking root in the Americas, the doctrine had penetrated parts of East Asia and even the Indian subcontinent. In both China and Japan, for example, Western anarchist ideas influenced leading social thinkers such as Mao Zedong and labor organizers who were seeking to establish socialism in those countries.
However, the emergence of authoritarian and totalitarian regimes of both the right and left in the 1930s and late 1940s effectively quashed the libertarian tendencies that had been developing up to then. It would take another forty years before anarchist ideas would be resurrected by tiny protest groups (mostly in Japan) that wanted to express their cultural and intellectual dissatisfaction with the status quo.

Summary The five major ideological expressions of social thought by which African-Americans have sought to reconstruct their racial/ethnic identity, to contemplate the structures, ideologies, and functions of racial oppression, and to envision a future free from that oppression constitute an extraordinarily complex body of evolving political theory. All are shaped by dynamic interaction with their sociohistorical experiences and the dominant and emerging ideologies and discourses in U.S. society and the world, especially pan-African ideas and the black intellectual tradition.


The Arabic Innovations Common algebra is a theory of manipulating symbols representing constant and unknown numbers and geometrical magnitudes, and especially of expressing polynomial equations and finding roots by an algorithm that produces a formula. Its founders were the Arabs (that is, mathematicians usually writing in Arabic) from the ninth century, then the main culture of the world outside the Far East. Some of the inspiration came from interpreting various Greek or Indian authors, including Euclid. The pioneer was Al-Khwarizmi (fl. c. 800-847) with his work Al-jabr wa'l-muqabala, known in English as the Algebra, and over the next five centuries followers elaborated his theory. The problems often came from elsewhere, such as commerce or geometry; solutions usually involved the roots of polynomial equations. Algebra was seen as an extension of arithmetic, working with unknowns in the same way as arithmetic works with knowns. The Arabic manner of expression was verbal: the word shay denoted the unknown, mal its square, ka'b its cube, mal mal the fourth power, and so on. The Arabs also adopted and adapted the Indian place-value system of numerals, including 0 for zero, that is called Hindu-Arabic. They were suspicious of negative numbers, regarding them as not being pukka quantities.
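To make the verbal recipe concrete, the following sketch restates, in modern symbolic notation (which, as the passage notes, the Arabs themselves did not use; their arguments were expressed in words and justified geometrically), the quadratic case often quoted from Al-Khwarizmi's Algebra: "a square and ten roots equal thirty-nine." The LaTeX rendering below is purely illustrative.

```latex
\documentclass{article}
\begin{document}
% A worked quadratic in the style of al-jabr, restated symbolically.
% "A square (mal) and ten roots (shay) equal thirty-nine":
\[ x^{2} + 10x = 39 \]
% Recipe: halve the number of roots (10/2 = 5), square the half (25),
% and add it to both sides, completing the square:
\[ x^{2} + 10x + 25 = 39 + 25 = 64 \]
\[ (x + 5)^{2} = 64 \]
% Take the (positive) square root and subtract the half:
\[ x + 5 = 8, \qquad x = 3. \]
% The second root, x = -13, was not admitted: as the passage notes,
% negative numbers were not regarded as genuine quantities.
\end{document}
```

The two operations that give the book its title, al-jabr (restoring a subtracted term to the other side of an equation) and al-muqabala (canceling like terms from both sides), were the preliminary steps used to reduce any problem to one of a small set of standard solvable forms such as this one.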


The Indians From 1492 until the American Revolution, and in some sense continuing into the twenty-first century, America evoked the image of Indians. Archaeologists believe that the American continent was first inhabited by human beings who walked from Siberia to Alaska across the Bering Strait on a frozen land bridge about 30,000 to 40,000 years ago. However, what the Indians represent in the global imagination is a fairly static image informed by media portrayals that starkly depict the Indians either as barbaric savages or as noble stewards of the land living in harmony with nature. These images have a long genealogy.


The United States When people speak about America, they usually are referring not to the Indians, nor to the hemisphere as a whole, but to the United States of America (USA), the world's most powerful nation since World War II. The global obsession with American power revolves around four axes: cultural, economic, political, and military. American popular culture (e.g., blue jeans, rock and roll and jazz music, cinema and television programming, McDonald's restaurants, and Disneyland) is both highly prized for its energy, ease, accessibility, and speed and condemned as an unwanted cultural intrusion that threatens to swamp indigenous ways. Economically, America has for centuries represented the possibility of riches beyond belief ("streets paved with gold"), and as such has been the goal of tens of millions of immigrants. But since the United States became the world's dominant economic power, its material wealth has become both envied and resented. Politically, America has been lauded as a uniquely favorable place (what the American colonist John Winthrop called a "city on a hill") for the promise of freedom that it offers, and it has been condemned, as in the eyes of the Iranian revolutionary the Ayatollah Khomeini, as "the great Satan" for what are perceived to be its heathen and materialistic ways. Militarily, the United States has since World War II been the strongest country on earth, and since the collapse of the Soviet Union, it is universally cited as the world's only superpower. This power is sometimes feared and envied by those without it. Moreover, people throughout the globe paradoxically call for the United States to use its power when they want it to do something and condemn the United States as arrogant when it uses that power for a cause of which they disapprove. These perceptions of the United States are neither new nor unmediated reactions to perceived facts. Each of these praises and complaints can be traced back almost to the founding of the United States itself. Thus, they cannot be explained merely as a reaction to a particular political administration or to the rise of American power. Deeper phenomena are at play.


The binary framework has been challenged on two fundamental premises. One group has sought to complicate the categories of black ideologies. Anthropologist Leith Mullings, historian Manning Marable, and political scientists Robert C. Smith and Michael C. Dawson, among others, have offered more comprehensive frameworks. Adding radicalism to nationalism and integrationism, Smith conceives of three major African-American ideologies: black nationalism, integrationism, and radicalism. Mullings and Marable discern three "strategic visions" in black political thought, which they term inclusion, black nationalism, and transformation. Interestingly, Mullings, Marable, and Smith would acknowledge conservatism as a distinct political perspective; yet, because they do not view it as politically salient before the 1990s, they have not conceptualized it as a major ideology among African-Americans. Dawson's framework, in contrast, includes black conservatism among the six "historically important" black ideologies he identifies: black nationalism, black liberalism (itself comprising three streams), black feminism, and black radicalism. Taking a very different approach, sociologist John Brown Childs eschews the conventional debates over ideology to identify two worldviews, which he argues constitute the "coherent systematic approach" that undergirds political ideologies. Seeking to uncover the "conceptual currents" beneath strategic conflicts among social justice activists, Childs identifies two irreconcilable worldviews, the vanguard and mutuality perspectives. According to Childs, vanguard approaches posit an elite that possesses knowledge of the "way," which its members bestow upon the ignorant and impose on the defiant. In contrast, mutuality approaches advocate praxis built on sociohistorical correspondence, communication, diversity, cooperation, (self-) transformation, and openness, and reject notions of a leading group. Despite the potency of Childs's insights, most scholars of black history and politics have continued to chart African-American social movements and individual activist intellectuals via their ideologies, rather than their worldview or organizing approach.


Black nationalist, or autonomic, strategies are the oldest ideological approaches developed by African-Americans. The slave revolts, especially those before the nineteenth century, that aimed to create maroon societies modeled after remembrances of African social organizational patterns perhaps best reflect the anteriority of black nationalism among African-American ideologies. Autonomic approaches are also the most varied and complicated of African-American ideologies, but they can be divided into two broad categories: protonationalism and separatism. Protonationalism refers to strategic visions that emphasize autonomy in the realm of civil society, the desire to reside in semiautonomous towns and regions, and the preference for preserving distinct cultural practices. Richard Allen's (1760-1831) creation of the African Methodist Episcopal Church in 1794, W. E. B. DuBois's call for blacks to build on their group strengths in the 1930s, and the 1960s-era campaigns for "community control" of African-American communities are examples of protonationalism. Separatism has a more delimited terrain, encompassing emigration and efforts to create an independent African-American nation-state within the United States, such as Martin R. Delany's and others' proposals to emigrate to Africa, Canada, or South America during the 1840s and 1850s; Marcus Garvey's 1920s plan to repatriate to Liberia; or the Republic of New Africa's 1960s desire to create an African-American nation-state in the U.S. South. Autonomic discourses are shaped by their sociohistorical context; they tend to surge and ebb in relationship to the economic and political position of blacks in U.S. society. As a rule, black nationalism swells during sustained economic downturns. Autonomic philosophies are also sensitive to the interplay of dominant and emerging ideologies. This is particularly true regarding questions of cultural difference. For instance, between 1850 and 1925, the era historian Wilson Moses terms the golden age of black nationalism, nationalists were ambivalent toward African culture and rejected Africanisms, or cultural carryovers. Like the social Darwinists of the day, they often viewed Africans and African-Americans as "underdeveloped" or even "backward," although they usually ascribed environmental or religious, rather than genetic, causes. Consequently, they preferred European-American high culture. To a large extent, black power, the protonationalism that developed during the "turbulent sixties" (1955-1975), was predicated on Africanization, the adoption of actual or imagined African cultural values and practices. Evidence from several years of the National Black Politics Study, from 1979 to 1992, suggests that proto-black nationalism remained the predominant perspective of the majority of African-American people in the late twentieth century. For instance, political scientists Darren W. Davis and Ronald Brown discovered that 84 percent of African-Americans believed blacks should buy from black-owned businesses; 83.3 percent believed blacks should be self-reliant; 73.8 percent wanted blacks to control their communities' economies; and 68.3 percent believed blacks should govern their communities. Another 70.7 percent thought black children should learn an African language, and 56.5 percent advocated participating only in all-black organizations.


ANARCHISM. The term anarchy comes from an ancient Greek word meaning "without a leader or ruler." However, proponents of anarchism have most often used the term to refer to a natural state of society in which people are not governed by submission to human-made laws or to any external authority. Anarchism is above all a moral doctrine concerned with maximizing the personal freedom of individuals in society. To achieve this end, leading anarchist social theorists have tended to offer critical analyses of (1) the state and its institutional framework; (2) economics; and (3) religion. Anarchist hostility to the state is reflected in the rejection of the view, popularized by contract theorists, that a government's sovereignty is legitimated by the consent of its subjects. Anarchists contend that no contractual arrangement among human beings justifies the establishment of a ruling body (government) that subordinates individuals to its authority. From their observations of the historical development of the state, anarchist thinkers such as Pierre-Joseph Proudhon (1809-1865) and Peter Kropotkin (1842-1921) concluded that all forms of government have been used as instruments for establishing monopolies that favor the propertied and privileged. Anarchists also argue that the all-encompassing authority of the state allows it to exercise undue influence over the lives of its citizens. They further maintain that the state, using laws and the organs of power at its disposal, can control not only citizens' public and private behavior but also their economic lives. As such, the state, in all its forms, is condemned as an unnecessary evil. From an economic standpoint, most anarchists have identified themselves as members of the anticapitalist socialist movement. In common with socialists, anarchists see capitalism as a system ruled by elites, one that exploits the working or productive members of society economically and represses them culturally and spiritually. Accordingly, anarchists argue that the emancipation of the worker will be achieved only by completely destroying the pillars of capitalism. Anarchists differ as to what form of economic arrangements should replace capitalism. Collectivists and mutualists insist that private ownership of the fruits of individuals' labor is desirable, while anarchist communists maintain that individual freedom can be achieved only in a society where all material goods and natural resources are placed under common ownership. Still another group of anarchists, known as individualists, have advocated a system of "labor for labor" exchange, which they believe could operate in accordance with natural market forces. Anticlericalism is another important dimension of anarchist thinking. Though most anarchists are materialists, they are not opposed to spirituality per se; indeed, anarcho-pacifists such as Leo Tolstoy (1828-1910) identified themselves as Christians. Rather, anarchists condemn organized religion, which they see as an agent of cultural repression. They have, for example, attacked the Catholic Church among other religious institutions on the grounds that it has historically served as a means of empowering church government and not of enriching the spiritual lives of its adherents.
Anarchists further contend that the church has consistently acted as an ally of secular governments and therefore forms part of the general system of state repression that operates against the common person. Because the heyday of anarchism as an ideological movement was during the nineteenth and early twentieth centuries, the focus here will be on the core beliefs of key anarchist theorists in this period. Thus other, less historically significant anarchist strands, such as pacifism and individualism, will be mentioned only in passing. The impact that classical anarchist theory has had on recent political and social movements will be summarized in the concluding section.


ANCESTOR WORSHIP. Ancestor worship is the reverent devotion expressed by descendants for their deceased forebears through a culturally prescribed set of rituals and observances. The prominence of ancestors as a focus of worship within a broader religious tradition is common in many parts of the world, including Asia, Africa, and Native America, but there are few unifying characteristics cross-culturally. Commonalities include the following: only those deceased who stand in the appropriate relationship to the living, and who have undergone the necessary rites de passage, are worshiped; those who are worshiped are usually recognized by name or title, often a special posthumous one; and services to the ancestors frequently include offerings and libations. That ancestor worship is related to the animistic belief in a spirit or soul surviving the body after death, as proposed by the early anthropologist Edward Burnett Tylor (1832-1917), is reasonable, since it is this spirit essence of the ancestor that is believed to continue its relationship with descendants. That ancestor worship represents the earliest stage of religious expression among humans, however, as Tylor's theory further suggested, is certainly debatable. Other controversies in the study of ancestor worship include whether practices in honor of the deceased constitute actual worship; the extent to which linear versus collateral relatives comprise the worshiping group; the ways in which the living are influenced by the dead; and the individual, family, kin group, or regional variability in practice that can be present in a single cultural tradition.
Ancestors in Africa and Asia In his work among the Tallensi of Ghana, Meyer Fortes emphasizes the significance of ancestor worship to patrilineage unification and lineage or segment differentiation. In particular, the relationship between the father and the oldest surviving son is emphasized, the latter having the primary responsibility for performing the appropriate rituals and services. In general, placement of an African ancestral shrine and the performance of its services can also relate to and influence descendants' genealogical position and seniority. In China, Daoist, Confucian, Buddhist, and folk concepts have contributed to the practice of ancestor worship, in which heads of patrilineages are emphasized but other patrilineal relatives are included. There are three prominent sites for ancestor worship: family shrines, lineage halls, and tombs or graveyards of relatives. Proper placement and orientation of the latter take geomancy (feng-shui) into account. Physical remains of the deceased are laid to rest in the tomb or graveyard, which serves as the site of public rituals; ancestral tablets represent the deceased in shrine and temple, in which their spirits are housed, and for which more private and personal observances are made. While the ancestors wield significant authority and influence in the lives of their living descendants, the latter care for and look after their ancestors—for example, by burning paper money at New Year's to contribute to their ancestors' bounty or prosperity. Japanese ancestors are also emphasized on the father's side, and their worship is primarily related to Buddhist beliefs and practices. The deceased receive a posthumous or "Buddhist" name, which is written on a tablet and kept in the family's butsudan, or Buddhist altar; Buddhist funerary services help purify the corpse from the polluting influences of death. Other services include "death day" memorial services for up to fifty years, New Year's and Bon (or Obon) celebrations, and household prayers. While tradition maintains a differentiation between stem and branch families, with a main ancestral altar in the stem house, more modern practice has individual families establishing their own butsudan with the death of a household member. Proper care for the ancestors and observance of appropriate services, offerings, and prayers are believed not only to help the ancestors be restful and at peace, but also to result in blessings and good fortune for the descendants.
Among the Inca In his early chronicle of Inca customs, Felipe Guaman Poma de Ayala pictures a mummy with feathered headdress and fine raiment carried on a litter as an illustration of November, or aya marcay quilla (Quechua for "the month of the dead"). He describes how during the holiday of the dead the deceased were removed from their crypts, adorned with clothing and feathers, given food—through burnt offerings—and drink, and carried dancing and walking through the streets and plazas, then laid to rest with various offerings. Such activities occurred primarily in the worship of royal mummies, as an extension of the concept of the divine nature of the Inca king. While Inca beliefs included the departure of the soul from the body at death, royal bodies were mummified, served burnt offerings and drinks, and cared for by official attendants. Royal ancestors participated in affairs of state—counseling living rulers and contributing to their decision making, and, either in the guise of their mummified remains or as idols making formal appearances and visitations, receiving obeisance from their living subjects. Such beliefs were common in the Andes; ancestral idols of subject peoples were held in Cuzco, the Inca capital, as a control mechanism. Andean and Inca ancestor worship extended beyond royalty and was probably common among all classes in the pre-Columbian era. Padre Bernabé Cobo attests that when the soul departed from the body, members of the deceased's ayllu (a corporate kin group) and family took and cared for the body, providing the veneration and care that was possible according to the family's means and status. The bodies were kept in relatives' houses, tombs, or shrines and were regularly paid tribute through sacrifice and prayer. This nonroyal worship was performed only by those descended in a direct line, and usually only by the children and possibly grandchildren of the deceased. Such worship was held to directly affect descendants' vitality and fortune, while its lack, or disrespect to the ancestors, could result in ill health or other maladies.
Ancestral Ambivalence Ancestor worship is most likely to be practiced in a society with strong lineages or other consanguineal corporate groups whose continuity, standing, and control of resources extend over generations, and one in which there are strong beliefs in an active spirit world. In such contexts the appropriately related and ritually defined deceased continue to be interactive lineage and family members, cared for and reverenced by the living and in turn contributing to the prosperity of their succeeding generations as sources of, or mediators with, divine power. In general, ancestors who are worshiped are perceived as guardian or authority figures who are difficult to please, and their degree of influence on the living usually decreases with increasing genealogical distance from descendants. The power of the ancestors is therefore ambivalent: as likely to punish as to reward, they offer security and comfort while also contributing to uncertainty in an equivocal cosmos.


African-American Radicalism Perhaps more than any other ideology, black radicalism is sensitive to the sociohistorical context. During the slavery period, the radical black perspective advocated the immediate destruction of slavery and the extension of full civil rights to black people. After slavery, as most African-Americans were being incorporated into the semicapitalist plantation economy, things became more complex. Confronting sharecropping, black radicalism required more than the advocacy of black landownership. Circumstances required that black radicals develop a critique of the capitalist system that maintained the plantation economy. Whereas during slavery Frederick Douglass and Martin R. Delany were radicals, during Reconstruction and the first nadir (1877-1917) Douglass became the quintessential black liberal while Delany descended into conservatism. This hypersensitivity to sociohistorical context derives from the general premise of radicalism: transformation of the fundamental structural and ideological elements of a society. Black radicalism has consisted of philosophies and practices that sought the essential transformation of the system of racial oppression and the social system that institutionalized it. Peter H. Clark, the first modern black radical, joined the Socialist Party in 1877, during the first nadir. Ironically, Clark's reasons for leaving the socialists probably affected the construction of black radicalism more than his reasons for joining. (Clark left the Socialist Party because it did not confront the specificity of race and racial oppression, preferring to view race as subordinate to class.) The outlines of a distinct black radical perspective, however, did not appear until after World War I, when Hubert H. Harrison (1883-1927), Cyril V. Briggs (1888-1966), and the African Blood Brotherhood (c. 1920s) examined the state of African peoples and the relationship of black and white radicals in the United States. Black radicalism was a unique effort to merge and remake classical Marxist theory and black nationalism into a race-conscious socialist theory. Although Ralph Bunche (1904-1971) and Abram Harris (1899-1963) were black radicals during the 1930s, their ideas, unlike those of W. E. B. DuBois, Langston Hughes (1902-1967), Claudia Jones (1915-1964), and C. L. R. James (1901-1989), might be too orthodox to be considered foundational for black radicalism. Black radicals, according to Anthony Bogues, are either heretics or prophets. Heretics, usually highly educated in European-American radicalism, use subjugated knowledge from black experiences to challenge white radical orthodoxies and to rework them into theories that can accommodate black experiences and perspectives. Bogues's prophets come from the realm of religion. They are often undereducated by U.S. standards. They tap subjugated knowledge from sources outside the mainstream academy: the Bible, the Koran, and particularly esoteric interpretations of both. In addition to derived religious sources, prophets articulate, in Hobsbawm's terms, an inherent ideology, one drawn from the murkier repositories of African survivals and popular culture. Noble Drew Ali (1886-1929) and the Moorish Science Temple, Elijah Muhammad (1897-1975) and the Nation of Islam, and Prince Asiel Ben Israel and the Original Black Hebrew Israelite Nation represent this type of black radicalism. (Note that, by this author's criterion—opposition to the dominant mode of production—they would not be considered black radicals.)
In more recent times, Huey P. Newton (1942-1989) and the Black Panther Party (1960s), especially their theory of intercommunalism, the Revolutionary Action Movement (late 1960s), and the League of Revolutionary Black Workers (late 1960s) have represented the black radical perspective. Since the late twentieth century, the Black Radical Congress has represented this ideological tendency.


Afrocentricity and Its Critics As could be expected, however, Afrocentricity's growing paradigmatic ascendancy over African-American studies also prompted serious critiques, which fall within five broad categories. First, critics have disagreed with some of Afrocentricity's premises, in particular the notion of an African essence that undergirds the notion of center. This criticism is often heard in poststructuralist circles, since the very idea of a center is antithetical to the poststructuralist paradigm. Often associated with this criticism is the additional claim that in its search for Africanness, Afrocentricity does not allow for cultural change. In fact, some argue, Afrocentricity's inability to deal adequately with cultural change prevents it from understanding that being African today also means being at least partly European as a result of colonization and widespread Westernization. Afrocentricity, then, is perceived as too restrictive and incapable of grasping the dialectical complexity of modern African identities. While Asante denies being an "immutabilist," his response has been that Africans need a place to stand in order to challenge oppressive White structures and systems of knowledge and therefore cannot afford postmodern, evanescent, fluid selves; in any case, any discourse on identity is necessarily essentialist. Afrocentrists also point out that far from denying the Westernization of many Africans' consciousness, they recognize it as a destructive force that must be circumvented. Second, some have taken issue with Afrocentricity's main category, culture. Black feminists and black neo-Marxists advance gender and social class, respectively, as the primary contradiction in African-American life. With regard to feminism, however, Afrocentric scholars who tackle gender issues question the relevance of feminist philosophical and political assumptions for African people, including African women. Concerning the question of class, while it is quite feasible and necessary to articulate an Afrocentric economic theory, Afrocentricity maintains that race/culture remains the most socially relevant category in American society. Third, Afrocentricity has also been criticized for making untenable historical claims, especially in relation to ancient Egypt. This argument, probably the most publicized, has stemmed from European classicists who, having subscribed to the Greek Miracle theory, became disturbed by two related developments associated with the spread of Afrocentricity: first, credit was being taken away from Europe for the great civilizations of the Nile Valley (in particular, Egypt); and second, as a consequence the original intellectual achievements of Greece itself were revisited and diminished. For instance, it was pointed out that many Greek philosophers had studied for long periods of time in ancient Africa, and were in reality indebted to their African teachers for many of their ideas. Therefore many European scholars in the United States and Europe proceeded to refute those "Afrocentric" claims. However, it must be noted that the debate over the racial identity of the early Egyptians predates the emergence of Afrocentricity by several decades and is not, therefore, an issue germane to Afrocentricity per se. It must be more correctly understood within the context of Diopian historiography, which places Egypt at the beginning, both chronologically and conceptually, of African civilization.
In fact, several of the scholars associated with this thrust, such as Martin Bernal, have never claimed to be Afrocentric. Fourth, Afrocentricity has also been criticized for intellectual bad faith because of wrong attributions and associations. For instance, Afrocentricity has been associated with biologicaldeterministic arguments (such as that around melanin) that were never part of its premises. Finally, criticism of an ideological nature has been voiced. In one instance, Afrocentricity has been blamed as reversed Eurocentrism. Some scholars contend that Afrocentricity merely seeks to replace one geopolitical hegemonic center, Europe, with another hegemonic one, Africa. However, as even a cursory reading of Asante's texts would reveal, Afrocentricity is fundamentally nonhegemonic and welcomes the existence of a multiplicity of cultural centers. It is precisely that position that allowed Afrocentricity to challenge Eurocentrism in the first place. Some have also contended that Afrocentricity undermines the very fabric of American society. By emphasizing the Africans' prerogative to be human as Africans, Afrocentricity is said to threaten the unity of American society, including the American academe. However, Afrocentrists remark that the unspoken fear is not so much about a shattered national unity (which, given racism, could have never truly existed) but about the threat that Afrocentricity poses to Europe's self-serving monopoly over reason. While Afrocentricity continues to exercise a significant influence in the United States, it has also been receiving increased attention in Europe and Africa, where a vigorous intellectual movement has emerged informed by Afrocentric tenets and referred to as the "African Renaissance," thus creating the possibility for Afrocentricity to be transformed into a Pan-African school of thought in the years to come.

Biological Anthropology The advent of geology and the study of fossil sequences in the late eighteenth and early nineteenth centuries by pioneer geologists such as James Hutton (1726-1797) and Sir Charles Lyell (1797-1875) laid the groundwork for the study of human evolution. Two major events in the 1850s loom large in the history of this most basic of the subfields: (1) the accidental discovery in 1856 of the first premodern human being, the prototype of the Neanderthals, in a quarry near Düsseldorf, Germany, and (2) Charles Darwin's (1809-1882) theory of "natural selection," articulated in On the Origin of Species (1859). This theory gave scholars who wanted to study the course of human evolution systematically a theoretical framework for determining how one species evolved over time into another. The human fossil evidence in Europe, and eventually throughout the Old World, from Africa to China and Indonesia, mounted rapidly, and by the early twentieth century anthropologists had developed several models of human evolution. At first, there appeared to have been two successive species of the genus Homo: Homo sapiens, including all modern human beings as well as our immediate precursors, the Neanderthals, and the far older Pithecanthropus erectus, the earliest example of which was found near Solo on the island of Java in 1891. It had become clear that the human species was at least several hundred thousand years old.

As the twentieth century unfolded, new and even older hominid fossils were discovered, primarily in Southern and Eastern Africa, and both the dates and descriptions of hominid evolution changed markedly. The 1925 discovery of Australopithecus africanus in South Africa by Raymond Dart (1893-1988) pushed the origin of the hominids back at least a million years and added a new, pre-Homo genus, Australopithecus, or "Southern Ape-Man." It is impossible here to outline the sequence of major fossil discoveries in Africa and elsewhere that have been made since 1925. The names of anthropologists responsible for these finds include the late Louis S. B. Leakey (1903-1972) and Mary Leakey (1913-1996), who, in the late 1950s and early 1960s, discovered a number of extremely important protohominids at Olduvai Gorge in northern Tanzania. In 1974 Donald Johanson discovered "Lucy," an extremely early australopithecine that lived in what is now northeastern Ethiopia around 3.1 million years ago, the prototype of Australopithecus afarensis. In 1994 fossil evidence of an even older genus and species of protohominids, Ardipithecus ramidus, more than a million years older than "Lucy" (c. 4.5 million years old), was found in the same region of Africa, and in the last several years fossil fragments found in East Africa have pushed the origin of hominids even farther back, perhaps as much as 5.5 million years. Moreover, it is now suspected that hominid bipedalism evolved as early as 4.5 to 5 million years ago; by freeing our forelimbs, it affected the evolution of the capacity for culture profoundly by enabling our ancestors to use and make tools. Of course, this did not happen overnight. The earliest evidence for the presence of crude tools, again in East Africa, dates from around 2.5 million years ago. By this time, the earliest species of our genus, Homo habilis, had evolved, followed by Homo ergaster (c. 1.9-1.5 million years ago), and then Homo erectus, which dominated the Old World from c. 1.5 million to about 200,000 years ago, when it began to be replaced, at least in Europe—Homo erectus appears to have lingered longer in parts of Asia—by the Neanderthals. They, in turn, were eventually displaced by our own immediate ancestors, anatomically modern hominids (Homo sapiens), who are now thought to have evolved around 130,000 years ago near the southern tip of Africa. By 27,000 years ago, Homo sapiens had replaced all other hominid species everywhere. Some biological anthropologists still subscribe to the "multiregional hypothesis" that human beings became "modern" simultaneously in several parts of the Old World, from Africa to Europe and East Asia, about 40,000 to 50,000 years ago, but consensus in the profession supports the "out of Africa" model, strengthened by the absence of any evidence that Neanderthal mitochondrial DNA exists in modern European populations.

A significant element in this evolutionary journey was the development of our brains to the point that we were able not only to make crude stone tools but to envelop ourselves and the world around us in what cultural anthropologist Clifford Geertz has called "webs of significance," the capacity for culture. At the same time, it has become abundantly clear that, since the emergence of anatomically modern hominids, no appreciable differences in the capacity for culture have emerged among the several modern human physical types, what are still sometimes erroneously called "races," and that the behavioral and technological differences that separate contemporary human communities are cultural rather than biological. One of the most important contributions of biological anthropology to general knowledge has been to dispel the pernicious myths of racial superiority and inferiority.

In addition to tracing the course of human evolution, many biological anthropologists specialize in the comparative study of chimpanzees (e.g., Jane Goodall), gorillas (e.g., the late Dian Fossey [1932-1985]), and other nonhuman primates, hoping to throw additional light on human behavior and the extent to which it is grounded in our primate heritage. Such studies provide a better understanding of the profound biological changes in our ancestors during the last five million years, changes that culminated in Homo sapiens. While biological anthropology is best known for the study of ancient humans and other primates, other branches of the field also make significant contributions. Forensic anthropologists assist law enforcement agencies in gathering and interpreting evidence in cases of homicide, massacres, and genocides; other biological anthropologists study the interaction of culture and biology as it affects our health, longevity, and well-being. Such researchers work on a range of topics, including the spread of AIDS (acquired immunodeficiency syndrome) and other communicable diseases; the relationship between health and social problems such as poverty, racism, and inequality; stress and rapid social change; diet and maternal well-being; and the long-term effects of violence and warfare. There is a close relationship between this type of biologically focused anthropology and the work of medical anthropologists, cultural anthropologists who study the social contexts of medical practice.

Consolidation and Extensions in the Twentieth Century At the end of the nineteenth century some major review works appeared. The German David Hilbert (1862-1943) published in 1897 a long report on algebraic number theory. The next year the Englishman Alfred North Whitehead (1861-1947) put out a detailed summary of several algebras in his large book A Treatise on Universal Algebra, inspired by Grassmann but covering also Boole's logic, aspects of geometries, linear algebra, vectors, and parts of applied mathematics; an abandoned sequel was to have included quaternions. His title, taken from Sylvester, was not a happy choice: no algebra is universal in the sense of embracing all others, and Whitehead did not offer one.

Elsewhere, group theory rose further in status, to be joined by other abstract algebras, such as rings, fields (already recognized by Abel and Galois in their studies of polynomial equations), ideals, integral domains, and lattices, each inspired by applications. German-speaking mathematicians were especially prominent, as was the rising new mathematical nationality, the Americans. Building upon the teaching of Emmy Noether (1882-1935) and Emil Artin (1898-1962), B. L. van der Waerden's (1903-1996) book Modern Algebra became a standard text for abstract algebras and several applications, from the first (1930-1931) of its many editions. This abstract approach solved the mystery of the need for complex numbers when finding real roots of real polynomial equations. The key notion is closure: an algebra A is closed relative to an operation O on its objects, or to a means of combining two objects a and b, if Oa and a·b always belong to A. Now finding roots involves the operations of taking square, cube, and higher roots, and the complex numbers, but not the real numbers, are closed relative to these operations.

One of the most striking features of mathematics in the twentieth century was the massive development of topology. Algebraic topology and topological groups are two of its parts, and algebras of various kinds have informed several others. Both (abstract) algebras and topology featured strongly in the formalization of pure mathematics expounded mainly after World War II by a team of French mathematicians writing under the collective name "Bourbaki."
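The closure point can be made concrete with a classic sixteenth-century example due to Bombelli: the cubic x^3 = 15x + 4 has three real roots (x = 4 is one of them), yet Cardano's root formula reaches them only by passing through complex intermediates. The following minimal Python sketch, offered purely as an illustration rather than anything drawn from the historical texts, traces that detour:

    import cmath

    # Bombelli's cubic x^3 = 15x + 4; x = 4 is a real root (64 - 60 - 4 = 0).
    p, q = 15, 4

    # Cardano's formula for x^3 = p*x + q requires sqrt((q/2)^2 - (p/3)^3),
    # which here is sqrt(4 - 125) = sqrt(-121): not a real number.
    d = cmath.sqrt((q / 2) ** 2 - (p / 3) ** 3)   # 11j

    u = (q / 2 + d) ** (1 / 3)   # principal cube root of 2 + 11i, namely 2 + i
    v = (q / 2 - d) ** (1 / 3)   # principal cube root of 2 - 11i, namely 2 - i

    x = u + v                    # (2 + i) + (2 - i) = 4
    print(x)                     # approximately (4+0j), up to rounding error

The real numbers are not closed under the root extractions the formula demands, while the complex numbers are; that closure property is exactly what the abstract approach isolates.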

Cultural Implications In Tylor's original formulation, animism was an argument for the universality of human intellectual and spiritual worlds. The universality of concepts of souls, and hence the universality of religion, is a major contribution of Tylor, one that endures into the twenty-first century. Like the Canelos Quichua, humans everywhere, in one way or another, and with very great differences, conceptualize into cultural systems the spiritual dimensions of life, as well as the corporeal aspects of quotidian existence. With this concept of the universality of fundamental religious thought, Victorian England and the rest of the English-lettered world were exposed to cultural relativism. What constitutes human difference in economy, society, psychology, and religion, then, is cultural, not biological. Although people are very "different" from one another, across space and through time, their mental capacities—cognitive, emotional, and imaginative—are not. As Clifford Geertz puts it: "The doctrine of the psychic unity of mankind, which so far as I am aware, is not seriously questioned by any reputable anthropologist, is but the direct contradictory of the primitive mentality argument" (p. 62).

Tylor, however, very much the Victorian gentleman, began his quest for the bases of animism with what he called the "lower races," whom he also labeled "savages," "rude, non-religious tribes," and "tribes very low in the scale of humanity," among other such figures of speech that link evolutionary biology and culture, thereby reinforcing the "primitive mentality argument" later expanded by Lucien Lévy-Bruhl in Les fonctions mentales dans les sociétés inférieures (1910; translated in 1996 as How Natives Think). The Victorian contradiction of enlightened cultural relativity, attached to a scalar view of humans as evolving from the "lower races" to the "civilized nations," leads to the racist paradox that a few civilizations evolved while the rest of the world's people "remained" animist. Animism, by this reasoning, is evidence of low-level "relics." This contradiction became canonized by the sixteenth century through the emergence of Western modernity and mercantilist capitalism and remains strong in twenty-first-century Western cosmology. It is, however, a fallacy. Every religious system, including monotheistic religions such as Christianity and Islam, includes representations of the supernatural with strong animistic dimensions. Despite religious scholars' assertions to the contrary, members of monotheistic religions nonetheless act at times as though there are spiritual beings detached from corporeal beings, manifest concern over the fate of their immortal souls, and make these beliefs part of their folk traditions, such as the jinn of Middle Eastern folklore, or of the dominant religion itself.

Nonetheless, the enduring Victorian contradiction between cultural relativity and social evolution continues to cast a shadow over the religious beliefs of indigenous peoples, leading many of the world's people with rich beliefs in spirits and noncorporeal essences of animate and inanimate things—but without a "high god" organizer—to resent the term "animist" because of its connotation of savagery. Among the Canelos Quichua, for example, spokespeople to the outside world often express considerable resentment at the use of the word. By the same token, animist symbolism does more than establish a template for understanding quotidian life and the universe. It also undergirds the ideological struggles of indigenous people to establish a place and space in nation-state life. In Amazonian Ecuador, for example, animistic concepts were utilized during political uprisings in 1990 and 1992, and again in 2000, when indigenous people rose up as one mighty body to claim—in part successfully—their territory and their rights. Animism as a concept is very powerful in its relativistic dimensions, but it is destructive when used to place people in a universal or particular evolutionary scheme that ranges from primitive to civilized.

EUROPE AND THE MIDDLE EAST To a modern observer, alchemy likely connotes only the transmutation of base metals into gold, or perhaps a more metaphorical transformation of the soul. In its roughly two-thousand-year history, however, alchemy's practices and ideas have ranged much more broadly, encompassing everything from the production of dyes, medicines, precious metals, and gemstones to assaying techniques, matter theory, and spiritual practices linking the manipulation of matter to changes in the alchemist's soul. Although all of these dimensions were present from alchemy's beginnings, practitioners have chosen to highlight particular facets of their art at different times. Any definition of alchemy, therefore, must be both sensitive to its historical permutations and broad enough to include each of its chemical, pharmacological, metallurgical, and spiritual components. To be more precise, one may speak of technical or practical alchemy, spiritual alchemy, natural philosophical alchemy, transmutational alchemy, and medical alchemy (often referred to as iatrochemistry or chimiatria). This essay offers an overview of alchemy's changing meaning over its rich and long history.

Explanations Two explanations have been put forth for the conditions that produced the phenomenon of Afropessimism. One is the apparent inability of postcolonial African leaders to practice good governance. Since the 1960s, when most countries in Africa south of the Sahara regained political independence from European colonialists, the standard of living in Africa has fallen below expectations. The achievement of political self-rule naturally came with raised expectations of the good life for Africans who had been subjected to exploitation and subjugation by colonial tyranny. In the exuberance of the freedom moment, the new indigenous leaders of Africa promised their fellow citizens a brighter future. However, by the 1980s, more than twenty years after independence, the African condition (especially for the masses) had fallen far below the continent's potential. For the most part, bad leadership was responsible for the disappointing performance. Independence ushered in an era of political instability, military dictatorships, and gross mismanagement of natural resources by very corrupt African leaders. By the 1980s all these conspired to drive down the standard of living in most African countries, forcing an otherwise resource-rich continent to become severely dependent on foreign aid and foreign debt.

Another school of thought locates the source of Africa's social and economic downfall in the international political environment. According to this school, Africa regained self-rule during the era of the Cold War, when the relationship between the countries in the Eastern bloc led by the Soviet Union and countries of the Western bloc led by the United States was marked by a state of military competition and political tension. The rivalry stopped short of actual war between the two superpowers, but it forced Africa to become a surrogate terrain for hot proxy wars between the two camps. In the process, African countries, most of them weak and dependent on the Western or Eastern ideological blocs, became little more than client states. Under this new dispensation, Africans lost the power to choose their own leaders. Africa's dependent dictators owed their offices to the economic and military support of Cold War powers. For the most part they put the interests of the foreign powers on which they were dependent ahead of their own national interests. This situation, which was as exploitative and impoverishing as colonialism, became known as neocolonialism and is blamed for the postcolonial impoverishment of Africans that fueled the fires of Afropessimism in the 1980s.

Consequently, with the end of the Cold War, some Africanists came to believe that, once the detrimental international conditions it had imposed were reversed, conditions in Africa would improve through good governance. Those who believe this are known as Afro-optimists. Challenging the view that sub-Saharan Africa has only regressed since independence, they advance examples of postcolonial triumphs achieved by Africa's political leadership despite the prevailing problems identified by Afropessimists. They argue that the energy and perseverance of African peoples portend hope for the future of the continent.

Medieval Arabic Alchemy When Islamic empires expanded into centers of Hellenistic culture in the seventh century, Muslim natural philosophers and physicians inherited the Greek alchemical tradition. From the eighth to the tenth centuries, scholars in intellectual centers like Baghdad synthesized the basic elements of the Greek alchemical tradition. The anonymous editor of the Turba philosophorum (Crowd of philosophers; c. 900), for instance, assembled excerpts from various Greek alchemical authors into a virtual conversation. Alchemists writing in Arabic also elaborated on the Greek theoretical foundation, contributing a number of key concepts to alchemical matter theory and medicine. The word alchemy, a combination of the Arabic definite article al with the Greek word chemeia, or chymeia (likely derived from the word for smelting metals, cheein), represents this fusion of Greek and Arabic scholarship, while the continued use of Arabic alchemical terms such as alkali, alcohol, alembic, and elixir (al-iksir) highlights the legacy of Arabic scholarship.

The translation of Greek alchemical texts into Arabic also underscores a central problem in the history of alchemy: pseudonymous texts. In the first centuries C.E., alchemical texts appeared purportedly authored by figures such as Plato, Socrates, Aristotle, and Cleopatra. Because Arabic and later European translators did not identify these pseudonyms as such, these prominent ancient figures entered the alchemical corpus as legitimate alchemical authors. The authorship of Arabic texts has been equally difficult for scholars to decipher. One of the more influential medieval Arabic texts, for instance, contains a dialogue between King Khalid ibn Yazid (c. 660-704) and a Christian hermit living in Jerusalem, Morienus. Although it is unclear whether Khalid and Morienus actually wrote this text, both figures remained prominent personages in the medieval alchemical tradition.

A collection of thousands of texts dating to the eighth through tenth centuries, known in Latin as the Corpus Gabirianum and attributed to Jabir ibn Hayyan (c. 721-c. 815), contained fundamental contributions to the medieval Arabic alchemical corpus. Among the innovations of the Corpus was the concept of a tripartite division of all things into soul, spirit, and body, a division that would play a central role in European alchemical thought of the sixteenth century. The Corpus also introduced the sulfur-mercury theory, which its author had adopted from the ninth-century author Balinus (pseudo-Apollonius of Tyana). This variation on Aristotle's forms and the Stoic pneuma stated that all matter was formed by two qualities, sulfur and mercury, the balance of which existed in varying degrees in different metals. Using the elixir (or philosophers' stone) to shift the balance between these two principles, the alchemist could transmute one metal into another. The Corpus also posited that the elixir could be made of plant or animal substances as well as mineral, and that it could be used both as a panacea in medicine and in transmutation. Just as the elixir "cured" base metals of their impurities by transmuting them into silver or gold, so too could it "cure" sick people of their illnesses.

The Persian physician and philosopher Abu Bakr Muhammad ibn Zakariya al-Razi, known in Latin as Rhazes (c. 865-between 923 and 935), is best known for setting out a systematic summary of the "state of the field" of alchemy. Al-Razi added a third quality, salt, to the sulfur-mercury theory, and divided the chemical world into animal, vegetable, and mineral realms. Al-Razi's texts show that he was clearly a practicing alchemist, describing experiments, apparatus, and ingredients, as well as the standard steps of the "great work" of making the elixir. Although both al-Razi and the Corpus Gabirianum provided theoretical justifications of the notion of transmutation, not all Arabic-speaking philosophers supported this idea. The physician Ibn Sina (Latinized as Avicenna; 980-1037) famously inveighed against the possibility of transmutation in his Kitab al-shifa (Book of the remedy), articulating an argument that would prove widely influential in the Latin Middle Ages.

Theoretical Foundations in Antiquity Such practical alchemical work received theoretical justification in part from Greek natural philosophy. Although Aristotle (384-322 B.C.E.) did not write about alchemy per se, he provided a theory of matter that made it possible to conceptualize the transmutation of metals. Aristotle posited that all things were composed of the same formless, passive matter (materia prima), which was then transformed into a specific substance by an active, shaping form. For alchemical theory, Aristotle's crucial notion was that, because the four elements—earth, fire, water, and air—were composed of the same basic matter, they could be transmuted into one another by altering their forms. Through the application of heat, for example, water could be transmuted into "air" (steam). From this, alchemists developed the idea of isolating the materia prima in metals and transmuting one metal into another through the use of an agent known as the philosophers' stone or elixir.

The rich cultural resources of the Hellenistic world further developed alchemy's theoretical foundations. From Stoic matter theory alchemists took pneuma, or spirit, which replaced Aristotelian forms as the active, defining force of matter. From Babylonian astrological traditions alchemists adopted the identification of the seven metals (gold, silver, mercury, copper, iron, tin, and lead) with the seven planets (sun, moon, Mercury, Venus, Mars, Jupiter, and Saturn) and the division of both metals and planets into male and female. Finally, the central transmutational process of reducing metals to their materia prima before recreating them as gold or silver drew on ideas presented in the Egyptian myth of Isis and Osiris, in which Osiris was killed and dismembered before Isis brought him back to life.

A collection of texts written in the first centuries C.E. known as the Hermetica was attributed to Hermes Trismegistos, a figure identified with the Egyptian god Thoth, the mythical creator of the arts and sciences. Although alchemy was only one topic among many in the Hermetica, in the European Middle Ages Hermes came to be known as the legendary first alchemist and alchemy as the "hermetic art." The Hermetica ranged in content from medical, astrological, and magical treatises to much more theosophical ruminations on the redemption of the spirit through gnosis. This association in the Hermetica between practical alchemy and spiritual gnosis found its way into alchemical theory through later authors.

The link between spiritual and practical goals of transmutational alchemy is particularly evident in the work of the Alexandrian Zosimos of Panopolis (fl. 300 C.E.; later Latinized as Rosinius). In his compilation of older alchemical writings known as the Cheirokmeta, Zosimos wove his practical alchemy into a mystical theoretical framework that would prove just as enduring as alchemy's more technical concerns. Full of secretive language, dream sequences, and allegories, Zosimos's texts describe alchemical processes metaphorically—as sexual generation, for instance—and highlight the role of spirits in transforming matter. With a clear debt to Gnosticism, Zosimos established an enduring connection between practical laboratory work and spiritual perfection.

Contemporary Anarchism While the Spanish Civil War, World War II (1939-1945), and the rise of totalitarian communist regimes after 1949 effectively ended the further development of the historical anarchist movement, anarchist ideas and sensibilities were not as easily repressed. The political and cultural protest movements of the 1960s and 1970s in Europe and the Americas saw a resurgence of interest in anarchism. Feminists, ecologists, student radicals, pacifists, and others eager to question the prevailing social and moral preconceptions of modern society held by both the left and the right were drawn above all to the doctrine's iconoclasm. At this time, elements from a variety of nonlibertarian groups—the Situationists in France, for example—freely borrowed anarchist ideas in developing their own ideological positions. Anarchism has also been enriched by the thinking of some of the twentieth century's leading philosophers, political activists, artists, and intellectuals. Bertrand Russell, Herbert Read, Mahatma Gandhi, Martin Buber, Albert Camus, Michel Foucault, Paul Goodman, Lewis Mumford, and Noam Chomsky are among the notable figures who have been associated with anarchist beliefs and values.

From the late twentieth century on, anarchism has continued to branch out in different directions. Anarchist ideas have been influential in the development of radical feminism and the Green and antiglobalist movements that have spread across Europe and the Americas. Contemporary anarcho-feminism has its roots in the writings and activism of historically important figures like Emma Goldman, Voltairine de Cleyre, and Federica Montseny. Goldman was among the first female anarchists to emphasize that the emancipation of women in society must begin with psychological change within women themselves. By calling on women to struggle against the repressive and hierarchical structures that dominated their personal lives, Goldman anticipated late-twentieth-century anarcho-feminists, who have insisted that the "personal is political" and have developed a radical critique of everyday life. Anarchist principles have also been adopted by some of the more radical ecological movements of postindustrial societies. Libertarian social ecologists such as Murray Bookchin have attempted to extend the traditional anarchist demand to emancipate society from government rule to the natural environment, calling for an end to human beings' dominating and exploitative relationship with nature.

Perhaps because of its shock value in an age crowded by political neologisms, the anarchist label has also been applied to groups that do not properly belong to the anarchist tradition. For example, the term "anarcho-capitalism" is sometimes used to refer to libertarian economic and social thinkers such as Ayn Rand, David Friedman, and other pro-capitalists who hold strong antistatist views. But even though they share the anarchists' contempt for state authority, their commitment to free enterprise and laissez-faire principles places them completely at odds with classical anarchist thinking and practice. Ever since the Cold War ended in 1991, small groups of anarchists around the world have been at the forefront of the antiglobalization movement. Like their predecessors, modern anarchist activists seek to expose the adverse power relationships that affect our daily lives.
They are particularly concerned with the impact that the global expansion of corporate leviathans has had on society, not least because of the seemingly unlimited ways in which this advanced form of capitalism can manipulate and control the lives of individuals. While a few anarchist groups still resort to direct-action methods to get their revolutionary message across, a growing number are turning to advanced technologies like the Internet to promote their cause. In short, whether one admires or abhors anarchist principles, it cannot be denied that anarchism offers a critical perspective on authority that appears to be endlessly relevant to those who want to sharpen their awareness of the boundaries of personal freedom.

Liberalism is the second oldest black ideology. Scholars often insert "liberal" as an adjective to describe the political values associated with the incorporativist perspective in order to distinguish this set of discourses from conservative and radical approaches, which also advocate incorporation into U.S. society, albeit, for radicals, into a fundamentally transformed society. And some variants of nationalism and feminism are decidedly liberal in their political and social perspectives. Here incorporative should be understood as synonymous with liberal pluralism. Smith has argued that liberalism, or incorporativism, is not so much an ideology as a strategy. Applying the logic Anthony Bogues used regarding Quobna Cugoano's Thoughts and Sentiments on the Evil of Slavery to the work of Douglass, Du Bois, Charles Johnson, Alain LeRoy Locke (1886-1954), Martin Luther King, Jr., Thurgood Marshall (1908-1993), Mary Frances Berry (b. 1938), and Lani Guinier (b. 1950), among others, illuminates how their critiques of U.S. society stretched the boundaries of liberal pluralism and provided alternative theories of American democracy.

Incorporative approaches advocate structural integration into the U.S. state and political economy, arguing that the inclusion of African-Americans in traditionally white institutions in and of itself constitutes a significant transformation. This is the view that undergirded the legal strategy Charles Hamilton Houston (1895-1950) designed for the National Association for the Advancement of Colored People (NAACP) during the 1930s, as well as the civil rights movement of the 1960s. Incorporativists are more ambivalent toward inclusion in European-American civil society: they tend to oppose restrictions barring access to traditionally white institutions, but usually oppose dismantling African-American institutions. For instance, Martin Luther King, Jr. never imagined the dissolution of the black church. Historically a few incorporativists, such as Frederick Douglass, have advocated biological assimilation as well as structural integration. Incorporative approaches generally encourage acceptance of bourgeois European-American cultural values and practices, viewing them as quintessentially "modern," even though privately many incorporativists continue to practice the styles and idioms of African-American culture.

While tension exists between the core components of white liberalism and black incorporativism, particularly regarding individualism and notions of equality—equal opportunity versus equality of results—incorporativists generally share their white colleagues' views on the power of knowledge, the value of diversity, and economic enterprise. Incorporativists are also skeptical of American liberalism's assertions of pluralism, especially the assertion that the U.S. state is neutral regarding racial and ethnic disputes. Although incorporativists are often critical of the practice of American liberalism, they tend to accept most of its values.

Nationalism and the Idea of Anticolonialism With the exception of the Vietnamese and to a lesser extent the Indonesians (who had to endure the return of the colonial powers following World War II), the eventual exit of the European powers from the political scene created an important intellectual vacuum within which scholars of the former colonies could operate. Many of these "home" scholars sought to repair, renovate, or even remove the histories produced under colonial tutelage. Heeding the needs of nationhood, Southeast Asian scholars, many of whom were trained in European schools, began redressing the histories that had been written for them by colonial historians by writing from the perspective of the nation. Where rebels, political activists, and influential religious figures were once marginalized and condemned by colonial historians, they were now transformed into "national" heroes who contributed to the fruition and emergence of the nation-state. Figures such as Java's Dipanagoro (c. 1785-1855), the Philippines' José Rizal (1861-1896), Burma's Saya San (d. 1931), and Vietnam's Tran Van Tra (1918-1996) became part of a common history of the nation and struggle that contributed to the imagining of the nation. Moreover, the rebellions and incidents first identified by colonial officials as being important were appropriated by home scholars for their narratives, intent on recasting the perspective in which they had originally been presented. Thus "anticolonial" movements came to be seen as independence movements, affecting the way in which protest and resistance were interpreted. For instance, the tone of the scholarship and the analysis of the movements became sympathetic rather than critical, shifting the movements' role and importance in history to demonstrate a national consciousness that was growing during colonial rule. Earlier elements of resistance that colonial writers had highlighted in order to establish the "backward" nature of political expression (such as tattooing, religious symbols, and language) were played down by nationalist historians in favor of more "objective" economic and political origins, although the focus on causal factors prescribed by colonial documents was nevertheless maintained. Local conceptions of protest and revolt were unintentionally deemed irrelevant, because nationalist scholars were keen on writing a modern narrative of the new nation. The shape and scope of anticolonialism had not changed, only its interpretation and coloring.

While these adjustments were being made by home scholars writing through the lens of the nation, scholars in the West began to reconsider anticolonialism within the context of nation as well, choosing to consider indigenous expressions of protest and revolt (which were ironically being played down by their counterparts in Southeast Asia) as evidence of protonationalism. As a result, the major rebellions and revolts (which continued to dominate the attention of scholars) that had taken on a religious or culturally specific character were deemed important to study under the rubric of "Asian" nationalism, which seemed to make these once-dismissed ideological influences important and relevant to scholarly study. Consequently, disturbances and outbreaks of violence that demonstrated religious overtones drew attention on the grounds that they were early expressions of nationalism and therefore warranted closer scrutiny.
The Saya San Rebellion (1930-1932) in Burma, which made use of Buddhist ideas in its program, was now considered a "Buddhist" protonationalist movement, suggesting that religion and other Southeast Asian ideological sources were important to understanding the growth and expression of Asian nationalism. Similarly, Dipanagoro's rebellion in Java represented an Islamic nationalism that would precede movements in the twentieth century, while the Filipino revolt launched in 1896 by Andres Bonifacio (1863-1897), which alluded to Christian ideas, seemed to forecast the origins of a national consciousness.

Tunisia, Egypt, and Morocco In the case of Tunisia, Egypt, and Morocco, the decision of Britain and France to take over the reins of government (in 1881, 1882, and 1912, respectively) was at least partly precipitated by local opposition to the draconian financial measures that the European powers had forced local governments to impose in order to repay the debts they had contracted on the various European money markets. The ruler of Tunisia, Ahmad Bey (r. 1837-1855), made strenuous efforts both to modernize Tunisia and to assert its independence from Istanbul, and he had been substantially aided by France in the latter objective. By the time of his death, Tunisia had a modern army and a modern navy; the Bey's brother-in-law, who survived him by nearly twenty years, was a modernizing finance minister and prime minister, and an Italian family provided the state's foreign ministers until 1878. In 1861, much to the discomfiture of Muhammad al-Sadiq Bey (r. 1859-1882), Tunisia adopted a constitution and a modern (that is, generally secular) legal system under which the Bey's prerogatives were quite limited. These reforms were better received in the outside world and among the sizeable local European community than within Tunisia, where a rural rising against the new legal system and the new taxes was put down with considerable brutality in 1864.

As happened in Egypt at much the same time, the contracting of substantial foreign debts (generally used to build the infrastructures that made the reforms possible or to pay the European consultants—officers, engineers, and so forth—in charge of putting them into effect) and the general mismanagement and corruption associated with the loans meant that the country found itself increasingly at the mercy of its foreign creditors. Tunisia declared bankruptcy in 1869 and Egypt in 1876. The sterling efforts of the reformer Khayr al-Din (c. 1825-1889) to balance the budget were no match for French colonial ambitions, which eventually forced the Bey to accept a protectorate under the terms of the Treaty of Bardo in May 1881. By 1892, four-fifths of cultivated lands were in French hands.

The situation in Egypt was similar; the additional taxes imposed as a result of British and French administration of the public debt, initiated in 1876 essentially to ensure that the bondholders got their money back, eventually gave rise to a nationalist movement. Many of its members had the additional grievance that the government of Egypt was conducted by foreigners, that is, a Turco-Circassian aristocracy consisting of the descendants of the viceroy Muhammad 'Ali (1780-1848) and their courtiers, in which native Egyptians constantly encountered a glass ceiling. Another interesting component of the rebellion led by Ahmad 'Urabi (1839-1911) between 1879 and 1882 was the emphasis on restoring Egypt fully to the Ottoman Empire. Although relatively large numbers of foreigners resided in Egypt, they were generally neither settlers nor colons in the French North African sense: most were not bureaucrats or farmers and had not lived there for generations; they resided mostly in the cities and engaged in commerce or in service occupations. In addition, most of them were not citizens of the occupying power.
In spite of a succession of strong rulers for much of the nineteenth century, Morocco was also unable to avoid colonial penetration, first economic (imports of tea, sugar, candles, and cotton cloth; exports of wool, cereals, and ostrich feathers) and then military. The first major confrontation between locals and Europeans occurred in 1859-1860, when Spain besieged Tetouan. A month later, Spain demanded an indemnity as the price of withdrawal, and although the terms were punitive, half the indemnity was paid within two years. This involved great hardship, particularly the imposition of nontraditional agricultural taxation, which caused considerable unrest. A massive devaluation of the currency took place, as did a near-universal switch to foreign coinage. Like Tunisia and Egypt, Morocco gradually moved from a state of general economic self-sufficiency to dependence on the world market. It became dependent on foreign loans and declared bankruptcy in 1903. Largely to preempt German colonial efforts, France and Britain signed the Entente Cordiale in 1904, under which Britain recognized France's preeminence in Morocco and France formally accepted the British occupation of Egypt. Franco-Spanish occupation of Morocco was formalized in 1912.

Ambiguity According to Aristotle, the substitution of opposite qualities hosted by a substance during a transformation has a discontinuous character. His logic seems to imply a step-by-step flow of time and rules out the intervention of a critical situation in which opposite qualities can smoothly cooperate and compete together in the same substance. This schematizes evolution as a quasi-static change of objects rather than a continuous course of events. Aristotle's conception is reflected in the rigid aesthetic canons of the art of antiquity. For instance, in Myron's Discobolus (The Discus Thrower; fifth century B.C.E., Museo Nazionale, Rome), time seems to be frozen in the act of launching the discus. Furthermore, throughout two millennia, the tertium non datur (the law of the excluded middle) has influenced Mediterranean culture. It is only during the twentieth century that, thanks to an attentive evaluation of the nature of time and the adoption of a probabilistic approach to the evolution of natural systems, ambiguity, meaning the coexistence or confluence of two or more incompatible aspects in the same reality, has acquired a non-negative connotation in the Western world.

Islam and Anticolonialism A number of factors are crucial to understanding the various manifestations of anticolonialism in the Arab world in the nineteenth and twentieth centuries. In the first place, the colonial period coincided with several movements of Islamic renewal; the same phenomenon can also be observed in the Indian subcontinent, West Africa, Central Asia, and Southeast Asia. Some movements clearly were, or became, reactions to colonialism, but one of the most influential, the Wahhabis in the center of the Arabian peninsula, both predated colonialism in the region and originated in an area relatively distant from any direct colonial activity. In the late eighteenth and nineteenth centuries, such renewal or reform movements spread out over a wide geographical area. Some, such as the Sanusi jihad, based in Saharan Libya and later the backbone of resistance to Italian colonization, exhibited an organizational structure similar to that of the Sufi orders, based on a network of lodges; others were urban-based, often around traditional centers of Islamic learning, while yet others were millenarian. Thus in the 1880s, the Sudanese Mahdi preached that he was the divinely appointed regenerator of Islam and consciously imitated the life and career of the Prophet. The renewal movements were by no means always sympathetic to, or even tolerant of, one another. Muhammad al-Mahdi al-Sanusi (1844-1902), for example, was at pains to point out that the Mahdi was not entitled to claim either the leadership of the universal Islamic community or a transcendental relationship with the Prophet Muhammad, and Wahhabism (if not checked by more prudent political considerations) has often exhibited considerable intolerance toward other manifestations of Islam.

The reform movements fed into anticolonialism in a number of ways. One of their effects was to draw a battle line between those rulers and elites in the Islamic world who were prepared to make accommodations to European colonizers and those sections of the community who were not. Thus 'Abd al-Qadir (1808-1883), the early leader of the resistance to the French in Algeria, was quick to make use of a fatwa (legal opinion) obtained from the Mufti of Fez stating that Muslims who cooperated with non-Muslims against other Muslims could be considered apostate and thus could be killed or enslaved if captured. Later in the nineteenth century, Ba Ahmad, the chamberlain of the Moroccan sultan 'Abd al-'Aziz (r. 1894-1908), believed his only recourse was to buy off or otherwise accommodate the French, who were making incursions into southern Morocco from both Algeria and Senegal. This policy alienated many influential religious and tribal leaders, who were bitterly opposed to the Commander of the Faithful giving up "the lands of Islam" to foreign invaders; some of them considered that this made him illegitimate and transferred their allegiance to a more combative leader.

Renaissance Europe The fifteenth-century rediscovery of the ancient Hermetic corpus and a concurrent revival of Neoplatonism introduced learned Europeans to ancient connections among alchemy, gnosis, and magic. Alchemy's renewed associations with magic and nature's occult forces broadened the primarily technical and natural philosophical discussions of alchemy of the Middle Ages. Hermetic philosophers began to reenvision themselves as operators who might use knowledge of nature to manipulate their world.

At the same time, alchemy's practical utility gained prominence in the sixteenth century, primarily through the work of the Swiss physician and alchemist Paracelsus (1493-1541) and his followers. Although he did not deny the possibility of alchemical transmutation, Paracelsus focused mainly on alchemy's medical applications, advocating a new kind of alchemical medicine, known as chimiatria or iatrochemistry, in which practitioners used alchemical distillation and extraction to isolate the quintessence of matter for medicinal purposes. These powerful (and often controversial) new alchemical drugs were designed to attack a specific disease, in contrast to traditional humoral medicine, which sought to treat the balance of humors within the body as a whole. On the theoretical level, Paracelsus also offered a new understanding of matter, complementing the ancient four elements with the tria principia or tria prima, adding salt to the two medieval principles of matter (sulfur and mercury). By refining alchemical matter theory and resituating medicine on a new foundation of alchemy, Paracelsus and his followers gave the ancient art a new prominence both in the study of nature and in the practice of medicine.

Transmutation remained a prominent goal for many alchemists, particularly the consumers of a burgeoning trade in printed alchemical books. By the sixteenth century, alchemy had burst the bounds of the world of scholarship and found a much wider audience in Europe. In particular, alchemy's central image of purification and ennoblement resonated with religious reformers such as Desiderius Erasmus, Martin Luther, and the German mystic Jakob Böhme, all of whom drew on alchemical metaphors and imagery in their writings. Alchemical authors also made explicit connections between alchemy and Christianity. Heinrich Khunrath (1560-1605), for instance, specifically articulated such connections, comparing the healing power of the philosophers' stone to the redemptive powers of Christ in the Amphitheatrum sapientiae aeternae solius verae of 1595. Urban readers, literate artisans, and learned ladies also took an interest in alchemy, consuming popular vernacular alchemical literature that tended to focus on the practical benefits of the art, particularly the production of precious metals, gemstones, and medicines. Many self-trained alchemists ultimately found employment at Europe's courts, where many princes generously supported practical and theoretical alchemical work. Faced with an increasingly crowded field of practitioners and no traditional markers of authority such as university degrees, licenses, or guilds, those who consumed alchemical knowledge and products struggled to make their own decisions about what constituted legitimate alchemy.
With such a wide variety of people both seeking out knowledge and claiming to be alchemists, new debates emerged in the sixteenth century about what true alchemy was and who could legitimately claim to practice it.

After Political Independence: The Struggle Continues The attainment of political independence by Asian and African countries left several questions unresolved. There was the question of ideology in the postcolonial period. In many countries, successful nationalist movements were essentially coalition parties, representing several ideological positions and tendencies. In Indonesia, President Sukarno (1901-1970) argued that the nationalist movement must be inclusive, and hence he saw "nothing to prevent Nationalists from working together with Moslems and Marxists." This expedient inclusiveness began to unravel in the postcolonial period. On this matter, it is vital to remember that these ideological questions were debated against the backdrop of the Cold War, which had an indelible impact not only on the texture of decolonization the imperial powers were willing to entertain but also on internal postcolonial ideological tensions.

Closely related is the fact that decolonization did not lead to economic freedom or even sustained economic growth and development in most Asian and African countries. What happened? Asian and African countries rarely, if ever, inherited vibrant, varied, and integrated national economies. What they inherited, says Basil Davidson (1974), was a colonial economic system that for centuries had "developed little save the raw materials needed in the Atlantic world." In Africa, the imperial powers both before and after independence imposed "an institutionalized relationship between Africans and Europeans," which facilitated the exploitation of Africans and their resources. As in Asian countries, this system has proved to be very difficult to change. The consequences have been economic stagnation, often regression, and widespread poverty. These economic problems have seriously compromised the essence of the political freedom won after so much sacrifice and determination. Yet in many of these countries, as in Latin America, the ruling elite lead opulent lifestyles amid grinding and widespread poverty. This cannot be taken as an indicator of economic development and social progress; it is the product of corruption, patronage, and oppression. Only "true decolonization," according to Fanon, could prevent the rise of an African national bourgeoisie eager to strike a self-serving compromise with Western imperialism. The critical point to remember is that in Africa, as in Asia, the inherited "economies remained externally oriented" and did not "provide the basis for a strong national economy."

The essence of decolonization has also been frequently compromised by the demands and expectations of foreign aid. For Asian and African countries, such aid has been an almost permanent feature of postcolonial history, but it has not led to economic independence and progress; quite often, poverty and economic stagnation have persisted. Since the 1950s, aid from the West to African, Asian, and other developing countries has been guided by shifting political and economic paradigms. These have included import substitution, population control, and expansion of exports. "Structural adjustment" in support of globalization is the paradigm that currently guides the dispensation of Western aid. Still, poverty and poor economic performance have persisted in most Asian and African countries.
The formulation and implementation of these development fashions clearly indicate that a conscious effort has been made to sidestep tackling the fundamental and exploitative relationship between imperial powers and former colonies. The result, as described by Mahbub ul-Haq, is that a "poverty curtain" now exists, dividing the world into "two unequal humanities—one embarrassingly rich and the other desperately poor."

The struggle also continues in cultural affairs—a struggle over respect for Asian and African cultures. Culture is intricately linked to dignity and identity. Dignity, and with it cultural pride, are especially important for a people whose past has been dominated by alien rule and culture, and colonialism was hostile to the vibrant growth and assertion of local culture. This hostility was clearly evident in the propagation of racial stereotypes demeaning to Asians and Africans, characterizations and beliefs that are, sadly, not yet dead. A matter of critical importance to the people in former colonies is the survival of their cultures in the age of globalization, which has facilitated the rapidly expanding marketing of Western entertainment. Films, music, and general attitudes toward lifestyle promote a sort of global homogenization that is Western-derived and -controlled. This has not stimulated the survival or growth of local cultures and values. World culture is thought to be threatened if diversity is lost.

In the postcolonial period, culture has once again been invoked in the West as lying behind the poverty of developing countries. The cultures, values, and attitudes of most Asian, African, and Latin American countries, not their colonial legacy or even their underdevelopment by the West, are said to be at the root of their poverty; their traditional cultures are seen as inescapably impeding progress. This is in contrast to Western societies, whose cultural values are said both to inspire and to facilitate progress. In the United States, key proponents of this "cultural factor" include Samuel P. Huntington and Lawrence E. Harrison. The principal contentions advanced in arguing the cultural factor are not new. They formed an integral part of imperialism's theory and practice in Asia and Africa in the nineteenth century. They now mark the resurrection of a theory of development that has a distinct imperial lineage—cultural imperialism. As in the past, this theory avoids embracing history in its formulation and analysis. Perhaps even more crucial, it avoids discussing the origin and management of the current Western-dominated international political economy. There is no serious attempt to analyze how this economy makes it particularly difficult for the majority of Asian and African countries to reap the economic and social benefits of decolonization.

The controversial and emotional question of language has emerged as critical in discussions of decolonization. What should be the language of creativity in African and Asian countries newly liberated from Western imperial rule? Many writers in these societies have agonized over this matter, concerned that the continued use of European tongues in literature, and sometimes as the national language, constitutes "linguistic imperialism." In Africa, the foremost critic of what is called linguistic imperialism is Ngugi wa Thiong'o, Kenya's most eminent writer. In a 1991 interview, Ngugi emphasized the reality that a "very tiny minority, the tip of every nationality, speak French or English or Portuguese."
Since most Africans speak their native languages, an African author who writes in a European language (rather than creating literature in an African language, which would then be translated into other African languages) essentially shuts out the huge majority and instead addresses fellow members of the elite. This is inherently undemocratic and is unlikely to serve as the cornerstone of a national literary tradition. Further, Ngugi holds that "African thought, literary thought, is imprisoned in foreign languages" and that African thinkers and writers, "even at their most radical, even at their most revolutionary are alienated from the majority." To Ngugi and his supporters, the language question "is the key, not the only one, but definitely a very, very important key to the decolonization process."

After Political Independence: The Struggle Continues The attainment of political independence by Asian and African countries left several questions unresolved. There was the question of ideology in the postcolonial period. In many countries, successful nationalist movements were essentially coalition parties, representing several ideological positions and tendencies. In Indonesia, President Sukarno (1901-1970) argued that the nationalist movement must be inclusive, and hence he saw "nothing to prevent Nationalists from working together with Moslems and Marxists." This expedient inclusiveness began to unravel in the postcolonial period. On this matter, it is vital to remember that these ideological questions were debated against the backdrop of the Cold War, which had an indelible impact not only on the texture of decolonization the imperial powers were willing to entertain but also on internal postcolonial ideological tensions. Closely related is that decolonization did not lead to economic freedom or even sustained economic growth and development in most Asian and African countries. What happened? Asian and African countries rarely, if ever, inherited vibrant, varied, and integrated national economies. What they inherited, says Basil Davidson (1974), was a colonial economic system that for centuries had "developed little save the raw materials needed in the Atlantic world." In Africa, the imperial powers both before and after independence imposed "an institutionalized relationship between Africans and Europeans," which facilitated the exploitation of Africans and their resources. As in Asian countries, this system has proved to be very difficult to change. The consequences have been economic stagnation, often regression, and widespread poverty. These economic problems have seriously compromised the essence of the political freedom won after so much sacrifice and determination. Yet in many of these countries, as in Latin America, the ruling elite lead opulent lifestyles amid grinding ANTICOLONIALISM 82 New Dictionary of the History of Ideas 69544_DHI_A_001-194.qxd 10/12/04 3:56 PM Page 82 and widespread poverty. This cannot be taken as an indicator of economic development and social progress; it is the product of corruption, patronage, and oppression. Only "true decolonization," according to Fanon, could prevent the rise of an African national bourgeois eager to strike a self-serving compromise with Western imperialism. The critical point to remember is that in Africa, as in Asia, the inherited "economies remained externally oriented" and did not "provide the basis for a strong national economy." The essence of decolonization has also been frequently compromised by the demands and expectations of foreign aid. For Asian and African countries, such aid has been an almost permanent feature of postcolonial history, but has not led to economic independence and progress; quite often, poverty and economic stagnation have persisted. Since the 1950s, aid from the West to African, Asian, and other developing countries has been guided by shifting political and economic paradigms. These have included import substitution, population control, and expansion of exports. "Structural adjustment" in support of globalization is the paradigm that currently guides the dispensation of Western aid. Still, poverty and poor economic performance have persisted in most Asian and African countries. 
The formulation and implementation of these development fashions clearly indicate that a conscious effort has been made to sidestep tackling the fundamental and exploitative relationship between imperial powers and former colonies. The result, as described by Mahbub ul-Haq, is that a "poverty curtain" now exists, dividing the world into "two unequal humanities—one embarrassingly rich and the other desperately poor." The struggle also continues in cultural affairs—a struggle over respect for Asian and African cultures. Culture is intricately linked to dignity and identity. Dignity, and with it cultural pride, are especially important for a people whose past has been dominated by alien rule and culture, and colonialism was hostile to the vibrant growth and assertion of local culture. This hostility was clearly evident in the propagation of racial stereotypes demeaning to Asians and Africans, characterizations and beliefs that are, sadly, not yet dead. A matter of critical importance to the people in former colonies is the survival of their cultures in the age of globalization, which has facilitated the rapidly expanding marketing of Western entertainment. Films, music, and general attitudes toward lifestyle promote a sort of global homogenization that is Western-derived and -controlled. This has not stimulated the survival or growth of local cultures and values. World culture is thought to be threatened if diversity is lost. In the postcolonial period, culture has once again been invoked in the West as lying behind the poverty of developing countries. The cultures, values, and attitudes of most Asian, African, and Latin American countries, not their colonial legacy or even their underdevelopment by the West, are said to be at the root of their poverty; their traditional cultures are seen as inescapably impeding progress. Western societies, by contrast, are said to possess cultural values that inspire and facilitate progress. In the United States, key proponents of this "cultural factor" include Samuel P. Huntington and Lawrence E. Harrison. The principal contentions advanced in arguing the cultural factor are not new. They formed an integral part of imperialism's theory and practice in Asia and Africa in the nineteenth century. They now mark the resurrection of a theory of development that has a distinct imperial lineage—cultural imperialism. As in the past, this theory avoids engaging with history in its formulation and analysis. Perhaps even more crucially, it avoids discussing the origin and management of the current Western-dominated international political economy. No serious attempt is made to analyze how this economy makes it particularly difficult for the majority of Asian and African countries to reap the economic and social benefits of decolonization. The controversial and emotional question of language has emerged as critical in discussions of decolonization. What should be the language of creativity in African and Asian countries newly liberated from Western imperial rule? Many writers in these societies have agonized over this matter, concerned that the continued use of European tongues in literature and sometimes as the national language constitutes "linguistic imperialism." In Africa, the foremost critic of what is called linguistic imperialism is Ngugi wa Thiong'o, Kenya's most eminent writer. In a 1991 interview, Ngugi emphasized the reality that a "very tiny minority, the tip of every nationality, speak French or English or Portuguese."
Since most Africans speak their native languages, an African author who writes in a European language (rather than creating literature in an African language, which could then be translated into other African languages) shuts out the huge majority and instead addresses fellow members of the elite. This is inherently undemocratic and is unlikely to serve as the cornerstone of a national literary tradition. Further, Ngugi holds that "African thought, literary thought, is imprisoned in foreign languages" and that African thinkers and writers, "even at their most radical, even at their most revolutionary are alienated from the majority." To Ngugi and his supporters, the language question "is the key, not the only one, but definitely a very, very important key to the decolonization process."

Aims and Objectives Political freedom from colonial rule was viewed by nationalists and their supporters as the instrument to redress the economic and social neglect and injustices of the colonial era. Kwame Nkrumah of Ghana urged fellow nationalists throughout Africa to "seek first the political kingdom and all else would be added unto" them. There could be no meaningful social and economic progress without political independence. Nationalist activists and their followers expected an improvement in their living conditions. This desire and expectation to live in dignity partly explains the political support given by the masses to the nationalists and nationalism. In India, Jawaharlal Nehru (1889-1964) argued emphatically that without political freedom, Indians would have no power to shape their destiny. They would remain "hopeless victims of external forces" that oppressed and exploited them. In semicolonial China, Mao Zedong (1893-1976) also saw the critical value of national political freedom from external domination. Without this freedom, there could be no advancement of his revolutionary social and economic program. It would have been futile to dream of building a communist society without a nation in which to construct it. Nationalism identified exploitation as the primary economic mission of colonialism—exploitation of the colonized people and their labor and resources. In Africa, the expansion of cash crop production, land alienation, mineral exploitation, and even limited manufacturing toward the end of colonial rule enriched the Western capitalist countries, not Africans. In the struggle for decolonization, there was a general outcry by the nationalists against this exploitation. The colonial economic record in Asia, and especially in India, provided nationalists with rich evidence of exploitation. There was the familiar example of British investment in railways that remained quite profitable to the investors but did little for Indian economic advancement. Colonialism had deemphasized the production of food crops while actively promoting the growing of cotton, jute, indigo, and opium, which fetched high prices overseas. The profits that accrued from these exports and other British capital were, to a large extent, invested in "white settler countries," such as Canada, Australia, and New Zealand, and not in India. In addition, European powers did not encourage or support the industrialization of their colonies. In India, this aggravated economic and social problems on the eve of decolonization. Socially and culturally, colonialism was a racist system. The era of "modern" nineteenth-century imperialism was also the era of scientific racism. Colonialism, mediated through racism and racist policies, limited and even forbade meaningful cross-cultural dialogue between colonizer and colonized. Throughout most of Africa and Asia, racism was "not an incidental detail, but . . . a consubstantial part of colonialism." Harsh, brutal, and deliberately discriminatory treatment at the hands of European colonizers was the constant, painful reminder to Africans and Asians that they were a colonized and humiliated people. All nationalists, irrespective of ideological differences, were generally agreed that such treatment was indefensible and had to be ended.

First Half of the Twentieth Century The closing decades of the nineteenth century, as well as seeing a new interest in "altruism" as an economic and political doctrine, witnessed an accelerated professionalization of intellectual discussions of the subject. Whereas writers like Lewes, Eliot, Mill, and Spencer had pursued their intellectual projects outside the universities (they were, to use Collini's phrase, "public moralists"), it was increasingly the case by the turn of the twentieth century that rigorous academic discussions of moral philosophy, economics, psychology, and sociology were conducted by university-based experts. The resultant discussions were thus both more detached from public political life and more fragmented. In the first half of the twentieth century the influence of the ethos of logical positivism meant that those working in the human and social sciences were inclined to avoid or even to deny the meaningfulness of questions with ethical and religious overtones. G. E. Moore claimed (in his 1903 work Principia ethica) that any system of ethics that tried to draw moral conclusions on the basis of a scientific account of human nature and society (as the systems of both Comte and Spencer had done) committed the "naturalistic fallacy." (See Maienschein and Ruse's collection of essays investigating the possibility of founding ethics on biology.) Finally, the success of the neo-Darwinian synthesis in biology and the rejection of the doctrine of the inheritance of acquired characteristics seemed to undermine earlier theories of the gradual evolution of greater altruism. All that was left was a starkly amoral vision of nature as the domain of competition and natural selection. All of these factors meant that even though philosophers, sociologists, and economists continued to discuss concepts of altruism, the first fifty or sixty years of the twentieth century saw a reduction of academic interest in the subject.

Moore A good place to mark the start of analytical philosophy is therefore with the young G. E. Moore's (1873-1958) emphatic denunciation of this idealist philosophy. Moore rejected internal relations and organic wholes, and in their place he gave priority to individual judgments, or propositions, and their constituent concepts. Since he held that true propositions are real structures that do not represent facts but constitute them, it follows that an analysis of a proposition into its constituent concepts is equally an analysis of a fact into its elements: as he put it, "A thing becomes intelligible first when it is analysed into its constituent concepts" (1899; 1993, p. 8). Thus in Moore's early work a method of conceptual analysis is employed to identify the basic properties of things. This is manifest in Moore's Principia Ethica (1903), where Moore famously argues that goodness is the basic ethical property and thus that ethical theory is the theory of the good. It should be observed, however, that Moore's method of analysis does not specify the content of his theory of the good, even though this too is supposed to be a priori. Moore's method of metaethical analysis is therefore combined with an appeal to intuitive reflection concerning synthetic a priori ethical truths; and one of the issues that has remained a matter of debate is just what contribution conceptual analysis has to offer ethical theory.

AGNOSTICISM. The heyday of agnosticism was in Victorian Britain between the 1860s and the 1890s. Its leading exponents were Herbert Spencer (1820-1903), Thomas Henry Huxley (1825-1895) (who coined the term), Leslie Stephen (1832-1904), John Tyndall (1820-1893), and William Kingdon Clifford (1845-1879). This group all shared a disillusionment with orthodox Christianity; an opposition to the dominance of British science and education by the Anglican establishment; belief in the theory of evolution and in the importance of science more broadly; and an aspiration to replace dogmatism and superstition with a freethinking, scientific, and ethical religion (see Lightman, 1987, 1989, 2002; Pyle; Turner, 1974, 1993). While agnosticism may have been an antitheological and secularist movement, it was certainly not antireligious. The Victorian agnostics were intensely moralistic people who had a deep sense of the spiritual, especially as evoked by the wonders of the natural world.

Anarchist Principles in Context The ideas associated with modern anarchism can be traced to the period of the French Revolution (1789-1799), although they did not crystallize into a formal political doctrine until the middle part of the nineteenth century. The first book that offered the clearest intimation of the anarchist conception of society was William Godwin's An Enquiry concerning Political Justice and Its Influence upon General Virtue and Happiness (London, 1793). In this, Godwin identifies the state as the main source of all social ills. In order for humans to live freely and harmoniously, Godwin advocates the establishment of a stateless society in which individuals are no longer subject to the economic exploitation of others. Despite its antistatist message, the ideas found in Godwin's magnum opus belong to a tradition of British political radicalism that cannot be classified as anarchist. In fact his work had its greatest influence on the liberal thinkers of his age as well as on Robert Owen, John Gray, and other early socialist reformers. Of far greater significance to the development of modern anarchist ideology is the French social philosopher Pierre-Joseph Proudhon, whose indictment of private property under capitalism was made famous in his book What is Property? (1840). Proudhon's main contributions to the anarchist view of society lay in his theories of mutualism and federalism. In the former he argued that the exploitative capitalist system could be undermined by creating economic organizations such as the People's Bank, an institution of mutual credit meant to restore the equilibrium between what individuals produce and what they consume. Because he believed that concentrating political power in the hands of the state militated against the economic forms he was proposing, Proudhon argued for a society in which power radiated from the bottom upward and was distributed along federal or regional lines. Though Proudhon himself never belonged to any party or political organization, his writings inspired a substantial following among freethinkers, liberal intellectuals, and workers across Europe, particularly in France and Spain. One of his most famous disciples was the Russian anarchist Mikhail Bakunin (1814-1876). Like Proudhon, Bakunin was an eclectic thinker who was constantly revising and reformulating his views on society. More so than Proudhon, who did not believe that the transition to an anarchist society demanded violent and sweeping changes, Bakunin gave both physical and ideological expression to the view that revolutionary upheaval was a necessary and unavoidable process of social development, a view summed up in his oft-quoted dictum, "The urge to destroy is also a creative urge." At the core of his creed was collectivism, by which he meant that the land and means of production should be collectively owned and that future society should be organized around voluntary associations—such as peasant communes or trade unions—that were not regulated or controlled by any form of government. Too impatient to set forth a systematic exegesis of his antiauthoritarian beliefs, Bakunin tended to express his concepts in tracts that could be used by revolutionary bodies (for example, the Alliance of Social Democracy) with which he was associated. Indeed Bakunin's most enduring legacy to anarchism resides in his conception of revolutionary transformation.
According to him, the dispossessed and most repressed elements of society—particularly the working classes and peasantry in economically backward countries such as Spain, Italy, and Russia—were by nature revolutionary and only needed proper leadership to rise up against their oppressors. Because he adamantly rejected the Marxian notion that conquering political power was a precondition for overthrowing capitalism, Bakunin was convinced that the exploited masses should be led into revolt by a small and dedicated group of individuals who were acting in their interests. It was his belief that revolution could not be achieved until the state was completely abolished, which brought him into conflict with Karl Marx and his followers, who insisted that a "dictatorship of the proletariat" was a necessary phase in the transition to a stateless society (communism). Bakunin's antipolitical conception of revolutionary change as well as his forceful repudiation of the authoritarian communist principles embodied in the Marxism of his day drove a wedge between his adherents in the First International (1864-1876) and those of Karl Marx, thus establishing a divide in the European socialist movement that would never be bridged. However, not all anarchists were hostile toward the idea of communism. Another Russian aristocrat turned revolutionary, Peter Kropotkin, developed over the course of his lifetime a sociological theory of anarchism that combined the antiauthoritarian beliefs espoused by his predecessors in the anarchist movement with those of communism. Unlike Proudhon and Bakunin, both of whom believed in the right of individual possession of products produced from one's labor, Kropotkin advocated an economic system based on the communal ownership of all goods and services that operated on the socialist principle "from each according to his abilities, to each according to his needs." By distributing society's wealth and resources in this way, Kropotkin and other anarchist communists believed that everyone, including those who were unproductive, would be able to live in relative abundance. Kropotkin's greatest contributions to anarchist theory, however, were his attempts to present anarchism as an empirically verifiable worldview, one that was based on "a mechanical explanation of world phenomena, embracing the whole of nature." Following in the positivist tradition laid down by Auguste Comte (1798-1857), Herbert Spencer (1820-1903), and other forerunners of modern social science, Kropotkin believed that the study of society was analogous to that of the world of nature. In Modern Science and Anarchism (1912), for example, Kropotkin contends that the anarchist method of sociological enquiry is that of the exact natural sciences. "Its aim," he says, "is to construct a synthetic philosophy comprehending in one generalization all the phenomena of nature—and therefore also the life of societies." In developing his views on human nature, Kropotkin went further in extending the analogy between society and the natural world. Like the Social Darwinists of his era, Kropotkin maintained that all human behavior was a reflection of our biological condition.
But while most Social Darwinists argued that a "tooth and nail" impulse in the struggle for existence was the dominant natural law governing the evolution of human behavior, Kropotkin insisted that the instinct of cooperation was an even more important factor in this process. According to him, the species in nature that shows the greatest tendency toward mutual aid, not cutthroat competition, is the one most likely to survive and flourish over time. By arguing in this way, Kropotkin was attempting to demonstrate that anarchism was a highly evolved state of human nature, but one that could not be attained until the state and other coercive institutions were completely abolished.

European Developments to the Seventeenth Century Common algebra came into an awakening Europe during the thirteenth century. Among the various sources involved, Latin translations of some Arab authors were important. A significant homegrown source was the Italian Leonardo Fibonacci, who rendered the theory into Latin, with res, census, and cubus denoting the unknown and its powers. He and some translators of Arabic texts also adopted the Indian system of numerals. Communities developed, initially of Italian abbacists and later of German Rechenmeister, practicing arithmetic and common algebra with applications—some for a living. The title of al-Khwarizmi's book included the word al-jabr, which named the operation of adding terms to each side of an equation when necessary so that all of them were positive. Perhaps following his successor Thabit ibn Qurra (836-901), in the sixteenth century Europeans took this word to refer to the entire subject. Its theoretical side principally tackled properties of polynomial equations, especially finding their roots. An early authority was Girolamo Cardano (1501-1576), with his Ars magna (1545); successors include François Viète (1540-1601) with In artem analyticem isagoge (1591), who applied algebra to both geometry and trigonometry. The Europeans gradually replaced the words for unknowns, their powers, means of combination (including taking roots), and relationships by symbols, either letters of the alphabet or special signs. Apart from the finally chosen symbols for the arithmetic numerals, no system became definitive. In his book Algebra (1572), Rafael Bombelli (1526-1572) gave an extensive treatment of the theory of equations as then known, and puzzled over the mystery that the formula for the (positive) roots of a cubic equation with real coefficients could pass through complex numbers even when the root it determined was real. His celebrated example was the equation x³ = 15x + 4, for which the formula yields x = ∛(2 + 11√−1) + ∛(2 − 11√−1), a sum of complex cube roots that nevertheless equals the real root 4 (a worked reconstruction in modern notation follows below). The formula involved had been found early in the century by Scipione del Ferro (1465-1526), and controversially published later by Cardano. It could be adapted to solve the quartic equation, but no formula was found for the quintic.

Developments with Equations from Descartes to Abel René Descartes's (1596-1650) Géométrie (1637) was an important publication in the history of algebra. While its title shows his main concern, in it he introduced analytic geometry, representing constants and also variable geometric magnitudes by letters. He even found an algebraic means of determining the normal to a curve. Both this method and the representation of variables were to help in the creation of the calculus by Isaac Newton (1642-1727) in the 1660s and Gottfried Wilhelm Leibniz (1646-1716) a decade later. During the seventeenth century algebra came to be a staple part of mathematics, with textbooks beginning to be published. The binomial theorem was studied, with Newton extending it to non-integral exponents; and functions were given algebraic expression, including as power series. Algebraic number theory developed further, especially with Pierre de Fermat (1601-1665). Negative and complex numbers found friends, including Newton and Leonhard Euler (1707-1783); but some anxiety continued, especially in Britain. The theory of polynomial equations and their roots remained prominent.
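To make Bombelli's puzzle concrete, here is a worked reconstruction in modern notation; the symbolism, and the use of i = √−1, are of course anachronisms rather than Bombelli's own presentation.

```latex
% del Ferro-Cardano formula for the depressed cubic x^3 = px + q:
%   x = cbrt( q/2 + sqrt((q/2)^2 - (p/3)^3) ) + cbrt( q/2 - sqrt((q/2)^2 - (p/3)^3) )
\[
x^{3} = 15x + 4, \qquad p = 15, \quad q = 4,
\]
\[
\left(\tfrac{q}{2}\right)^{2} - \left(\tfrac{p}{3}\right)^{3} = 4 - 125 = -121,
\qquad \sqrt{-121} = 11i,
\]
\[
x = \sqrt[3]{\,2 + 11i\,} + \sqrt[3]{\,2 - 11i\,} = (2 + i) + (2 - i) = 4,
\]
% since (2 + i)^3 = 8 + 12i - 6 - i = 2 + 11i, and likewise (2 - i)^3 = 2 - 11i.
```

The imaginary parts cancel, leaving the real root x = 4, which indeed satisfies 4³ = 15·4 + 4. This is exactly the mystery the text describes: the formula forces a detour through complex numbers to reach an answer that is real throughout.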
In particular, in Descartes's time "the fundamental theorem of algebra" (a later name) was recognized though not proved: that for any positive integer n a polynomial equation of degree n has n roots, real and/or complex (a modern statement is given below). The Italian mathematician J. L. Lagrange (1736-1813) and others tried to prove it during the eighteenth century, but the real breakthrough came in 1799 from the young C. F. Gauss (1777-1855), who went on to produce three more difficult and not always rigorous proofs in 1816 and 1850. He and others also interpreted complex numbers geometrically instead of algebraically, a reading that gradually became popular. Another major question concerning equations was finding the roots of a quintic: Lagrange tried various procedures, some elaborated by his Italian compatriot Paolo Ruffini (1765-1822). The suspicion developed that there was no algebraic formula for the roots; the young Norwegian Niels Henrik Abel (1802-1829) showed its correctness in 1826 with a proof that was independent of Lagrange's procedures. Lagrange was the leading algebraist of the time: from the 1770s he not only worked on problems in algebra but also tried to algebraize other branches of mathematics. He based the calculus upon an infinite power series (the Taylor series); however, his assumption was to be refuted by Augustin-Louis Cauchy (1789-1857) and W. R. Hamilton (1805-1865). He also grounded mechanics upon principles such as that of "least action" because they could be formulated exclusively in algebraic terms: while much mechanics was thereby encompassed, Newtonian and energy mechanics proved more pliable in many contexts.
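In its modern form, which is sharper than the informal recognition available in Descartes's time, the theorem asserts that complex roots always exist in full number. A compact rendering:

```latex
% Fundamental theorem of algebra, modern statement:
% every complex polynomial of degree n >= 1 splits completely over C.
\[
p(z) = a_{n} z^{n} + a_{n-1} z^{n-1} + \cdots + a_{0},
\qquad a_{n} \neq 0, \quad n \geq 1,
\]
\[
\Longrightarrow \quad
p(z) = a_{n} \prod_{k=1}^{n} \left( z - r_{k} \right),
\qquad r_{1}, \dots, r_{n} \in \mathbb{C}.
\]
% The roots r_k need not be distinct; counted with multiplicity
% there are exactly n of them.
```

Counting the roots with multiplicity is what gives precise sense to the older phrase that an equation of degree n "has n roots, real and/or complex."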

Independence The Latin American movement most closely associated with anticolonialism corresponds to the period at the beginning of the nineteenth century during which most of the region gained its political independence from European colonial powers. This "postcolonial era began before many territories became colonial," Robert Young notes, and "before some European imperial powers, such as Germany and Italy, had even become nations themselves" (p. 193). As in the United States, independence represented a shift of economic wealth and political power from a colonial elite to a domestic elite. In Latin America, this was expressed as a struggle between peninsulares (those born on the Iberian peninsula, i.e., Spain and Portugal) and creoles (those born in the New World). Independence did not result in any corresponding shift in social relations, nor did it result in the abolition of slavery or more rights for women. In fact, without the paternalistic protection of the European crowns, the position of peasants and Indians actually worsened. The 1780 Tupac Amaru uprising in the South American Andes was one of the largest, earliest, and most significant anticolonial movements in the history of Latin America. The leader of this uprising, José Gabriel Condorcanqui (d. 1781), a descendant of the Incas, first attempted to petition for the rights of his people through legal channels. When legal attempts failed, he took the name of the last Inca ruler (Tupac Amaru) and led an uprising that quickly spread throughout the southern Andes. The insurgents sacked Spanish haciendas and obrajes (textile mills), driven by messianic dreams of a renewed Inca empire that would free the indigenous peoples from hunger, injustice, oppression, and exploitation. The Spanish captured Tupac Amaru and other leaders of the uprising six months later and executed them in Cuzco, the former capital of the Inca empire. This did not end the rebellion but shifted its focus south to Bolivia, where under the leadership of Aymara people it entered a more radical, violent, and explicitly anticolonial phase. In this phase, the insurgents captured and held the city of La Paz for several months and threatened the silver mines at Potosí—a direct challenge to Spanish wealth and power. The Spanish finally captured and executed the leaders and the uprising eventually collapsed. This revolt has sometimes been seen as a forward-looking antecedent to the successful creole independence movements that came forty years later and sometimes as a reactionary messianic movement that sought to return to the time of the Inca empire. Sinclair Thomson positions these uprisings in the context of local struggles against abusive colonial practices and for self-determination and equality. Although the uprising ultimately failed, it reveals a widening gap between the colonial elites and the subaltern masses, as well as a refusal of indigenous peoples to passively accept their marginalized role in society. The Haitian slave revolt provides another stark contrast to the creole independence movements and in essence underscores the lack of a compelling anticolonial discourse in those events. Haiti was a French colony, and its production of sugar, cotton, and indigo made it one of the most important colonies in the world.
Soaring sugar profits for French planters in the eighteenth century led to a dramatic increase in the number of African slaves they imported to work the plantations. By the end of the century, about 80 percent of the Haitian people were overworked and underfed slaves. Nevertheless, Haitian independence movements began in 1789 not as a slave revolt but from the small elite class of planters, who had been influenced by the French Revolution's rhetoric of "liberty, equality, fraternity." For the planters, liberty meant home rule and freedom from French tariff structures. The whites armed the slaves to fight the French, but instead, under the leadership of Toussaint L'Ouverture (1743-1803), the slaves took advantage of the opportunity to revolt and destroyed the old society. The result was perhaps one of the few true social revolutions the world has ever seen, in which a mass movement completely obliterated the ancien régime and claimed power for itself. By the time Jean-Jacques Dessalines declared Haitian independence in 1804, the sugar economy had disappeared, having been displaced by subsistence agriculture. The example of a black slave republic sent a terrifying chill through creole elites, which had begun to agitate for independence elsewhere in Latin America. The only other independent country in the hemisphere, the United States, refused to recognize the Haitian government. The dangers exemplified by the first successful anticolonial movement in Latin America put the brakes on other independence movements.

Nineteenth-century views of the United States. Nineteenth-century views of the United States were seen through the lens of the French Revolution. After the French Revolution devolved into terror, anarchy, and despotism, no major thinker ever again unqualifiedly praised the American Revolution. This is peculiar. Thinkers might have concluded that the French got it wrong and the Americans got it right, and so praised the American Revolution and intensified its study. Instead, they let the horrors of the French Revolution color their understanding of the American. This shows once again how the perceptions of America were based more on European dynamics than on the reality of America itself. Despite the failure of the French Revolution, the existence of the United States, coupled with the Enlightenment belief in progress, led to a general feeling that the United States was the future. If the French proved that the path to the future was not simple and smooth, the perception of what the future was to be like, as embodied in the United States, was also ambivalent. Interest in the United States was heightened because everyone had a stake in the future, which the United States seemed to represent. In the aftermath of the French Revolution, criticism of the United States arose. The substance of this criticism was similar across the ideological spectrum of the nineteenth century and is familiar to anyone aware of contemporary critiques of the United States. What America had become and what critics thought Europe would become—democratic—was regarded as a mixed blessing. The greatest representative of this ambivalence is Alexis de Tocqueville (1805-1859), the great French thinker and statesman. According to Tocqueville, democratic government is inefficient, meandering, and petty. But it has its advantages. It gets more done by energizing the people to do things themselves: "it does that which the most skillful government often cannot do: it spreads throughout the body social a restless activity, superabundant force, and energy never found elsewhere, which, however little favored by circumstance, can do wonders. Those are its true advantages" (Democracy in America). Democracy is not conducive, however, to refinement, elevated manners, poetry, glory, or heroic virtues. All of the main political theorists of the nineteenth century agreed with this ambivalent assessment of America—and of the budding liberalism of Europe. America was seen as epitomizing the self-interested individualism of the new commercial society and as representing the centralization of power by the new middle-class regime. As such, four criticisms were repeatedly leveled at it. First, America was said to embody the disorder caused by collapsing institutions. The authority of all previous standards—experience, age, birth, genius, talent, and virtue—was undercut in America. Second, America represented a growing obsession with money. It was because of this that all other standards of human value were ignored. Third, America represented unchecked equality. The new type of man preferred equality to liberty, as Tocqueville and John Stuart Mill (1806-1873) warned. Finally, the new form of government represented the power of the majority, the "tyranny of the majority" in Tocqueville's famous phrase. This stifled creativity and individuality.
It guaranteed that society would be geared to the mediocre middle at the expense of individual refinement, the cultivation of culture, and the emergence of spiritual sublimity and greatness. These are essentially the same charges leveled against the United States in the late twentieth and early twenty-first centuries by traditional authorities in Africa, Asia, and the Middle East, by educated elites in Europe and elsewhere, and by antimodern radicals such as the Ayatollah Khomeini, Hizbollah, and Al Qaeda.

Role of Cosmology The doctrinal aspects of alchemy are the main focus of many sources dating from the Tang period (seventh to tenth centuries) onward. These sources formulate their teachings and practices by borrowing the language and the abstract emblems of correlative cosmology, a comprehensive system designed to explicate the nature and properties of different domains—primarily the cosmos and the human being—and the relations that occur among them. The main work that reflects these changes and provides them with textual authority is the Zhouyi cantong qi (Token of the Agreement of the Three According to the Book of Changes; the "three" mentioned in the title are, according to some commentaries, Daoism, cosmology, and alchemy). Virtually the entire alchemical tradition from the Tang period onward acknowledges this text as its most important scriptural source. Despite this, the Cantong qi does not primarily deal with either waidan or neidan and only occasionally alludes to both of them. Its main purpose is to illustrate the nonduality of the Dao and the cosmos; the task of explicating the details of this doctrinal view, and of applying it to waidan and neidan, is left to the commentaries and to a large number of related texts. The emblems of correlative cosmology—typically arranged in patterns that include Yin and Yang, the five agents (wuxing), the eight trigrams and the sixty-four hexagrams of the Book of Changes (Yijing), and so forth—play two main roles closely related to each other. First, they illustrate the relation between unity, duality, and the various other stages of the propagation of Original Pneuma (yuanqi) into the "ten thousand things." In this function, cosmological emblems serve to show how space, time, multiplicity, and change are related to the spacelessness, timelessness, nonduality, and constancy of the Dao. For instance, the Cantong qi describes the five agents (which define, in particular, the main spatial and temporal coordinates of the cosmos) as unfolding from the center, which contains them all, runs through them, and "endows them with its efficacy." In their second role, the emblems of the cosmological system are used to formulate the relation of the alchemical practice to the doctrinal principles. For instance, the trigrams of the Book of Changes illustrate how the alchemical process consists in extracting the pre-cosmic Real Yin (zhenyin) and Real Yang (zhenyang) from Yang and Yin as they appear in the cosmos, respectively, and in joining them to produce an elixir that represents their original oneness (Pregadio, 2000, pp. 182-184). In the traditions based on the Cantong qi, alchemy is primarily a figurative language to represent doctrinal principles. The waidan process loses its ritual features, and the compounding of the elixir is based on two emblematic metals, mercury and lead. The refined states of these metals—respectively obtained from cinnabar and from native lead—represent Yin and Yang in their original, pre-cosmic state, and their conjunction produces an elixir whose properties are said to be equivalent to Pure Yang. The central role played by cosmology in these waidan traditions is reflected in two works related to the Cantong qi, which respectively state that "compounding the Great Elixir is not a matter of ingredients, but always of the Five Agents," and even that "you do not use ingredients, you use the Five Agents."

The Development of Nationalism The development of nationalism was not uniform throughout the countries of Asia and Africa. The majority of nationalist movements, however, roughly followed a distinct pattern. First, there were local protest movements; some were culturally based, while others were created by the local elite to protest against local and specific grievances, usually economic. Second, there was the crucial period of mass nationalism. In most countries of Asia and Africa, this occurred after World War II. In Africa, this rough pattern applies to all the countries that attained their political independence before 1975. The same is true in Asia, except for those countries that secured their freedom after protracted armed struggle led by communist parties. This constitutes what can be characterized as the first and dominant phase of African and Asian nationalism. The second phase applies in Africa to the former Portuguese colonies of Guinea-Bissau, Angola, and Mozambique, in addition to Zimbabwe, South Africa, and Namibia. In Asia, it applies to China and Vietnam, even though their wars of liberation had achieved key victories before 1975. In almost every case, political independence in the second phase was achieved after protracted guerrilla warfare. Liberation movements in China, Vietnam, and the former Portuguese colonies in Africa became more precise in their definition of national liberation and societal development. Further, they adopted a Marxist ideology as a guide to their struggle, much more deliberately and consistently than had the majority of the nationalist movements in the first phase. Although the nationalist movements in the second phase did not undertake to pursue identical policies, they nonetheless embarked on a far more detailed socioeconomic analysis of colonial and imperial domination. In Africa, these Marxist-leaning movements were influenced by the revolutionary thought of Frantz Fanon (1925-1961). In his most famous book, The Wretched of the Earth (Les damnés de la terre, 1961; Eng. trans., 1963), Fanon argued passionately that true decolonization must be the product of violence. Colonialism, "created and maintained by violence," could be truly uprooted only through "mass participation in violent decolonization." This violence was to be the product of an organized revolutionary movement composed of urban intellectuals and peasants. Fanon also argued that revolutionary violence against the colonialists was "a cleansing force" for individual participants: "it frees the native from his inferiority complex and his despair and inaction." The meaning and implications of this observation continue to generate ideological and intellectual controversy. Still, it should be said that Fanon's formulation was not a "lyrical celebration of violence" (see L. Adele Jinadu) but rather a strategy to be employed by the colonized in pursuit of true liberation. After World War II, European powers were not eager to liquidate their empires in Asia and Africa. Although drastically weakened by the war, neither Britain nor France (the major imperial powers) seemed anxious to grant political independence to their colonies. Indeed, the dominant postwar policies seemed to favor reassertion of imperial authority. To this end, France fought costly wars in Vietnam and Algeria as it tried unsuccessfully to suppress nationalism.
Britain fought against a determined anticolonial movement in Malaya, as well as against the Mau Mau nationalist peasant revolt in Kenya. Even in those colonies where there was little or no armed resistance, European powers were quick to employ brutal force to try to stem the tide of nationalism. Colonial administrators routinely dismissed the legitimacy of Asian and African nationalism. Any stirring of nationalism was seen as an alien, artificial, almost inappropriate creation, imposed by Asian or African elites on unwilling or otherwise ignorant masses. This was a disastrously mistaken belief: it would have been impossible for nationalist activists to achieve any success without the sustained and spirited support of those masses. The ideological strife of the Cold War, however, led many colonial administrators to mistakenly view nationalist agitation as the unfolding of a global communist conspiracy. In the end, imperial maneuvers, brutal force, and concessions failed to derail the drive toward decolonization. In the political climate of the postwar years, imperial powers were forced to see that there could be no compromise between direct imperial domination and the basic, nonnegotiable demands of nationalism. On balance, it is fair to conclude that internal factors in each colony proved the chief determinant in the nature of the struggle for, and attainment of, political independence. Nationalist and revolutionary movements were essentially local in inspiration and objectives. These movements were not "exportable commodities," said Amilcar Cabral, but rather were "determined and conditioned by the historical reality of each people." This underscores the vitality and integrity of Asian and African nationalism and amply demonstrates (in Geoffrey Barraclough's words) that "the will, the courage, the determination, and the deep human motivation" that propelled it forward "owed little, if anything, to Western example."

Autonomous History and the Idea of Anticolonialism In the early 1960s, shifts within Southeast Asian studies began promoting research that sought an alternative approach to the ways in which Southeast Asian culture and history had been conceptualized by earlier scholars. Following the call of John Smail to produce histories of Southeast Asia that were not bound to European narratives, chronologies, and categories of analysis, scholars began directing their attention to writing about and studying what they perceived as indigenous history, which had finally attained its "autonomy" from the priorities and perspective of European-centered history. This trend affected the way in which anticolonialism came to be understood, in that Southeast Asian conceptions of resistance and protest were now being studied for what they revealed about the region's cultural heritage and conceptions of the world. Where scholars might once have considered how revolts inspired by Islamic, Buddhist, or Christian ideas operated under the rubric of nationalism, emphasis was now directed toward understanding how these mentalities revealed something about the very nature of Southeast Asian culture. This new direction in thinking led scholars to write some of the most important works about anticolonialism and Southeast Asian culture. For example, Reynaldo C. Ileto, author of the seminal work Pasyon and Revolution, studied the ways in which Filipino-Catholic conceptions of rebellion were articulated through the imagery, scenes, and narratives associated with the Passion story of Christ. His work inspired a new interest in millenarianism, or the idea of the coming millennium (or end of the world/cycle), and its relation to religious anticolonial movements. Historians such as Emanuel Sarkisyanz demonstrated how Buddhist conceptions about the end of the world framed the way the Burmese made sense of the rapid social and economic changes occurring around them and how the notion of a future Buddha was associated with leaders promising a return to precolonial social norms. Michael Adas took this paradigm and extended it comparatively within the region and beyond, showing in his Prophets of Rebellion that anticolonial movements were forged by the charismatic leadership of men who used religious notions of the millennium in order to gain popular support among the peasantry. Most importantly, these studies and many others began using the idea of anticolonialism in order to flesh out what were perceived as indigenous conceptions of the Southeast Asian world.

Categories and Features of Anticolonialism In order to make sense of the variety of ways in which Southeast Asians responded to colonialism, expressions of protest and resistance might be approached under three general categories: traditional, synthesis, and radical movements. Although the terminology is problematic, traditional movements represent those initial "knee-jerk" reactions to the immediate military and pacification operations of the colonial powers that preceded the establishment of administrative governments. These movements were generally led by elites of the traditional order, using the vocabulary and symbols of leadership that their followers associated with precolonial authority. Seeking to resurrect the institutions and social networks that had been dismantled by the encroaching Europeans, ex-princes, ministers, and priests (or monks) rallied their immediate followers to resist colonial encroachment at locations of significant religious, political, and cultural importance. Because these movements were based on patron-client, village, and locally defined networks of relations, these outbreaks of resistance were limited in scale. Such responses were found throughout the region, but they remained locally oriented and were unsuccessful in realizing the return of precolonial sociopolitical orders. The second category of anticolonialism, which includes those expressions that exemplify a synthesis of indigenous and European ideals, refers generally to the types of programs championed by educated indigenous elites who wanted to initiate change and reform through the colonial system, using the vocabulary and procedures adopted from European education. These forms of protest were undertaken after colonial administrative and social institutions had already been entrenched in local soil, producing a generation of social reformers who saw the means for change within the apparatus and mechanics of the colonial system but who hoped to localize Western ideals of civil society and individualism through traditional symbols and belief systems. Unlike earlier responses that aimed to return to precolonial orders, these programs sought to initiate social reform within the parameters of colonial law and convention. Many who initiated such reforms were hampered by an inability to connect with rural populations, whose concerns, experiences, and conceptions of the world were quite different from those of their more urbanized, Western-educated counterparts. The third type of anticolonial response, more radical than the earlier "East-West" attempts at synthesis, describes the initiatives of younger, educated urban students and activists who sought complete independence from colonial authorities, using organizational and sometimes ideological blueprints inherited from Europe, Japan, and America. In contrast to the generation of educated elites who hoped to initiate social reform through the system, the leaders of these movements aimed to uproot the colonial powers, using the language of anticolonial nationalism in order to replace the system. Based in cities but able to penetrate the countryside, these movements attempted to bridge the rural-urban gap by making the colonial experience itself the common inspiration for popular movements toward independence.
These three categories of analysis offer a preliminary structure for distinguishing the different types of social and political protest that might be considered "anticolonial," while also taking into account the sociopolitical changes that occurred within Southeast Asian colonial society during the late nineteenth and early twentieth centuries as they affected local populations and communities. While anticolonial sentiment developed along these general lines, differences in the methods and natures of the colonial administrations, and in the periods in which they were implemented, account for the variations and departures from the stages within this scheme. One common feature that binds the scholarly understanding of anticolonialism in the region is that it was mainly directed toward institutions, individuals, and policies that had come to represent the way in which colonial authority threatened or affected the lifestyle, worldviews, or identities of local peoples. Symbols of the colonial state (such as infrastructural edifices, district offices, and administrators) were common targets for anticolonial protest, though local indigenous elites who were deemed collaborators or at least sympathetic to the colonial authorities were often subjected to distrust, scorn, and sometimes violence as well. Attacks on local headmen outnumbered attacks on British officials during the initial outbreak of the Saya San Rebellion in Burma in 1930, as these British-appointed headmen were perceived as acting on behalf of the newly formed British village administration. While rebellions, riots, marches, and boycotts are all illustrative of more obvious forms of resistance, anticolonialism was expressed in a variety of other modes, harnessing local forms of public expression and media to articulate displeasure or disagreement with policies and pressures imposed by the colonial state. The growth of print culture alongside local theater, religious festivals, and other cultural outlets enabled anticolonialism to be articulated in a wide range of forms, much of which contributed to the scholarly understanding of culture, peasants, and nationalism in Southeast Asia. While these contexts represent more recent scholarly approaches to thinking about anticolonialism, the earliest versions of the idea can be found in the writings of colonial scholar-officials.

Resistance to Colonialism Twentieth-century resistance to colonialism inevitably partook of the general experience of its time, including assertions of national and ethnic identity, which were given added meaning and purpose in the face of alien colonizing. The press, the radio, and political parties and clubs provided new opportunities for disseminating the ideologies of anticolonialism. To these must be added the example, first, of Germany in the 1930s—a previously fragmented state that had turned its recent unification into a means of challenging the old colonizers, Britain and France—and then, for much of the 1940s, 1950s, and 1960s, of the Soviet Union as a new form of social and economic organization, under which a previously feudal regime was being transformed into an egalitarian welfare state. Such visions were especially attractive to those who had not experienced the realities of daily life under such regimes.

SOUTHEAST ASIA Anticolonialism in Southeast Asia has been considered from a wide range of perspectives, resulting in deliberation over its character and place in the region's history. Generally, anticolonialism refers to one type of Southeast Asian response to the encounter with Euro-American colonialism. One might then describe anticolonialism as including everything from the personalities, institutions, and resistance movements that arose in direct response to the establishment of colonies in Southeast Asia, to the growth of literary expressions, rituals, history, and popular culture that emerged within that historical context. More specifically, anticolonialism has also come to represent the ways in which colonized peoples protested, resisted, or expressed dissatisfaction with changes imposed by colonial authorities. Because of the nature and history of colonialism in Southeast Asia (which occurred over four centuries involving different actors, intensities, locations, and agendas), expressions of anticolonialism in the region tend to reflect the circumstances and characteristics particular to each locality. So the study of anticolonial movements in the Spanish colonies was understood in the context of a "Philippine history" that was different from the historical context in which colonialism (and anticolonialism) would be examined in the case of nineteenth-century Myanmar (then known as Burma), whose history and colonial experience under the British had unfolded in quite a different manner. At the same time, scholars have also done extensive comparative work, demonstrating similarities in the way Southeast Asians articulated protest. In this regard, scholars have concentrated on the different forms of anticolonial expression in order to demonstrate variation and coherence in Southeast Asian cultural history. As a result, a distinctive and uniform "Southeast Asian" response to colonialism has yet to be clearly defined.

Ottoman Empire and the Mandate System With the exception of Morocco, the entire region either had been or still was in the early twentieth century at least nominally part of the Ottoman Empire, a multiethnic geopolitical unit that had been in existence since the late thirteenth century and that came to an end in the 1920s. Although it is misleading to regard the Ottomans as an imperial power, it is nevertheless the case that in spite of the Tanzimat reforms of the nineteenth century, which were generally intended to extend full citizenship to all subjects of the empire, the largely Christian provinces in southeastern Europe had become independent states in the course of the nineteenth century as a consequence of more or less bitter struggles to assert their various ethnolinguistic identities. In contrast, regardless of their ethnicity, the overwhelmingly Muslim population of the Arab provinces continued to regard the (Turkish) Ottomans as the natural defenders of Islam, with the result that most of the Middle East was barely affected by Arab nationalism until the early twentieth century. On the coasts of the Arabian Peninsula, Britain's concern with keeping the route to India safe and open led to a series of treaties with various local rulers between the 1820s and 1916, under which the rulers generally agreed not to grant or dispose of any part of their territories to any power except Britain. In 1839, Britain annexed Aden and turned it into a naval base. Exclusive treaties were signed with the tribal rulers of the interior, and in 1937 the area was divided into the port and its immediate hinterland (Aden Colony) and the more remote rural/tribal areas (Aden Protectorate). Principally because of their remoteness and their apparent lack of strategic importance, central Arabia and northern Yemen were never colonized. After the collapse of the Ottoman Empire at the end of the First World War, the empire's remaining Arab provinces were assigned by the newly created League of Nations to Britain and France as mandates, with Britain taking responsibility for Iraq, Palestine, and Transjordan, and France taking responsibility for Lebanon and Syria. The guiding principle of the mandate system was that the states concerned should remain under the tutelage of the mandatory power until such time as they were able to "stand alone," a period that, although not specified, was still understood to be finite. The mandate period was relatively short-lived, ending with the creation of Israel from the former Palestine mandate in 1948.

Palestine The final and highly anomalous case of anticolonialism in the Middle East is Palestine, unique among its neighbors in that it was a settler state. The text of the Palestine mandate included the terms of the Balfour Declaration (1917), in which Britain as mandatory power undertook to facilitate the setting up of a "national home for the Jewish people." In 1922, there were 93,000 Jews in Palestine and about 700,000 Arabs; in 1936, there were 380,000 Jews and 983,000 Arabs; and in 1946, about 600,000 Jews and 1.3 million Arabs; thus the Jewish population increased from 13 percent to 31 percent over a period of twenty-four years. Anticolonialism took different forms, principally through opposition by both Arabs and Zionists to British policy, which they tried to combat in different ways, and Arab opposition to Zionism. The Palestine rebellion of 1936 to 1939 was mostly a peasant insurrection against colonial rule and the settlers; in 1947 to 1948, the Zionists fought and won against an assortment of Arab armies and the poorly organized Palestinian resistance forces; the colonial power had long indicated that it would withdraw. Opposition to colonial rule and colonial settlement was fairly widespread throughout the nineteenth and twentieth centuries and took a variety of different forms, rural and urban, organized and spontaneous, religious and political, showing greater or lesser degrees of coherence. In any colonial situation, a wide spectrum of responses existed, with resistance at one end, acquiescence in the middle, and collaboration at the other end. Some members of the colonized population rebelled and some collaborated, but the majority acquiesced, at least for most of the time. In the nationalist historiography of the colonial period, the struggle for colonial freedom or national independence is often characterized in a way that shows the brave freedom fighters ranged against the brutal colonial authorities. The "achievements" of colonialism have long been open to question, and the divisions and chaos of the postcolonial world make the value of the legacy more questionable as time passes. Nevertheless, it is also important to understand the complexity and multifaceted nature of anticolonialism: the intrigues; the competing and often warring factions; the venality and corruption of many of them. For national maturity, and increasingly for national reconciliation, it will be necessary that such uncomfortable truths are boldly confronted rather than willfully ignored.

Some of the anticolonial movements of the twentieth century were urban-based mass movements, often led by charismatic leaders, perhaps most notably Habib Bourguiba of Tunisia, who led the Neo-Destour Party from 1934 until Tunisian independence in 1956 and who remained his country's leader until 1987. Allal al-Fassi, leader of the Istiqlal party, might have played a similar role in the history of Morocco. However, in 1953 the French exiled the sultan, Muhammad V, to Madagascar, and as a result the rallying cry of the national movement became the sultan's return from exile, which led in its turn to the sultan/king retaining his position as ruler after Morocco's independence in October 1956 and to the virtual eclipse of the secular political parties. In Egypt, a kind of independence was achieved in 1936, but the national movement went through two stages. In the first stage, some but not all powers were handed over to local elites. This arrangement involved some form of power-sharing with the former colonial power, which became increasingly intolerable to wide sections of the population. However, given the balance of forces, it was not possible to break these links by democratic means—that is, by voting in a political party or coalition that would be able to end the relationship. Thus a second stage was necessary, in which a determined group within the military seized power, destroying in the process the fairly rudimentary institutions of parliamentary government that the colonial powers had put in place. In this way, first General Muhammad Naguib (1901-1984) and then Gamal Abdel Nasser (1918-1970) took power in 1952. Iraq went through a similar process, and 'Abd al-Karim Qasim took power in 1958. A similar but more complex process took place in Syria, although the old social classes still ruling in 1961 had long severed any links they may have had with France.

The Economic Impact of Colonialism An important effect of colonialism was to hasten the disintegration of long-established social and economic relations and to substitute the often harsher dictates of the market. The precolonial world was no egalitarian paradise, but, for example, the confiscation or purchase of land in colonial Algeria and mandatory Palestine, and the formation of large landed estates in Syria and Iraq as a result of the establishment of regimes of private property under the mandates, often resulted in cultivators either being driven off the land or being reduced from free peasants to serfs. Far deeper incorporation into the world market than before, with its concomitant pressure to cultivate cash crops, forced peasant households to migrate to slum settlements on the edges of the major cities, where they faced an uncertain and often near-destitute existence.

A Middle Ground The truth about Africa's impoverishment lies somewhere between the analyses of Afropessimists and Afro-optimists. There is no doubt that corrupt and uncourageous leadership has been the bane of the socioeconomic development of sub-Saharan Africa in the postcolonial period. These leaders stunted democratic processes with force in order to preserve systems of one-person rule with no accountability. They awarded overpriced contracts to foreign companies in exchange for large kickbacks deposited in personal accounts in foreign banks. Their conspicuous consumption, cults of personality, nepotism, and naked abuse of political power encouraged a culture of greed, military coups, and instability, which reduced Africa's competitiveness for foreign investment. Governments borrowed billions in the name of the nation, and cronies squandered the money, thereby saddling the people with debt. It is also true that in the same period global political and economic policies reinforced the legacy of colonialism and exacerbated Africa's problems of self-rule. Apartheid South Africa sponsored destabilizing wars in the southern African region, and a cycle of Cold War surrogate wars and conflicts ravaged Angola and Mozambique. These wars claimed millions of African lives and devastated the economies of the warring countries. Economic adjustment policies of the World Bank forced African countries to cut spending on health, education, and infrastructure in order to save money to service foreign debts. Low international prices for the commodities produced by Africans cost African countries about $50 billion in the 1980s and early 1990s, the same period as the most virulent Afropessimism. These externally induced problems combined with internal inefficiencies to stunt Africa's political and economic growth and give rise to Afropessimism. By the turn of the twenty-first century, however, sub-Saharan Africa's fortunes seemed to have turned markedly for the better.

ALTRUISM. The term altruism was coined by the French philosopher and sociologist Auguste Comte (1798-1857). Derived from the Italian word altrui, meaning "to others" or "of others," "altruism" was introduced as an antonym for "egoism" to refer to the totality of other-regarding instincts in humans. The new terms altruism, altruist, and altruistic provided nineteenth-century thinkers with a controversial new conceptual framework within which to discuss ancient philosophical, religious, and ethical questions. In the earlier idiom of Enlightenment moralism, these had been expressed as questions about the relationship between particular self-serving passions and benevolent moral sentiments or between the principle of self-love and the authority of the conscience. It was in this earlier idiom that writers such as Thomas Hobbes and Bernard Mandeville expressed their view that all human action was ultimately driven by self-interest, and that their critics, including Francis Hutcheson and Joseph Butler, expressed the contrary view that benevolence was as fundamental a principle of human action as self-interest. The conceptual history of "altruism" proper began in the 1850s and has generated its own particular set of scientific, religious, and philosophical questions.

"Altruism" and "altruistic" have been used to refer to at least three different sorts of things: intentions, actions, and ideologies. These three sorts of usage can be grouped under the headings of "psychological altruism," "behavioral altruism," and "ethical altruism." Psychological altruism is any set of inclinations or intentional motivations to help others for their own sakes. Behavioral altruism is defined in terms of consequences rather than intentions: it refers to any action that benefits others (normally with the additional condition that there is some cost to the agent). "Evolutionary altruism" or "biological altruism" is a form of behavioral altruism, since it too is defined solely in terms of consequences rather than intentions: it refers to any behavior that reduces the fitness of the organism performing it and increases the fitness of another organism (see Dawkins; Sober and Wilson). Finally, ethical altruism is an ideology stating that the happiness of others should be the principal goal of one's actions. (Ethical egoism, by contrast, states that what the individual should seek above all else is his or her own happiness.) A frequent cause of confusion has been equivocation between the first two of these three possible meanings—between claims about psychology and claims about behavior. The claim that there is no such thing as true altruism, for example, might be intended to convey the view that, psychologically, no one's motives are ever entirely forgetful of self, since we know that we will receive approval and pleasure as a result of our charitable actions. The reply might be that true altruism certainly exists because many people engage in charitable activities at a cost to themselves; but by shifting from the psychological to the behavioral perspective on altruism, this reply fails to rebut the initial claim. Such conceptual confusion and disagreement over the meaning of altruism marked discussions of it from the outset and persist to this day. (Blum provides one useful and concise discussion of some of the definitional and conceptual issues.)
Discussions of altruism have also revolved around fundamental empirical, ethical, and political questions. What are the real roots of human altruism? Are they biological, psychological, social, or cultural? Is altruism really the highest moral good? Are we morally obliged to extend our altruism to strangers just as much as to family and friends? Should we even behave altruistically toward nonhuman animals? In what ways can societies be arranged in order to maximize the amount of altruism? Are the best societies, in any case, really those in which altruism is maximized?

Comte and Sociology The term altruism was coined, in French (altruisme), by Auguste Comte (1798-1857) in the first volume of his Système de politique positive (1851-1854; System of positive polity). The first uses in English followed in the 1850s and 1860s in works by British thinkers sympathetic to Comte, including George Eliot, G. H. Lewes, and John Stuart Mill (see Dixon, 2005). In the Comtean system, "altruism" stood for the totality of other-regarding sentiments. The new cerebral science of phrenology, Comte said, proved that altruistic sentiments were innate. He heralded this as one of the most important discoveries of modern science and contrasted it with what he presented as the Christian view, namely that human beings are by nature entirely selfish (because of the taint of original sin). Comte's hope was that through the institution of a new humanistic religion based on a scientific understanding of human nature and society, civilized nations would develop to a stage where altruistic sentiments prevailed over egoistic ones. Working out how to bring such a society about, Comte taught, was the greatest problem facing humanity. In his view, one of the keys to increased altruism was recognition of the fact that women, because of their maternal instincts, were more altruistic than men; they therefore should have supreme moral and religious authority (although only within the domestic sphere). Thus the Religion of Humanity, as he called it, placed particular emphasis on feminine moral virtues and the great sanctity of motherhood (see Wright). Another important Comtean coinage with which altruism was initially closely associated was "sociology"—the new science of society. Two of the most significant nineteenth-century theoretical treatments of altruism, other than Comte's own, were also produced by pioneering sociologists, namely Herbert Spencer (1820-1903) and Émile Durkheim (1858-1917). Durkheim, who drew on the sociological theories of both Comte and Spencer while making much greater and more sophisticated use of empirical data than either of them, distinguished between egoistic, altruistic, and anomic types of suicide in his 1897 study of the subject. Egoistic suicide was most widespread in developed, Western nations (especially strongly Protestant ones), Durkheim said, as a result of the highly developed sense of individual autonomy such nations encouraged. Altruistic suicide, on the other hand, was particularly prevalent among primitive peoples, who had an excessive sense of social integration. The main sorts of altruistic suicide with which Durkheim was concerned were the suicides of men on the threshold of old age or stricken with sickness, the suicides of women on their husbands' deaths, and the suicides of followers or servants on the death of their chief (Durkheim, book 2, chapter 4).

ANTICOLONIALISM. This entry includes four subentries: Africa, Latin America, Middle East, Southeast Asia.

AFRICA In post-World War II history, decolonization is a term generally employed to describe and explain the struggle for, and attainment of, freedom from colonial rule by most countries in Asia and Africa. This attainment was marked by a transfer of power: national political elites assumed the administrative responsibilities and duties previously discharged by the colonial authorities. Thus, new sovereign nations were born. Steadfast struggle through political parties and related movements, in pursuit of decolonization, marked the era of nationalism in Africa. Nationalism was the indispensable vehicle for achieving the desired goal of decolonization. It is important to point out that the study and analysis of nationalism in Asia and Africa has been affected by the scholarly and ideological controversies that still surround the "national question," nationality, and nationalism. While the power and influence of nationalism is undisputed, Benedict Anderson points out that the terms nation, nationality, and nationalism have all "proved notoriously difficult to define, let alone to analyze." Many scholars, especially in the West, have continued to regard nationalism as an anachronism and therefore as a concept that is not a revealing tool of analysis. Part of the explanation for this scholarly disillusionment is the ill repute nationalism acquired during the era of Nazism and fascism in Europe, when it came to be associated with intolerance and a reactionary chauvinism that was "at odds with the proper destiny of man." The study of nationalism has also been a source of intellectual and ideological frustration to Marxists, who have traditionally been troubled by its "chameleon qualities." Nationalism "takes many different forms, is supported by many different groups and has different political effects." Unlike Marxism, which places much emphasis on a society's class structure, economics, and "form of economic organization," nationalism is basically political and cultural. This explains in part why Marxism and nationalism have had a "difficult dialogue" over the years. In Asia and Africa, post-World War II nationalism was, above all, a "revolt against the West" (Barraclough), its chief characteristic "resistance to alien domination." This resistance, which led to decolonization, ultimately created a multitude of nations out of lands that had had "little or no national consciousness." It is fair to conclude that in order to comprehend the centrality and diversity of nationalism in postwar Asian and African history, "European modalities" may not be strictly relevant.

Afrocentric Organizing Principle and Concepts Europe's attempted occupation of practically all human space resulted in Africans being considerably removed from their own cultural base and relegated to footnote status, to the periphery, the margin of the European experience and consciousness. This mental disenfranchisement is held responsible for Africans often existing not on their own cultural and historical terms but on borrowed European ones. Africans are dislocated and, having lost sight of themselves in the midst of social decay, find it exceedingly difficult to orient themselves in a positive and constructive manner. Relocation is the remedy suggested by Afrocentricity. Only when Africans become centered, that is, when they consciously and systematically adopt ways, attitudes, and behaviors that are germane to their own cultural traditions and historical reality, can they hope to achieve freedom. In other words, African freedom is predicated upon the conscious activation of one's Africanness, that is, ultimately, upon the exercise by African people of their own agency. Afrocentricity further stresses agency as an African cultural imperative. Indeed, in African culture, ancestral traditions must be preserved and transmitted out of respect for one's personal and collective ancestors. Asante therefore defines Afrocentricity as "a frame of reference" generated by Africans themselves, based on African cosmology, axiology, aesthetics, and epistemology: "Afrocentricity is the study of the ideas and events from the standpoint of Africans as the key players rather than victims. This theory becomes, by virtue of an authentic relationship to the centrality of our own reality, a fundamentally empirical project" (1991, p. 172). Asante further insists that while one may argue over the meaning of Africanness, one cannot argue, as an Afrocentrist, over "the centrality of African ideals and values" for African people (1990, p. 6), thus identifying the notion of cultural, and more specifically epistemological, centeredness as the Afrocentric organizing principle. In addition to this major principle, Afrocentricity includes a set of unquestioned propositions that it inherited from its intellectual and ideological antecedents, namely, Garveyism, the negritude movement, Fanonism, Kawaida, and Cheikh Anta Diop's historiography. Those propositions can be listed as follows: African people must be conceived as agents and victors; a Pan-African perspective is essential; a deep commitment to African people and Africa is necessary; there exists an African cultural matrix common to all African people with different surface manifestations; culture is primary and all-inclusive; Africans must reconnect with African culture for genuine African freedom to be possible; African cultural rebirth is necessary; the colonizer within the African psyche must be killed; and finally, Nile Valley civilizations (in particular, ancient Egypt, or Kemet) are the foundation of African culture and will serve as a model upon which to elaborate new bodies of thought and action relevant to African contemporary needs. Those principles, which are primary both chronologically and logically, function very much as Afrocentricity's premises.

Because racial oppression is a system constructed around a matrix of domination, discrimination, and degradation, an appraisal of African-American political and social thought reveals that black activist intellectuals have mainly engaged two issue clusters: those that revolve around questions of identity and those concerning questions of liberation: "Who are we?" and "What is our present situation, and what should be done about it?" During slavery, the system of racial oppression attempted to destroy African identities, ethnic memories and cultural practices, and the collective and personal identities that derived from them. Since slavery it has sought to make African-Americans nonpersons, requiring, therefore, the search for and (re)assertion of new identities woven from the residue of African survivals and self-interested adaptations, a process that transcends the limits of what is derisively called "identity politics." Identity questions for oppressed racialized communities are fundamental to the pursuit of liberation. These overarching questions about identity, the present situation, and what should be done to create conditions of freedom, self-determination, and social transformation have elicited different answers from different groups of black activist intellectuals. Allowing for terminological differences, most scholars of black history and politics would agree that African-American activist intellectuals have justified their political action via an interpretative repertoire drawn from one of the following five interrelated ideological approaches: (1) autonomic, (2) incorporative, (3) black conservatism, (4) black radicalism, and (5) black feminism.

Christianity and Unbelief At its inception, the concept of altruism resonated widely in a Victorian culture saturated with moral and religious earnestness (see Collini). Some were attracted to Comtean positivism and its worship of humanity as an eminently respectable form of unbelief, one that combined a commitment to the sciences with a continuing religious sense and with the strong social conscience that the positivist ideology of altruism involved (see Wright). On the other hand, some who were committed to a Christian view of morality and society saw in Comtean altruism a concept of the love of others that was detached both from an understanding of appropriate self-love and from the necessity of a love of God. There were also those who saw in humanistic celebrations of altruism simply a secularized version of the Christian ideology of service to others (see Dixon, 2004, 2005). This last view was held by both proponents and opponents of Christianity. Among the latter, one of the most trenchant was Friedrich Nietzsche (1844-1900). In Nietzsche's vision, Christianity was at the root of all ideologies of altruism, self-sacrifice, and pity—in short, of the "slave morality" that was the exact opposite of the assertive and aristocratic ideals he celebrated (see Nietzsche and the introduction by Ansell-Pearson). From the twentieth century onward, once the origins of altruism in Comte's atheistic philosophy had largely been forgotten, it was much more common to encounter the assumption that altruism was a term that encapsulated the heart of Christian teaching. The French philosopher Jacques Maritain (1882-1973), however, continued to press the point that Comte's extreme and atheistical concept of altruism differed significantly from Christian love, whether human or divine. The difference between Christian love and altruism that Maritain insisted upon could be summarized as the difference between loving one's neighbor as oneself and loving one's neighbor instead of oneself (see Maritain). Nonetheless, some Christian writers still consider altruism to be virtually identical to Christian love, or agape.

For four hundred years, African-Americans have been engaged in a fierce struggle, a struggle for freedom, justice and equality, empowerment and self-determination, or social transformation, depending on one's ideology and its discourses. The lived African-American experience, in its class, gender, generational, and regional specificity, and the struggle against black racial oppression in the form of black social movements, are the soil from which African-American political and social thought is produced. Different social movements—abolition, the nineteenth-century Great Black March West (1879-1910), the protective leagues during the nadir (1877-1917), the New Negro Movement of the early 1900s, the Depression-era struggles, and the civil rights and black power movements of the 1960s—have developed distinct goals and objectives and consequently have evolved quite different strategies, ideologies, and discourses. Contrary to popular opinion, African-American political thought has always been a roiling sea of competing ideological currents. Political scientist Robert C. Smith described ideology as "the enduring dilemma of black politics" because of its variety and vibrancy. The tradition of viewing African-American history through the lens of historical debate underscores the diverse and dynamic character of African-American political discourse. For instance, it is popular to compare and contrast the ideas of Frederick Douglass (1817-1895) and Martin R. Delany (1812-1885); W. E. B. DuBois (1868-1963), Booker T. Washington (1856-1915), and Marcus Garvey (1887-1940); Ida Wells-Barnett (1862-1931) and Margaret Murray Washington (1863-1953); or, more recently, Martin Luther King Jr. (1929-1968) and Malcolm X (1925-1965).

First attempts to explain America. Although the Americas were undoubtedly visited by the Vikings around the year 1000, the "discovery" of America is attributed to Christopher Columbus, whose voyage to America in 1492 captured the European imagination. Ironically, Columbus insisted to his dying day that what he had found was part of Asia. Thus, perceptions of America have been mistaken from the very beginning. (Sixteenth-century mapmakers, recognizing Columbus's mistake, named the New World not after him but after Amerigo Vespucci—hence the name America—whom they credited as the first to realize that the New World was its own continent.) The Indians of America were misrepresented from the very beginning and have been ever since their discovery. Not only did Columbus believe America was someplace else—hence the name Indians—but his description of its inhabitants was fanciful, too. He claimed to discover cannibals, Cyclopes, Amazons, Sirens, dog-faced peoples, people with no hair, and people with tails. These bizarre claims were suggested to him by centuries of fanciful tales passed on through medieval times by supposedly reliable authorities. In short, Columbus claimed to find what he was looking for. This began a pattern of preformed opinions dictating what was supposedly found in America. He saw the land as potential wealth and its people as possible converts or slaves. For him, as for most of the early conquistadores and missionaries, the Indians had no independent status, no integrity of their own. They were simply to be used. The Spanish Renaissance philosophers who first reflected on the discovery of the Indians did little better in appreciating them. Two positions dominated the Spanish debates. The first position, arguing that the Indians did not possess the faculty of reason, went so far as to claim that the Indians were the concrete embodiment of Aristotle's natural slave. According to this view, the Indians could be incorporated into Europe's traditional Christian-Aristotelian worldview, but only in its lowest place. God created the Indians as naturally inferior, the argument went, so it was just and right that the Spanish subjugate them. The second view saw the Indians as rational—as evidenced by their languages, economics, and politics—but as underdeveloped and needing Spanish tutelage. Because they were human, the Indians had to be governed by consent—not their formal, explicit consent, but rather what they would consent to after they came to understand the natural law, which of course the Spanish thought they possessed. In short, because the Spanish were so confident in their worldview, it never occurred to them that they might be incorrect or possess only a partial truth. Their cultural confidence led them to reject the Americans as barbaric.

Impact In the donor countries of the West in the 1980s, Afropessimists were found in government, the media, and academia. The prevailing view that funds voted for Africa's stabilization and development were a waste of scarce resources was fanned by conservative politicians, bureaucrats, journalists, and scholars. This quickly led to an era of strained relationships between Western donor countries and African recipient countries in the late 1980s and early 1990s. Donor countries complained that progress was being slowed by bad governance, corruption, and mismanagement of funds, creating disillusionment and donor (or aid) fatigue. African countries in turn complained of unfulfilled promises and unwarranted intrusiveness in domestic policies by donors. The net result was a reduction in the volume of development aid from Western to African countries.

LATIN AMERICA Over the past five hundred years, Latin America has experienced three and possibly four periods of colonization, all of which gave rise to anticolonial movements. The first period symbolically began with Christopher Columbus's arrival in the Americas on 12 October 1492, launching three centuries of Spanish, Portuguese, and British colonial control over the hemisphere, with the French, Dutch, Danish, and other European powers competing for slices of the action in the Caribbean. In most of Latin America, this period came to an end with the wars of independence from about 1810 to 1825. Political independence ushered in a second period (known as neocolonialism), in which the countries of Latin America were still subject to foreign economic control—this time largely by the British. During the third period, corresponding to the twentieth century, this economic dependency shifted from the British to the United States, and anticolonial responses increasingly assumed anti-imperialistic characteristics. The twenty-first century arguably introduced a fourth period of neocolonialism, in which Latin America has become subject, through the maquiladora system, to control by transnational capital not necessarily rooted in any one country, and in which the export commodity is labor rather than raw materials.

Many writers have given different expressions to the phenomenon of Afropessimism. Attempts to explain the concept include both cogent studies (Ayittey, 1992, 1998; Jackson and Rosberg; Kaplan, 1994) and polemical and shallow travelogues (Richburg). In general, one virtue of Afropessimist writings is that they do not whitewash Africa's problems. Further, they correctly refuse to excuse the outrages of some African dictators on the basis of political ideology or racial identity. In particular, they refuse to use colonial exploitation to mask postcolonial kleptocracy, the personalization of state power, and the politics of prebendalism. The writers mentioned above (excepting Richburg and Kaplan) do not reject the hope that Africa can develop or that it is capable of overcoming its political and economic problems. In this sense they are not themselves pessimistic about the future of Africa but rather are simply describing the phenomenon of Afropessimism. The real Afropessimists are writers who call for abandoning, or worse, recolonizing the continent (Johnson; Kaplan, 1992, 1994; Michaels; Hitchens). While the writers in the first group generally merely denounce postcolonial African leadership by pointing out its weaknesses, those in the latter tend to conclude that Africans are incapable of self-rule. A common characteristic of the two modes of Afropessimist writing, however, is imbalance. Both tend to highlight the horrors of a few African countries and ignore the advances of many other countries at various times. The sweeping doomsday conclusions about Africa characteristic of studies in this genre (see, in particular, Richburg) are usually not warranted by the limited sample of African countries discussed in the narratives. The unintended result is that Africa is given a blanket negative portrayal. (There are, by contrast, prominent works that for the most part decry Africa's image in the West—see, for instance, Hammond and Jablow; Hawk.) The resulting foreboding and ominous image in Western media and the academy weakens the continent in the global competition for foreign investment and tourism (see Onwudiwe, 1996). This is an economic effect of Afropessimism.

Non-Spanish Caribbean European colonization of the Caribbean began with Columbus's arrival in 1492, and the region was so highly valued that it remained under the control of various European empires longer than any other part of the hemisphere. Spain maintained—and then lost—control over the largest and most populous islands of Cuba, Hispaniola, and Puerto Rico, known as the Greater Antilles. Other European powers, including the British, French, and Dutch, intruded into the Spanish domain and established a significant presence, particularly on the smaller islands, known as the Lesser Antilles, where descendants of African slaves and Asian indentured workers imported to replace the decimated indigenous population led many of the anticolonial movements. As they did in Africa and Asia, modern nationalist anticolonial movements in much of the Caribbean emerged in the aftermath of World War II, with its emphasis on the values of democracy and self-determination. As Cary Fraser argues, independence movements in the Caribbean must be understood in the context of these broader decolonization efforts. During the second half of the twentieth century, some of the islands gained their independence, although the British, French, and Dutch still retained colonial control over several smaller islands. Many of the residents benefited economically from access to European welfare systems, which dampened anticolonial agitation. Even after independence, many of the colonies maintained close relationships with their mother countries, leaving imprints on their political culture that marked them as significantly different from Latin America. For example, the former British colonies remained part of the Commonwealth and retained the British queen as their monarch. As the European empires collapsed, U.S. economic, political, and ideological interests gained increased hegemony over the Caribbean. Tourism and providing tax havens for foreign banks and corporations became the area's primary roles in the global economy. An example of the United States' ambiguous commitment to self-determination and its growing neocolonial control was its successful efforts to unseat Cheddi Jagan and his People's Progressive Party from the presidency of British Guiana in the early 1960s. United States opposition to Jagan, who was influenced by Marxist ideology and maintained friendly ties with the communist world, indicated that the Caribbean (as well as Latin America in general) would remain within the U.S. sphere of influence.

Practical Origins in Hellenistic Egypt Although alchemy's roots undoubtedly extend as far back as metallurgy itself, the textual record dates to the first centuries C.E. in the Egyptian city of Alexandria. Immersed in an extraordinary mix of cultures and traditions, Alexandrian alchemists blended Greek matter theory and philosophy, Neoplatonism, Gnosticism, Babylonian astrology, Egyptian mythology, mystery cults, and craft recipes for making cosmetics, beer, precious stones, and gold. Because the few extant texts from this period were written in Greek, this initial period is typically known as Greek alchemy. The oldest text documenting early alchemy is the Physika kai mystika (Of natural and mystical things), purportedly written by the Greek natural philosopher Democritus but likely written by Bolos of Mendes (third century B.C.E.). The Physika kai mystika and other similar texts (such as two anonymous Egyptian papyri known as the Leiden Papyrus X and the Stockholm Papyrus) focus on the kind of practical knowledge that would continue to engage alchemists for centuries, providing instructions for how to manufacture and "multiply" gold and silver, as well as how to produce chemically other valuable gemstones, pearls, and dyes. The works of a female alchemist from Hellenistic Egypt named Maria the Jewess (fl. 250 C.E.) contain the oldest descriptions of some of alchemy's most important apparatus, namely alchemical furnaces and stills.

Probability, Uncertainty, and the Arrow of Time In the 1680s Isaac Newton's concept of absolute, mathematical time depicted a uniform flow deprived of any psychological aspect, including a propensity to flow only toward the future. In the 1700s Pierre-Simon Laplace's rigid, deterministic viewpoint left no room for uncertainties and contradictions. In the 1820s, however, Nicolas-Léonard-Sadi Carnot's second principle of thermodynamics and Rudolf Clausius's principle of the increase of entropy or disorder in isolated systems attached a directional arrow to time from past to future. In the 1900s Albert Einstein's theory of relativity assigned time an additional role in the fourth dimension of physical space known as the space-time continuum. In the 1920s the probabilistic approach and Werner Karl Heisenberg's uncertainty principle of quantum mechanics brought an end to certainties. In the 1960s the irreversible thermodynamics of nonlinear systems removed from equilibrium by fluxes of energy, matter, and information regarded time as the creator of spatial, temporal, or functional structural order. These systems include the mind. Most likely the above breakthroughs in the Weltanschauung (worldview), relevant for an analysis ennobling ambiguity, played a role in focusing the attention of eminent philosophers—Immanuel Kant, Georg Wilhelm Friedrich Hegel, Arthur Schopenhauer, Friedrich Nietzsche, Henri-Louis Bergson, and Jean-Paul Sartre, among them—on the dynamics of the processes of transformation rather than on Aristotle's statics of objects. Even closer correlations can be conjectured between the scientific and artistic milieus. Look, for example, at Claude Monet's Waterloo Bridge, Effect of Fog (1903, Hermitage State Museum). While looking at this painting, the observer, driven by curiosity, correlates his or her sensory stimuli, assembling them into an interiorized pattern. As this mental pattern develops, the fog on the Thames seems to lift slowly, until a critical state is reached where the bridge, the boats on the river, and the urban background merge into the meaning of the painting. This critical state, at a boundary sharing foggy and meaningless scenery and, at the same time, a meaningful picture, is loaded with ambiguity. The mental process just described can be viewed as a metaphor for Jean Piaget's statement, "The intelligence organizes the world while it organizes itself." This aphorism leads to self-referentiality. Contextually, ambiguity sneaks in: "Whereof one cannot speak, thereof one must be silent," Ludwig Wittgenstein writes, and yet he talks—and is "silent"—at the same time. Should one agree in interpreting ambiguity as equivocalness, self-referentiality would make language totally ambiguous. Rome? A city, a town, and a four-letter word. Again with reference to the above breakthroughs, think of a cubist portrait by Picasso. Its perception lends itself ambiguously to several reconstructions of percepts—front figure, profile, and so forth—and recalls the process of measurement of a quantum structure: a process whose result allows us to access, with different probabilities, the several possible basic modes of being (or behaving) characteristic of the structure.
Similar considerations hold for the ambiguous representation of the fourth dimension on a two-dimensional canvas, seen in several futurist de-structured paintings and in Marcel Duchamp's Nude Descending a Staircase, No. 2 (1912, Philadelphia Museum of Art), an organization of kinetic elements expressing the space-time continuum through the abstract representation of movement.
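
For readers who want the formal statements behind these allusions (a standard textbook gloss, supplied here for illustration rather than drawn from the entry itself): Clausius's principle asserts that the entropy S of an isolated system never decreases, which is what attaches a direction to time, while Heisenberg's uncertainty principle bounds the joint precision with which position and momentum can be known:

\[
\Delta S \ge 0, \qquad \Delta x \, \Delta p \ge \frac{\hbar}{2}.
\]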

Reflections The proliferation of algebras has been nonstop: the classification of mathematics in the early twenty-first century devotes twelve of its sixty-three sections of mathematics to algebras, and they are also present in many other branches, including computer science and cryptography. The presence or absence in an algebra of properties such as commutativity, distributivity, and associativity is routinely emphasized, and (dis)analogies between algebras noted. Meta-properties such as duality (given a theorem about + and ·, say, there is also one about · and +) have long been exploited and imitated elsewhere in mathematics. A massive project, recently completed, is the complete classification of finite simple groups. Textbooks abound, especially on linear and abstract algebras. Abstract algebras bring out the importance of structures in mathematics. A notable metamathematical elaboration, due among others to the American Saunders MacLane (b. 1909), is category theory: a category is a collection of mathematical objects (such as fields or sets) with mappings (such as isomorphisms) between them, and different kinds of category are studied and compared. Yet this story of widespread success should be somewhat tempered. For example, linear algebra is one of the most widely taught branches of mathematics at the undergraduate level; yet such teaching developed appreciably only from the 1930s, and textbooks date in quantity from twenty years later. Further, algebras have not always established their own theoretical foundations. In particular, operator algebras have been grounded elsewhere in mathematics: even Boole never fixed the foundations of the D-operator algebra, and a similar one proposed from the 1880s by the Englishman Oliver Heaviside (1850-1925) came to be based by others in the Laplace transform, which belongs to complex-variable analysis. However, a revised version of it was proposed in 1950 by the Polish theorist Jan Mikusinski (1913-1987), drawing upon ring theory—that is, one algebra helped another. Algebras have many fans.
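
As a concrete illustration of the duality meta-property (a standard Boolean-algebra example, supplied for exposition rather than taken from the entry): De Morgan's laws form a dual pair, each obtained from the other by exchanging the operations + and · (and, in general, the constants 0 and 1):

\[
\overline{x + y} = \bar{x} \cdot \bar{y}, \qquad \overline{x \cdot y} = \bar{x} + \bar{y},
\]

so any theorem proved about + and · automatically yields a companion theorem with the two operations interchanged.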

Several ideologies salient among African-Americans require explication. Here, ideology is considered as a systematic theory of society composed of a relatively coherent set of interdependent concepts and values that adherents construct into historical narratives and contemporary discourses to articulate their interpretation of a social group's economic, political, and social interests and cultural beliefs and to rationalize particular public policies. The emergence and salience of African-American ideologies are conditioned by three broad factors: the sociohistorical context, the contemporary discursive matrix, and the black intellectual tradition. It is important to contextualize African-American ideologies historically because they develop during particular historical moments, and their discourses are designed to resolve or, at least, to respond to historically specific problems. Moreover, sociohistorical context not only shapes the emergence of specific ideologies, it also conditions the form and salience an ideology takes at a particular moment. African-American history can usefully be considered as a succession of different racial formations. Black racial formations represent African-Americans' distinct position within the U.S. political economy, polity, and civil society during particular historical periods. African-American history can be divided into four periods: (1) slavery, 1619-1865; (2) the plantation economy, 1865-1960s; (3) proletarianization and urbanization, 1940s-1979; and (4) the new nadir, from 1980 on. Within the realm of ideation, African-American political thought evidences the dynamic interplay between African-American discourses and the dominant and emergent ideas circulating during particular historical periods. Contemporary events and discourses present in the United States and the world establish the contemporary examples and discursive matrix with which these African-American ideological discourses engage. Perhaps more pertinent, however, is knowledge of past debates among the black counterpublic and how previous black intellectual traditions have influenced historically specific policy formulation.

Socialism and Economics The 1880s saw a downturn in the economic prosperity of Britain and its empire. The results of this included waning confidence in the inevitability of social and economic progress and increased public awareness of the plight of the urban poor. The economic orthodoxy of laissez-faire, which emphasized the freedom and autonomy of the individual and had accompanied the optimism and success of the earlier Victorian period, also increasingly came into question. A renewed interest in altruism was now evident, not only in philosophy and religion but also in economics. A central assumption of classical political economy was that man had benevolent as well as selfish instincts, but when it came to economic activity, the rigorous application of self-interest was the most rational principle. This assumption was now subjected to more serious examination (see Pearson). Altruism became associated with political creeds of cooperation and collectivism. One commune in the United States was even named "Altruria" in recognition of the importance of altruism to this new movement. The concept of altruism was thus redefined as an ideology, in a way that brought it closer to communism than either the Comtean positivism or the Spencerian individualism with which it had earlier been associated. Altruism, for these groups, was a radical and universal denial of self in the pursuit of harmonious and egalitarian community living. In the later twentieth century, the viability of the assumption of self-interest in economics would again be called into question (see Mansbridge; Monroe).

The Vienna Circle Whether Wittgenstein altogether succeeds in explaining his own position without convicting himself of nonsense remains debated. But there is a different element in his position that requires attention: the thesis that logic has a special a priori status because it articulates the rules that make language possible. This thesis is often associated with the claim that logic is "analytic" because logical truth depends only on the definition of logical vocabulary. In fact there is a distinction here: it is one thing to hold that logic is a priori because it is integral to language, it is another to hold that logic is "analytic" in the sense that it is just true by definition. But this distinction was not drawn by the members of the Vienna Circle, whose "logical empiricism" constitutes the next phase in the development of analytical philosophy. As indicated by the passage cited earlier from Carnap, a leading member of this group, their starting point was an empiricist presumption that the understanding of language is rooted in perceptual experience; but they recognized that ordinary experience does not exhibit the complex laws and structures of which the natural sciences speak. So they invoked logic to make the connections between observation and theory. In order to remain true to their empiricism, therefore, they emphasized the "analyticity" of logic, such that logic was not to be thought of as a body of abstract nonempirical doctrine but simply as a way of working out the conventions of language.

The relationship between anarchism and violence. The efforts of Kropotkin and other anarchist thinkers to define anarchism as a rational and practicable doctrine were overshadowed by the negative publicity generated by the violence-prone elements of the movement. Beginning with the assassination of Tsar Alexander II in 1881 and continuing up to the turn of the century, when the American president William McKinley was murdered in 1901 by a lone gunman, anarchists everywhere were viewed as sociopaths who terrorized society by throwing bombs and assassinating heads of state. The fact that not all of these public outrages were committed by anarchists (Alexander II was killed by nihilists) or individuals who were representative of the movement as a whole did little to dispel the exceedingly negative image of anarchists that was being projected by the popular press and government authorities. The violent practices that were now associated with anarchism were largely the product of an ill-defined tactic known as "propaganda by the deed," a direct-action policy advocated by some anarchists from the late 1870s on. That violent and even criminal deeds were necessary to advance the anarchist movement appealed especially to a small number of disaffected idealists who were convinced that the only way to intimidate the ruling classes and overturn the capitalist system was to disrupt the daily routines of bourgeois society. Killing public figures close to the centers of political and religious power was one way of doing this. Bombing cafés, robbing banks, and destroying churches and similar hierarchical institutions were also seen as justifiable means to a revolutionary end. A number of the perpetrators of "propaganda by the deed" were influenced by a highly individualistic strain of anarchist thought that became popular among déclassé intellectuals and artists around the turn of the twentieth century. A seminal figure in the individualist branch of anarchist thinking was the German philosopher Max Stirner (1806-1856). In his The Ego and His Own (1845), Stirner espoused a philosophy that was premised on the belief that all freedom is essentially derived from self-liberation. Because he identified the "ego" or "self" as the sole moral compass of humankind, he condemned government, religion, and any other formal institution that threatened one's personal freedom. It was his abiding concern with the individual's uniqueness and not his views as a social reformer that made Stirner attractive to certain segments of the anarchist community at the end of the nineteenth century. This was particularly true not just of the devotees of violence in Europe but also of the nonrevolutionary individualist anarchists in the United States. For example, the foremost representative of this strand of anarchism in the United States, Benjamin R. Tucker (1854-1939), took from Stirner's philosophy the view that self-interest or egoistic desire was needed to preserve the "sovereignty of the individual."

Twentieth-century views of the United States. The main twentieth-century critiques of America, such as those by Oswald Spengler (1880-1936) and Martin Heidegger (1889-1976) on the right and by the Frankfurt School on the left, argue that America is overly technological and materialistic. Thus, America, once described as the home of nature, became the place where nature is most obscured. Twentieth-century thinkers did not agree on the origins of America's technological morass. For example, the Frankfurt School saw technology as the result of capitalism, whereas Heidegger attributed it to a particular metaphysical way of being. The characteristics that they lamented in America's overtechnicalization, however, are similar. They lamented the mechanization of society and the way it alienates human beings from their deeper essences. They deplored the monotonization and leveling of the world and the resulting loss of individuality. They decried the way technology kills the spirit and prevents the attainment of the highest human developments. In short, their substantive list of complaints is very similar to those made during the nineteenth century; but whereas the nineteenth-century thinkers attributed the problems to an array of social, political, and economic factors, twentieth-century thinkers blamed them on technology. Beyond the technological blame, there is another important divergence between nineteenth- and twentieth-century thinkers' assessments of America. Whereas nineteenth-century thinkers like Tocqueville saw Russia, as well as the United States, as an emerging power, they almost all greatly preferred the American model to the Russian. This was not true in the twentieth century. Many figures on the left, such as Jean-Paul Sartre (1905-1980) and Simone de Beauvoir (1908-1986), ideologically committed to communism, lauded Soviet approaches and condemned American ones. Even among the anticommunist right, many considered the United States and the Soviet Union to be equally bad. Heidegger, for example, says that America and Russia "are metaphysically the same." An abstraction from politics that allows such comparisons is regrettable, but in Heidegger's case it is even worse. While formally arguing that the United States and Russia are the same, when he needs a shorthand label for the phenomena that he describes as a "Katastrophe," he calls it "Americanization," not Russianization, implying that the former is closer to the core of the problem.

Utilitarianism Utilitarianism, as discussed by its most distinguished nineteenth-century advocate, John Stuart Mill (1806-1873), was based on the view that a good act was one that would increase the general prevalence of pleasure over pain in the whole of society. It could thus be construed as a form of ethical altruism. In Auguste Comte and Positivism (1865), however, Mill made clear that his utilitarianism did not imply a one-sided commitment to altruism. He believed that a commitment to the general happiness was quite consistent with each individual living a happy life, and he criticized Comte for advocating an extreme sort of altruism. According to Mill's utilitarian principles, Comte's idea of happiness for all, procured by the painful self-sacrifice of each, was a contradiction; a sufficient gratification of "egoistic propensities" was a necessary part of a happy life and was even favorable to the development of benevolent affections toward others. Later in the nineteenth century Henry Sidgwick further developed the utilitarian tradition of philosophical ethics (see Schneewind). In his celebrated Methods of Ethics (1874 and several subsequent editions), Sidgwick tried to establish the proper extent of individual altruism and to show how such behavior could be encouraged while also recognizing the legitimate, independent demands of self-interest.

like the YMCA initiated programs and activities intended to familiarize immigrants with the language and cultural practices of the United States and to smooth the transition from "immigrant" to "American." Public schools began to adopt distinctive curricular, extracurricular, and disciplinary innovations intended to "Americanize" the children of immigrants. These included, among other measures, kindergartens; instruction in hygiene, manners, and the conduct of daily life; home visitations; and special classes for teaching English. During this phase of "humanitarian" Americanization, professionals sought to integrate immigrants into American life without harshly and rapidly stripping them of their homeland ties and concerns or of their culturally distinct languages, values, beliefs, and customary ways. The Americanization movement that followed was multifaceted and involved professional, popular, and political elements. Its participants were not of one mind, and some shifted their viewpoints and priorities over time. It is the coercive and strident activities of the World War I-era campaigns against "hyphenation," and, then, for "100 Percent Americanism" that have left the lasting image of the Americanization movement and account for its repudiation in the 1920s. According to John Higham, the Americanization movement represented "nothing less than an alteration in the whole texture of nationalist thought." One-Hundred Percent Americanism demanded "universal conformity organized through total national loyalty." The new spirit of nationalism required complete identification with country so as to "permeate and stabilize the rest of [the individual's] thinking and behavior" (1970, pp. 204-205). In this vein, citizenship classes included lessons not only on civic duties like voting but also on "American" ways of performing routine tasks like cooking and cleaning, child rearing, and personal hygiene. "Becoming an American, immigrants were taught, involved making yourself over entirely" (McClymer, p. 109). Perhaps highest on the Americanizers' agendas for remaking immigrants into Americans was conversion by immigrants from home-language use to English. For the most extreme among the "English First" crusaders, language was foremost a matter of loyalty. Professional Americanizers, however, emphasized that only a common language could guarantee the "community of interest" required for national unity. Among professional Americanizers, English was deemed necessary to facilitate the widespread social intercourse and participation that they so ardently championed.

Historians have not often been kind to the Americanization movement of the 1890-1925 period. Robert Carlson has labeled the Americanization movement a "Quest for Conformity" that demanded an unfair exchange and, in general, was psychologically damaging to its putative beneficiaries. Gary Gerstle identifies the Americanization movement with coercive nation-building that almost destroyed German Americans as an ethnic group, limited the identities that Americans could adopt, and hardened the racial color line. John Higham, while recognizing the mixed impulses of the movement, interprets it as fundamentally an episode in American nativism. Not all historians, however, have viewed the Americanization movement in unrelentingly negative terms. The circumstances to which the Americanizers were responding were, given their perspectives, threatening and challenging. In the face of massive immigration from parts of the world that heretofore had not been large sources of emigration to America, worries over whether democracy could function in the absence of a common language, common culture, and common commitment were, in Robert Wiebe's judgment, reasonable. Stephan Brumberg is critical of academic critics of the Americanization movement who fail to appreciate the immigrant's real needs for structure and direction in an alien, threatening, perplexing, and dehumanizing environment. Moreover, the vocabulary of Americanization, with its proclamations of American symbols and ideals celebrating liberty, democracy, and equal opportunity, could be adopted by immigrant and American workers alike to help forge an American working-class consciousness in opposition to the rule of capitalist elites. While most historians have evaluated the Americanization movement by what it did to immigrants, Michael Olneck has questioned the proximate effects of Americanization and has argued that perhaps the largest significance of the movement was to create new "public meanings" rather than to have changed immigrants. Most significantly, the Americanization movement defined subsidiary identities as incompatible with "American" identity, delegitimated collective identities, relegated ethnic identities to the "background," and demarcated a supraethnic, shared public terrain of "American life" into which all were expected to "enter," as well as symbolically represented the abstract autonomous individual as the constitutive element of American society.

