J. R. McNeill's Case Studies From Something New Under the Sun (2000)


Comb Jellyfish in Black Sea

Around 1980, a comb jellyfish native to the Atlantic coast of the Americas stowed away on a ship bound for a northern Black Sea port of the Soviet Union. Amid highly polluted and oxygen-poor waters, it found ecological nirvana as few species ever had, devouring plankton, larvae, and fish eggs. Nothing in the Black Sea would eat it. By 1990 it dominated life in the Black Sea, accounting for 95% of its wet biomass. It obliterated fisheries worth $250 million per year at a time when Russians and Ukrainians could ill afford it, helping in a modest way to nudge the USSR into the dustbin of history. So, in the tense Cold War atmosphere of the early 1980s, American ecosystems launched a first strike with the comb jelly, and the USSR's biota retaliated with the zebra mussel.

Athens

Athens was built 2,500 years before cars, yet it acquired a serious pollution problem known by a name that translates as "the cloud." The city is ringed by mountains and sea, and its sunny climate makes it ideal for smog formation. Its population climbed from 15,000 to 500,000 between 1830 and 1920 after it became the capital of Greece, then quickly doubled as refugees from Anatolia poured in after a failed military venture. It reached 3 million by 1980, occupied almost all of its available land, and accommodated a third of Greece's population. Athens hosted half of Greek industry by 1960, and most factories were small-scale, unregulated, and energy-inefficient, relying on fossil-fuel-burning power stations in the western part of the city. After 1950, Athens slowly converted to an energy base of imported oil and dirty coal. Before 1965, smokestacks and chimneys accounted for most of Athens' pollution, but the automobile colonized Athens after 1955, because, aside from a single subway line, there was no alternative surface transport. In 1965 the city had 100,000 cars, and by the 1980s a million. Unplanned growth created labyrinthine street patterns that contributed to traffic jams during four daily rush hours, a byproduct of the siesta schedule. Until the late 1990s Athenian buses, along with old clunkers, were notorious polluters. Visibility had been obscured since at least the 1930s, but "the cloud" appeared only in the 1970s. Regulation brought smoke and sulfur dioxide levels down by 1980, but smog still intensified after 1975 as Athens grew more prosperous. The worst episode came with a heat wave in 1987, with excess mortality of about 2,000. Ozone levels of this period were twice those of 1900 to 1940. In the 1980s the socialist party promised to eradicate air pollution within three years to win votes, won, restricted industrial fuel burning, introduced low-lead gasoline, and prohibited driving in the city on alternate days by license plate number. Prosperous Athenians responded by buying second cars.
Emission checks came in the early 1990s. These and further measures did not suffice due to lax laws and geographic location, and by the early 1990s Athens' smog was two to six times as thick as that of LA.

Calcutta

Calcutta, India, became a coal city in the late nineteenth century and a megacity in the late twentieth century. It achieved extraordinary success in regulating smoke after 1900, but eventually its growth outpaced its pollution control policy. Laws against coal smoke proved difficult to enforce until a reform created a smoke inspectorate. Around 1910, smoke declined by about 90%. An effective pollution notification system then cut smoke levels sharply again. The authoritarian power of the colonial government in India over coal interests made smoke control easier than in London or Pittsburgh. But as demographic growth accelerated after 1950, smoke problems returned, and while sulfur dioxide concentrations persisted even with low-sulfur coal, the most serious pollution problems came from domestic coal dust and soot. Breathing Calcutta's air after 1975 was equivalent to smoking a pack of Indian cigarettes a day. Nearly two-thirds of residents suffered lung ailments, and those who prepared meals, mostly women and children, breathed the most coal smoke. After 1980 motor vehicles added to this pollution, and ozone levels grew rapidly as a newly prosperous class emerged. Population growth, the difficulty of regulating domestic fuel use, and eventual motorization brought the murky gloom of a century before back to Calcutta.

China's Loess Plateau

China's Loess Plateau represents the first pulse of soil erosion history. Home to about 40 million people, it is one of the world's most easily eroded landscapes. The soil consists of very deep deposits of dust that blew in from Mongolia over 3 million years, so it is loose and easily dislodged. Until about 3,000 years ago, prior to cultivation, forests covered most of the region, protecting the soil from intense rains. Over the following 2,000 years, cultivation cleared most of the plateau, erosion increased, and so much silt washed into the Great River that it became known as the Yellow River. By the early twentieth century, soil loss reached 1.7 billion tons annually, and by 1990, 2.2 billion.

Chicago

Even though Chicago is a young city on one of the largest lakes in the world, it too developed water problems with its rapid growth in the nineteenth century. Its population used the lakefront and the Chicago River, which flowed into Lake Michigan, to dump its wastes, contaminating the water supply. At a population of 30,000 around 1850, this caused only modest problems, but after a population boom during the Civil War era of the 1860s, old arrangements had to change. City authorities built longer and longer pipes out into the lake to try to draw water uncontaminated by the city, but Chicago's rapid growth continually outstripped the pipes. Typhoid, a bacterial disease, sickened about 20,000 Chicagoans a year in the 1890s. The epidemics provoked the largest engineering project before the Panama Canal, reversing the flow of the Chicago and Calumet Rivers, so that by 1900 they no longer emptied into Chicago's drinking water supply, but instead flowed toward the Illinois River and down to the Mississippi. Thus the sewage from Chicago, including the refuse from the world's greatest stockyards, drifted away to St. Louis and New Orleans, eliminating waterborne epidemics in Chicago. As a New York Times headline stated, "Water in Chicago River Now Resembles Liquid." Residents of other Great Lakes states, as well as Ontario, thought this lowered the level of the Great Lakes, and it did lower Lakes Michigan and Huron by 6 or 8 inches. Thus Chicago and Illinois were repeatedly sued, and under legal restrictions after 1930, turned to groundwater, but by 1960 they had overused it, provoking a lawsuit from Wisconsin. After 1985, larger deliveries from Lake Michigan eased the situation, as did Chicago's world-class sewage treatment plants.

Microbial Tragedy of the Commons

For 2 billion years microbes reigned as lords of the biosphere. They helped shape climate, geology, and all life. The twentieth century was a tumultuous one in the balance between people and microbes, partly because of conscious human efforts to attack disease and pests, and partly as a side effect of large-scale social and ecological changes. Armed with the emerging knowledge of microbiology and inspired by ideas of progress, medical officials attacked disease on many fronts. With these developments, humanity between 1880 and 1960 stole a march on pathogens, sharply reducing the human disease burden, encouraging rapid population growth, and fundamentally changing the human and microbial condition. After 1940, antibiotics added another weapon to the human assault on pathogens and pests. Soon penicillin and other antibiotics, many of them derived from soil microbes, proved effective against a wide range of bacterial infections, including pneumonia, diphtheria, syphilis, gangrene, spinal meningitis, tuberculosis, and some dysenteries. By 1990 some 25,000 antibiotics existed, curtailing microbial careers and improving human and animal health. Antibiotics offered no protection against viral diseases such as yellow fever, influenza, polio, and measles, but vaccines could; in the US, vaccination came chiefly between the 1940s and 1960s, just in time to safeguard the largest generation in American history. Together these advances added about 20 years to worldwide human life expectancy between 1920 and 1990. Antibiotics succeeded in killing off strains of infections until the 1970s, when multiple-drug-resistant (MDR) bacteria emerged. Thereafter, incurable strains of tuberculosis, malaria, and numerous other infections threatened human life. Evolution made the emergence of MDR bacteria inevitable, but improper use of antibiotics made it happen sooner.
In the US, nonprescription antibiotics could be bought at any drugstore until the mid-1950s, and in much of the world this remained true in the 1990s. In any case, doctors succumbed often enough to impatient patients and prescribed antibiotics recklessly, as did others. The US livestock industry after the early 1950s fed massive quantities of antibiotics to American cattle and pigs to keep the animals healthy and help them grow faster in the hazardous feedlots of postwar America. All these practices represent a microbial "tragedy of the commons." It suited doctors, patients, and cattlemen to use antibiotics recklessly, because they were cheap and easy to use and the benefits were quick, real, and personal. The costs came in the future, they were shared among all society, and they were inevitable: restraint by any individual would at best only marginally delay matters. Worldwide, MDR tuberculosis killed about 3 million people a year by 1995, almost all in poor countries, and this number continued to rise. Other infections proved just as resilient, such as malaria (especially after mosquitoes evolved resistance to DDT), pneumonia, and a variety of dysenteries. The evolution of resistant infections and disease vectors put an end to a golden age and locked pharmaceutical researchers into an endless arms race with deadly bacteria; meanwhile, the expansion of irrigation, the acceleration of transport, human disruption of tropical ecosystems, changing relations between humans and animals, and the spread of large cities all made victories against infections more insecure.

Nile River and Delta - Aswan Dam

For the last 10,000 years, water and silt made a long ribbon of the Egyptian desert habitable and gradually built the Nile Delta over the continental shelf. The Nile was a two-way highway because winds in most seasons allowed upstream sailing. Annual floods prompted by monsoon rains brought moisture and fertile silt to the river, permitting cultivation of winter crops such as barley and wheat. But if monsoon rains were light, the Nile did not rise, crops failed, and famines resulted, and if rains were too heavy, the Nile swept away settlements, so Egypt changed the Nile. Irrigation barrages built in the mid-1800s adjusted the lower Nile's flow to suit the demands of cotton cultivation, which Muhammad Ali had hoped would allow Egypt to import the wherewithal for rapid modernization. With new crops and a new economy, big floods became far more costly than before. Egypt's cotton production multiplied sixfold from the 1850s to the 1880s, when Britain occupied Egypt and tamed the Nile by building a low Aswan Dam around 1900. It was heightened twice to help store water for drier months, but this was no match for droughts or big floods; it captured only a fifth of a big flood in the mid-1930s. Egypt's new rulers in the 1950s liked the idea of a high dam at Aswan, a symbol that would contribute to the heroic, vigorous image sought for the revolutionary regime and for Arab nationalism, as well as provide a reliable water supply for Egypt and sufficient hydroelectric power to transform Egypt into an industrial state, bring true independence, and deliver "everlasting prosperity." From a hydrological point of view, a high dam was misplaced at Aswan in southern Egypt, one of the highest evaporation zones on earth. The political consequences of the difficult birth of the Aswan High Dam reached around the world, but the dam was built throughout the 1960s and completed around 1970.
It revolutionized Egyptian agriculture, provided full flood control, and generated about a third of Egypt's electricity in the 1980s and 1990s. It also stopped 98% of the fertile silt that had formerly coated the inhabited part of Egypt, forcing Egyptian agriculture to turn to chemical fertilizers, of which Egypt became one of the world's top users. Much of Aswan's electric power went to fertilizer factories. Salinization of soils also emerged as a serious threat without the annual flood's flushing. In a country with so slender a resource base as Egypt, and with a million more mouths to feed each year in the 1990s, menaces to agriculture were urgent matters. The Nile Delta, home to 30 million people and accounting for two-thirds of Egypt's agricultural area, began to shrink. The dam deprived the Mediterranean of the nutrients the Nile had carried, destroying fisheries that had employed 30,000 Egyptians. Without the annual flood, aquatic weeds flourished, and with them a debilitating disease (schistosomiasis, spread by snails living among the weeds). The dam also swamped and corroded the cultural heritage of the Nile Valley. Improvements from the dam led to population growth, for which the dam provided inadequate amounts of water. Thus the only large, ecologically sustainable irrigation system in world history, the Nile River, which sustained the lives of millions for five millennia and made Egypt the richest land in the Mediterranean from the Pharaohs to the Industrial Revolution, was traded for this environmental disaster. Such are the pressures of politics.

Minamata - Dirty to Clean

In 1910, a chemical factory was built in Minamata, Japan, around which a company town of 50,000 gradually grew by 1950. Mercury-laden waste was dumped into Minamata Bay. Bacteria converted the mercury into an organic compound that worked its way up the food chain in ever greater concentrations, resulting in unexplained fish die-offs beginning in the late 1940s. In the 1950s, the factory accelerated production and mercury dumping, and soon many Minamata cats went mad, danced as if drunk, vomited, and died, which people called "cat-dancing disease." By the mid-1950s, Minamata children began to develop brain damage from what would come to be called Minamata disease. Fish were correctly suspected, killing the local fish market, and a prominent local doctor confirmed that the disease was mercury poisoning, but his findings were kept secret under pressure from his employer, the chemical factory. Around 1960 local fishermen attacked the factory, but the mercury kept flowing for another 10 years, while thousands developed symptoms and more than 100 died. The mayor remained in favor of the chemical factory, but the aggrieved mounted a lawsuit, which the chemical factory lost; by 1980 it had paid $100 million to Minamata victims and their families. For decades, no one elsewhere in Japan would knowingly marry anyone from Minamata for fear of deformed offspring. After the mid-1980s the Japanese government decontaminated the floor of Minamata Bay, at a cost of almost $500 million. By 2000, authorities declared Minamata Bay free of mercury and removed the netting, installed in the 1970s, that had kept unsuspecting fish out of the contaminated waters. The Minamata episode was probably the worst case of contamination of the sea, but it was a simple one, involving one nation, one factory, and one pollutant.

Lake Baikal

In Russia, the connection between nationalism and nature crystallized around the struggle to save Lake Baikal from pollution, a brave fight under Soviet conditions. Lake Baikal, the pearl of Siberia, is the world's deepest lake and one of the oldest. Its biota is unique, containing many species found nowhere else on earth. In its clear waters some Soviet engineers saw an ideal cleaning fluid well-suited to the needs of the country's military-industrial complex. Around 1960 authorities secretly planned a factory to be located on Baikal's shores for fibers used in jet aircraft tires. Members of the Soviet scientific and cultural elite took advantage of a time of unusual freedom of expression to dissent publicly. Objections failed to prevent two cellulose plants from opening around 1965, by which time new plastics had made the fibers obsolete for jet tires, bringing an unusual degree of attention, by Soviet standards, to pollution control. The campaign may have helped deter another 1950s plan, one in which nuclear explosions were to open up the southern end of Baikal, raising the water flow through power stations on the Angara River.

Po Valley

In Southern Europe, as in Egypt, political elites sought to unleash the latent economic energies of their societies, encourage population growth, and enrich the state by any number of strategies. A favorite was industrialization. The Po River Basin covers a sixth of Italy and is home to a third of its population. It is for the most part flat and posed drainage problems for those who would farm its fertile soils. Efforts at taming the meandering waters of the basin failed until, in the upper Po Valley, with massive investment and heroic labor, Lombards replumbed their landscape, built drainage and irrigation channels, and made much of Lombardy suitable for rice. These efforts climaxed in a canal built in the 1860s. After the 1870s, the government of newly unified Italy generously subsidized further drainage, irrigation, and channelization. Between the 1880s and the 1910s, farmers confined the waters to chosen times and places, drained the marshes dry, regularized the shapes of their fields, adopted farm machinery and chemical fertilizers, and doubled or tripled agricultural output of wheat, maize, rice, alfalfa, hemp, and sugar beets. In doing so they almost eliminated the Po marshes and their fishing-hunting-gathering way of life. Malaria began to recede as well. Because Italy lacked coal, only hydroelectric power could drive the transformation of Italian ecology and society needed to compete with the energy-intensive economies of the twentieth century. By 1905, Italy led all Europe in hydroelectric power use, which grew 1,000-fold between the 1920s and 1930s. Hydropower furnished Italy with almost all its electricity. Scores of dams, artificial lakes, and power transmission lines appeared, and the textile industry descended to the Po Valley once electrification arrived, bringing pollution from chemical dyes with it.
Rural electrification allowed farmers to pump water uphill and gave a boost to marsh drainage efforts after 1920, so the ecological transformation of northern Italy fed upon itself. Italy's emergence as a European and imperial power after 1890 rested on this electrification. Without alpine hydropower, harnessed in the ongoing environmental transformation of northern Italy, Mussolini's geopolitics would have been impossible. Industrialization fostered a class-conscious proletariat, helping push Italy toward fascism. Capital-intensive enterprises took the place of family-based fishing, hunting, and horticultural exploitation of the Po floodplain. Po Valley landowners often saw their interests in conflict with those of urban and rural workers, and provided strong support for Mussolini's fascist movement. The reconfiguration of environment and society went hand in hand.

Machakos Hills

In the Machakos Hills of Kenya, soil conservation schemes undertaken after independence worked well. Colonial land policy had concentrated Africans on poorer lands such as those of the semi-dry and often steep Machakos District. From at least 1930, the Machakos Hills suffered from acute erosion, causing food problems and emigration. Between 1930 and 1990, population density tripled and cultivated areas increased sixfold. But in the 1970s, the Kenyan soil conservation service, along with local Akamba farmers, stemmed the tide of erosion. What made the difference was greater security of land tenure, Kenyan authorities working closely with existing self-help groups in Akamba society, and Swedish aid money. Plentiful labor and secure tenure encouraged families to husband soil by leveling their plots, keeping animals away, channeling watercourses, and other means. Intensive farming stabilized the soils of the Machakos Hills, even as population densities grew.

The Green Party

In the US, an ideological crusade to roll back environmental regulation in the early 1980s boomeranged, as provocative statements from President Reagan's officials served as recruiting devices for environmental pressure groups. Leadership in innovative institutions and planning passed to northern European countries, notably the Netherlands, and to Japan. Green parties entered politics and parliaments, and before 2000 the German Green Party took part in a coalition government, its members holding important positions in the German government. These European governments pioneered a consensual politics of environmental modernization. The Dutch arrived at an integrated national environmental plan around 1990, designed to harness powerful and influential interests often resistant to ecological prudence, such as agribusiness.

Peruvian Anchoveta

In the mid-1950s, veterans of the dying sardine fishery headed from California to Peru, where cold, oxygen-rich water supports shoals of anchoveta. Peruvians had fished these waters for centuries, but not with power boats, big nets, and airplanes to find the fish. By the early 1960s, Peru landed more fish tonnage than any country in the world, peaking around 1970 at 10 to 12 million tons, 20% of the world's total. Anchoveta, converted into fish meal and fish oil, anchored Peru's foreign trade, providing a third of its foreign exchange. After this peak, a collapse in anchoveta stocks lowered world fish production by about 15%. The disaster crippled Peru's economy, contributing to relentless inflation, mass unemployment, and the emergence of violent revolutionary groups in the 1970s and 1980s. Multiple El Niño events brought warmer, nutrient-poor water to the Peruvian coast, working together with overfishing to produce the collapse. As with the California sardine collapse, both natural and social causes combined in Peru's misfortune. The anchoveta collapse is the most spectacular in the history of fisheries.

Aral Sea/Basin

In the quest for cotton, the USSR created the single greatest irrigation disaster of the twentieth century. The case of the Aral Sea represents cavalier water manipulation by arrogant political and scientific elites, justified in the name of the people. The Aral Sea's demise is the climactic chapter in a long, checkered history of Soviet water manipulation. Soviet policy scarcely affected the hydrosphere until the 1930s, when Stalin came to believe that Soviet engineering could customize the hydrosphere to meet the economic and political needs of a rapidly industrializing country struggling to build communism before its enemies destroyed it, so measures had to be big, targets heroic, deadlines ambitious, and corners cut. Millions of cost-free laborers, political prisoners of the Soviet gulags, made giant projects all the more tempting. A new era dawned with hydroelectric installations, and by the 1950s, diversions reduced the streamflow of all the large rivers in the southwestern USSR. As rivers shrank, engineers built canals to tap them further, short-changing the Caspian Sea and depriving the Sea of Azov of its fresh water flow, making it far saltier and ruining its once magnificent fisheries. By 1975, the USSR used 8 times as much water as in the 1910s, most of it for irrigation. The maximization of the USSR's economic potential required water, so these rivers were bent to the will of the state and its planners. By the 1950s Soviet planners envisioned a vast irrigated cotton belt that would make the USSR "cotton independent." The demise of the Aral Sea, formerly the fourth largest lake in the world with an area of nearly 30,000 square miles, was a planned assassination. "I belong to those scientists who consider that drying up of the Aral is far more advantageous than preserving it. First, in its zone, good fertile land will be obtained... Cultivation of [cotton] alone will pay for the existing Aral Sea, with all its fisheries, shipping, and other industries.
Second... the disappearance of the Sea will not affect the region's landscapes." Skeptics were ignored, or worse. By 1990, nearly 30,000 square miles of irrigated land had allowed the USSR to become the world's second largest exporter of the world's lowest-quality cotton, strangling the Aral Sea, once called "the Blue Sea" or "Sea of Islands." This may stand as the greatest hydrological change yet engineered by humankind. The Aral Sea's ability to moderate the local climate shrank with the sea: summer heat and winter cold grew more extreme, the air grew drier and saltier, the cotton belt's growing season shortened by two weeks, and crop yields fell, along with many other problems caused by aerial salinization. The fisheries of the Aral Sea, which had yielded about 40,000 tons annually in the 1950s, had disappeared by 1990, as had nearly half the mammal species present in 1960 and three-quarters of the bird species.

Virgin Lands Scheme (USSR)

In the twentieth century, population growth and the rapid rise of international grain markets continued to drive the frontier conversion to cropland. Political ambitions sometimes provided the driving force, as in the last great effort at extensification (utilizing large areas of land with minimal inputs and expenditures of capital and labor) of agriculture in temperate latitudes: Khrushchev's Virgin Lands scheme (1954-1960), in which broad swaths of Russian and Kazakh steppe (grasslands) were broken to the plow. After 1960, new settlement and cultivation of former steppe and temperate forest lands ceased. By the 1960s in temperate zones, efforts to expand food production focused almost solely on obtaining more harvest per acre rather than on farming more acres. The Soviet Union's formal commitment to "chemicalization" of agriculture in the mid-1960s completed this transition. After the end of the Virgin Lands scheme, the active frontier zones of the late twentieth century lay in the tropics. By the late 1990s, a quarter of the grasslands plowed up under the Virgin Lands program had been abandoned.

Migration Impacts in Indonesia/Borneo (note: population vs migration impacts - look also at Amazon)

Indonesia, like the Soviet Union, incorporated vast expanses that its rulers sought to transform into economically productive areas, though it used less compulsion and moved fewer people. The Indonesian scheme was known as "transmigration." Most Indonesians live on Java and Bali, the fertile inner core of the country. The far larger outer islands of Borneo and Sumatra have among the poorest soils on earth. Java and Bali have long hosted dense populations of rice farmers, while the outer islands, in contrast, supported scant populations, remained mostly under forest, and contributed little to the wealth and power of the colonial state. Dutch colonials had long thought that the population of Borneo and Sumatra needed augmenting so that the resources of these islands, notably timber and gold, could be brought to market, so in 1905 the Dutch East Indies awarded nearly a hectare of land to any of the nearly 40 million Indonesians who would move, and 200,000 did by the 1940s. Indonesian planners intended that by 1990 no fewer than 50 million Javanese would populate the outer islands. This would alleviate pressures on Java, turn the outer islands into revenue-producing areas, and conveniently swamp their local populations, forestalling revolt. This grand scheme enjoyed the support of diverse backers, from the Indonesian Communist Party to the World Bank, which chipped in half a billion dollars. But it foundered on ecological ignorance: too few took the lure, most who did stayed poor, and resentments festered. By 1990 the lure had grown to 4 to 5 hectares of land per family, but still the vast majority of Javanese preferred the crowded conditions of rural Java to the hardships of pioneer life on the outer islands, where the expertise of Java's peasants at growing paddy rice would be of little help in most areas. Their heroic efforts to clear the forest too often led to poor harvests, dashed hopes, land impoverishment, and abandonment.
Some transmigrants adopted appropriate techniques and enjoyed strong state support, but some did not. About 20% of the migrants improved their standard of living; many of the rest found that local populations wanted to farm the best lands themselves. By the late 1980s the transmigration scheme had ground nearly to a halt, sparing the outer islands further environmental change. Officially sponsored migrations helped decimate two of the twentieth century's three great rainforests, in Indonesia and the Brazilian Amazon. Africa's rainforests avoided this fate in part because of the continent's political fragmentation.

Istanbul

Istanbul's only fresh water source came in the form of a stream that flows into its historic harbor, the Golden Horn. The city's population easily polluted this water supply to the point where it was dangerous to drink, which constrained the city's growth. Dams and aqueducts later allowed it to become one of the world's largest cities by 1600. Although the movement of Turkey's capital out of Istanbul slowed population growth in the 1920s, by the 1950s population growth throughout Turkey and a rural exodus swelled the city by 10% per year. Most of the new arrivals built their houses on the edges of the city, living without piped water or sewerage, but after enough growth these neighborhoods acquired sufficient political weight to extract favors from the government, such as connection to the city water and sewage systems, by the 1970s. After 1980, Istanbul, nearing 10 million people, drew heavily on water piped overland from great distances, but even this proved insufficient in the 1990s, when summers often required strenuous water conservation. As in many cities around the world, water supply continued to vex authorities responsible for accommodating urban growth, and ordinary people had to make do with less.

Watarase River Basin - Ashio Copper Mine

Japan's Watarase River suffered from a single source of pollution: the Ashio Copper Mine in central Japan, active since 1610. For over 250 years, Ashio provided Japan with much of its copper, but it had almost ceased production around 1880 when, under new ownership, operations were modernized and expanded; in the 1880s rich deposits were found that made Ashio Asia's most lucrative copper mine. Japan's national policy of militarization needed the profits from Ashio to help buy steel. Copper was one of Japan's largest exports, and Ashio produced nearly half of it, so it was Japan's single most important mine. By 1890 sulfuric acid rain from smelters had killed 20 square miles of forests and contaminated local waters. Floods became more common because hillsides lost their vegetation cover. Mine tailings seeped, or were dumped, into the Watarase River, contaminating water used for irrigation in rice paddies. Local peasants became sickly and resentful. In the 1890s, death rates outstripped birth rates in the town of Ashio, which was home to about 30,000 people. Toxic waters killed off fish and fowl, depriving peasants of traditional supplements to their food supply. Everyone along the Watarase knew it was the Ashio Copper Mine that jeopardized their rice, health, and lives. Scholars, journalists, and the local member of parliament demanded that the smelter be shut down. Thousands of peasants marched on Tokyo three times in the late 1890s, clashing violently with police and attracting publicity, which obliged the government to require Ashio to install antipollution devices, but the technology was primitive and ineffective. A fourth march on Tokyo provoked strong government repression in 1900, as Ashio was too important to the state to allow objections from its neighbors.
The peasant antipollution movement lost all momentum amid the patriotism and repression of the Russo-Japanese War around 1905, but soon afterward miners rioted, a landmark event in Japanese history that was in part motivated by pollution grievances. Nearly 500 Ashio households were banished, ending popular protest, and Ashio plagued its remaining neighbors in relative peace. Ashio installed desulfurization equipment in the mid-1950s and closed the mine after 1970. In a landmark judicial case in the mid-1970s, local farmers won millions of dollars in compensation for a century of air and water pollution. The Watarase River basin had served its purpose as a sacrifice zone in the industrialization of Japan.

Japanese and Indonesian Forests

Japan's retention of forest cover depended on timber imports from Indonesia, among other places. From about 1780 to 1860, careful forestry had checked deforestation, but by the late nineteenth century, with the Meiji government's intense drive to industrialize the country, Japan's forests came under pressure once more. By 1900 virtually no old-growth forest remained, and efforts to safeguard the remaining forests fell prey to militarization in the 1930s, then to the bombed-out cities and extreme fuel shortages of the war years and the reconstruction after 1945. After 1950 a great reversal took place: Japan shifted away from charcoal and fuelwood to fossil fuels (charcoal use was banned in the 1960s) and began to import timber on a large scale, hardwoods from Oregon, Washington, British Columbia, and Siberia, and softwoods from Southeast Asia. Japan became the world's largest importer of timber and pulpwood, and by the 1980s forests covered a larger share of its national territory (about 70%) than in any other temperate country except Finland. Indonesia once featured widespread tropical forests, but its rich volcanic soils encouraged population and agricultural growth, and by 1930 wet-rice cultivation and the cutting of teak (one of the world's most durable timbers) had eliminated much of Java's natural forest. Japanese occupation in the early 1940s led to record-high teak and fuelwood extractions, and the war, occupation, revolution, revolts, and civil war that followed trimmed back Java's forests further, but the larger Indonesian islands held plenty of tall timber until 1965, when new leadership and an ideology that paved the way for capital-intensive forestry provoked a feverish assault on Indonesia's outer islands, the profits of which helped shape and maintain the Indonesian state. Indonesia soon became the world's largest exporter of tropical timber and, after new laws banned log exports, by the 1980s of plywood too.
Despite laws requiring replanting, by 1990 about one-third of Indonesia's forests had vanished. When the UN issued a nonbinding resolution suggesting a ban on tropical logging by the year 2000, loggers redoubled their efforts to get assets to market in time. Then huge fires in the late 1990s turned remaining trees into ash and smoke.

Green Belt Movement -Wangari Maathai

Kenya's Green Belt movement, dedicated to tree planting, was organized by the National Council of Women of Kenya in the late 1970s. In the 1980s this massive movement was led by a woman, Wangari Maathai, a former professor of veterinary anatomy. Ordinarily such grassroots environmental movements were embedded in peasant protest or some other social struggle. When strong enough, they won some concessions from governments; when not, they hardened anti-environmental attitudes among those in power, inadvertently inviting elites to equate environmentalism with the undermining of power, even treason. The Green Belt movement grew strong enough to make an impact on the land and to provoke a backlash: it had planted some 20 million trees in Kenya by the mid-1990s, but government spokesmen wrote and spoke abusively about Maathai, and government thugs roughed her up more than once for her efforts.

Libyan Great Man-Made River

Libya is a big country with a small population. Southern Libya, in the heart of the Sahara desert, lies on top of vast amounts of fossil water. When Libya was an Italian colony in the 1920s, petroleum engineers searching for oil found only aquifers, to their disappointment; after Libyan independence in the early 1950s, American oilmen found more aquifers. But the water lay far from any center of population, so it stayed put until a revolution around 1970, when an American billionaire in the petroleum business was convinced to help build pipelines to deliver the Saharan water to the Libyan coast. A good share of Libya's oil revenues (about $25 billion) went into the Great Man-Made River, a system of two major pipelines buried under the sands, capable of delivering water equal to about 5% of the Nile's flow. Coastal regions could now raise crops on a scale quite impossible before the river began to flow in the mid-1980s. The water cost 4 to 10 times as much as the value of the crops it produced, a massive money loser for Libya. After the collapse of oil prices in the early 1980s, Libya had trouble footing the bills, but Korean and American construction firms did well by the project, and it enjoyed national support, which explains Libya's persistence with it despite its economics and the troubles it caused with Egypt and Chad, both of which objected that the river might poach their water.

London - dirty to clean

London was the world's biggest city in 1900, home to 6.6 million people, with 700,000 chimneys and a few thousand steam engines, all burning coal. Judging by its fogs, London's air was foulest from about 1870 to 1900; one fog around 1870 caused people to blindly walk into rivers. Several thousand people died prematurely on account of London's fogs in this period, generally from aggravated lung conditions. Smoke abatement made some progress and remained the focus of antipollution efforts up to 1950. London's spatial expansion and more efficient industrial combustion helped disperse and control pollution, but coal burning itself, given coal's dominance at the time, remained untouched by reformers before 1950. In 1952, however, chilly weather, stagnant air, and a temperature inversion produced a six-day fog that reduced visibility to zero and made breathing uncomfortable even for the healthy; it killed some 4,000 people, more Londoners than anything else in the twentieth century except the influenza pandemic. This led to the Clean Air Act of 1956, which sharply regulated domestic coal smoke and helped London switch to gas and electric heat, whereupon the smoke problem shrank to insignificance. Sulfur emissions, although unregulated until around 1970, fell 90% thanks to this fuel shift. Ironically, the clearer air allowed sunshine to penetrate city streets, where it reacted with tailpipe emissions to form photochemical smog.

Green Revolution in Agriculture (Where did it work better, where did it not?)

Mechanization fit well with the Green Revolution, a crucial departure in agriculture that depended centrally on plant breeding. The Green Revolution was a technical and managerial package exported from the First World to the Third beginning in the 1940s, but making its major impact in the 1960s and 1970s. It featured new high-yielding strains of staple crops, mainly wheat, maize, and rice. Plant geneticists selected these strains for their responsiveness to chemical fertilizer and irrigation water, for their resistance to pests, and eventually for their compatibility with mechanized harvesting. Success required new inputs, new management regimes, and often new machines. Like the political revolutions of the twentieth century, the Green Revolution drew intellectually mainly from the Western world, changed its forms when it spread elsewhere, and led to unexpected consequences. American farmers and plant breeders had been hard at work on hybrid maize, hoping to concoct higher-yielding and disease-resistant strains, and by 1970 over 99% of US maize acreage was sown to hybrids, with yields 3 to 4 times 1920 levels. By the 1990s three-quarters of the Third World's wheat and rice area was under the new high-yield varieties. The dissemination of new breeds amounted to the largest and fastest set of crop transfers in world history. Ecologically, the Green Revolution combined with mechanization to encourage monoculture: farmers purchased seeds rather than saving their own, required fertilizers and pesticides specific to a single crop, saved money on inputs by buying in bulk, and created new resistant pests. The necessary irrigation helped drive huge dam-building programs, and thousands of strains of wheat and other crops dwindled to a few. In places such as Mexico and the Indian Punjab, the Green Revolution strongly favored farmers with reliable access to credit and water.
The Green Revolution promoted income inequality among farmers, and social frictions intensified into overt class and ethnic or religious conflict. The literature suggests that the social effects proved more favorable in lands raising rice than in those raising wheat. South Korea, China, India, and to a lesser extent Mexico improved their agricultural balance of payments, reduced or eliminated food dependence, and, whatever the ecological or social costs, improved their international economic and political position. Countries that could not create favorable conditions for the Green Revolution, those with too little water or underdeveloped credit markets, suffered in comparison. Broadly speaking, this meant sub-Saharan Africa sank in the scales against Asia and Latin America. The Green Revolution helped in the industrialization drives of Taiwan, South Korea, Indonesia, and other "Asian tigers." It made India a food exporter. But while it made Third World agriculture more land- and labor-efficient, it could not match the productivity increases ongoing in the West and Japan. By 1985, agriculture in the West was 36 times more labor-efficient than in the Third World, which switched around this time from being a net exporter to a net importer of food. Thus the geopolitical effect of modern changes in agriculture improved the relative position of the West and Japan slightly, that of China, the Asian tigers (South Korea, Taiwan, Malaysia), and Latin America even more slightly, while contributing to the relative decline of Soviet status and to the weakness of Africa.

MAP

Mediterranean countries were exceptional after 1975 for developing the Mediterranean Action Plan (MAP) under the guidance of the UN Environment Programme, when basically all Mediterranean countries convened and agreed to an ongoing process of environmental management for the entire basin. The plan continues to support scientific research and integrated development planning and has produced several agreements and protocols to limit pollution, although enforcement normally left something to be desired. For example, over 1,000 miles of coastline were "sacrificed" to development through lax enforcement or special dispensation. But the plan, together with national regulations and EU restrictions, helped limit Mediterranean pollution after 1975 and helped in the construction of sewage treatment plants in cities large and small. While the sea 20 years later was more polluted than when MAP began, it surely would have been much more so without MAP. Any accord among pairs of sworn enemies must rank as a high political achievement. Scientific wisdom carried unusual weight because hundreds of billions of tourist dollars were at stake, and no country could achieve clean beaches alone, given the circulation of the Mediterranean.

Mexico City

Mexico City lies at over 7,000 feet, so cars run inefficiently, generating more pollutants, and oxygen is scarce, intensifying the adverse health effects of ozone and carbon monoxide. The city had 350,000 residents in 1900 but 20 million by 2000. In 1930 it held 7% of the nation's industry; by 1980, over 30%. By 1990 the bowl held 30,000 industries, of which 4,000 burned Mexico's high-sulfur fossil fuels. Increased motorized transport, combined with a government whose fossil-fuel interests discouraged energy conservation, produced what is considered the world's worst urban air pollution problem. In the 1970s sulfur dioxide levels ranged from 1 to 4 times the World Health Organization's guideline, occasionally reaching 10 or 15 times. Greater use of natural gas helped during the city's growth in the 1980s, but dust and soot grew thicker and lead in the air doubled, which prompted the introduction of low-lead gasoline. Unfortunately, the additives it used compounded the ozone problem. Schools closed for ozone alerts as people gasped, wheezed, and died. Vegetation on the surrounding mountains suffered, affecting the water balance of the city. Birds fell out of the sky mid-flight in 1985. The policy response to air pollution in Mexico City earned two United Nations awards for antipollution efforts. By 1990 the air had by some measures improved, and sulfur dioxide, carbon monoxide, and lead concentrations no longer regularly exceeded guidelines, but ozone remained a substantial problem. Centuries of history are not easily undone by public policy.

Nile Perch

The Nile perch, a long and heavy predator nicknamed "the T. rex of freshwater fish," made its way into Africa's Lake Victoria, introduced by someone who fancied it as a sport fish. It proceeded to eliminate over half the fish species in the lake, about 200 cichlid species, in the largest vertebrate mass extinction in recorded history. Cichlids offer some of the best surviving evidence of how new species evolve, so their loss devastated evolutionary biologists. The total fish catch, mainly Nile perch, rose sharply after the 1970s, inspiring fish processing plants and an export trade, mostly to Israel. The ecological changes in Lake Victoria favored the big operators, because it takes a bigger boat to catch Nile perch than the canoes of the small-scale fisherfolk, mainly women, who had previously caught cichlids.

New Caledonia

Modern mining can alter landscapes and lives for miles around, as it did on New Caledonia, a French island the size of New Jersey in the southwest Pacific, east of Australia. Between a quarter and a third of the world's known oxidized nickel lay under its mountain summits. By the 1920s, mining with picks and shovels by immigrant labor allowed New Caledonia to lead the world in nickel production, a business that lasted until the 1990s. Between 1890 and 1990, half a billion tons of rock were moved, by beheading mountains and opencast mining, to get 100 million tons of ore and 2.5 million tons of nickel. Streams filled with silt and debris, making fishing and navigation impossible. Floods and landslides destroyed lowlands, dumping gravel on farmable lands and destroying coconut groves. Silt smothered offshore corals in one of the world's largest lagoons. Many people lost their livelihoods, homes, and lands in the first decades of nickel mining. Smelters built locally filled the air with smoke and noxious gases. After 1950, bulldozers, hydraulic shovels, and 40-ton trucks replaced picks and shovels, and the scale of production increased 10-fold by 1960 and 100-fold by the mid-1970s, driven by Japanese industrial expansion and the Cold War. Independence struggles and political violence wracked New Caledonia in the 1980s, when the French government began to impose environmental regulations on active mines, but the pollution, erosion, and siltation from abandoned mines will continue for decades, if not centuries. After 1960, and particularly after 1980, similar stories unfolded around the great mines elsewhere in this Melanesian region.

Cubatão - dirty to clean

Nestled between the sea and a steep slope, Cubatão, Brazil, consisted chiefly of banana plantations and mangrove swamp in 1950, but because of its hydroelectric potential and port access it soon became a target for state-sponsored industrialization, which succeeded all too well during Brazil's boom years of the 1960s and 1970s. By 1980 it was producing 40% of Brazil's steel and fertilizers and 7% of Brazil's tax revenues. It then earned the nickname "Valley of Death" because 35% of its infants died before their first birthday. As soot and dust worsened, the town became unsuitable for birds, insects, and trees. Laboratory rats placed in the poorest, and thus most polluted, part of town survived only with damage to their respiratory systems. Acid rain killed vegetation, bringing landslides, and communities had to be evacuated. Cubatão was thought to be the most polluted place on earth, and after much denial, authorities responded to media harassment, citizen protests, and several deadly industrial accidents once military rule ended in 1985 and those who questioned the state development model grew less fearful. Under the impact of regulations, fines, and new technologies, pollution levels dropped to only 20 to 30% of their previous levels by 1990. By the late 1990s, trees had returned to the slopes above the city and carp swam in the effluent pools of some of the chemical factories. In Cubatão the modernizing state created the industrial pollution, but, once democratized and duly pressured, it also tamed it.

Niger Delta Oil

Nigeria's oil lay in the Niger delta, home to about 6 million people in 1990. Shell-BP, which had been granted exploration licenses by the British colonial government, struck oil around 1955. Production began in the 1960s, and a refinery built in 1965 stimulated it further. Shell-BP backed the victorious central government in the civil war around 1970, in which southeastern Nigeria attempted to secede and take the oil revenues with it. After the price hikes of the 1970s, Shell-BP pumped out oil while Nigeria corruptly pretended to comply with the cartel's rules. Leaks, spills, and perhaps sabotage splashed oil throughout the delta, fouling the fisheries and farms of local peoples, notably the half-million-strong Ogoni. Their protests and rebellions were met with intimidation, force, show trials, and executions. By the 1990s Nigeria's military government derived up to 90% of its revenue from oil, much of which fed personal fortunes. Around 1990 the UN declared the Niger to be the world's most ecologically endangered delta. In 1995 Shell-BP began to address environmental and other complaints. Nonetheless, the Niger delta at the end of the century, like Tampico at the beginning, was a zone of sacrifice, because local peoples lacked the power needed to resist the energy regime.

Norilsk

The Norilsk nickel smelters in northwest Siberia were part of a giant metallurgical complex built by gulag labor after 1935 and run by Stalin's secret police. Norilsk grew to be the largest city in the world above the Arctic Circle and a bulwark of the Soviet military-industrial complex. Its pollution killed or damaged a swath of forest half the size of Connecticut and filled residents' lungs, contributing to severe health problems even by the standards of the late Soviet period. The men of Norilsk suffered the highest lung cancer rates in the world. In the 1980s the smokestacks of Norilsk spewed out more sulfur dioxide than did all of Italy.

Veracruz/Tampico, Mexico

Oil lay under the rainforests of Veracruz along the shores of the Gulf of Mexico. Here the capital came from American and British firms, the equipment often secondhand from Texas, and the labor from Texas and from the local indigenous population. To oilmen and to successive Mexican governments, the indigenous population's rainforest ways seemed backward and pointless, so widespread drilling began around 1905. With the Mexican Revolution of the 1910s, ambitious new leaders saw in oil a way to propel Mexico forward. Boosters thought that northern Veracruz could support 40 million people, if only trees and Indians would make way for oil and oilmen. The Allies floated to victory in WWI on a wave of oil, much of it from Tampico. Mexico stood second in world oil production by 1920, peaking shortly after. Oil recast both ecology and society in northern Veracruz almost overnight. Tampico and the surrounding region went from sleepy, swampy port to a hive of industry with nearly 100,000 people. Through spills, leaks, blowouts, and fires, the oil business ruined the land. Salt water seeped into the oil fields, causing complications, and with US and Venezuelan oil fields flourishing, the Mexican government nationalized the oil industry around 1940 and forbade exports of crude. Foreign companies boycotted Mexican oil anyway. Production plummeted, and low forest slowly recolonized much of the oil fields.

Cassava and Cassava Mites

One shining success in biocontrol restored Africa's broad cassava belt to health in the mid-1990s. Cassava, a Brazilian root crop imported to Africa in the sixteenth century, by 1990 supported some 200 million Africans. In 1970 a cassava mite, also Brazilian, began to colonize Africa's cassava fields, spreading in all directions from a secure position in Uganda. Unlucky farmers lost half their crops, and no pesticides or other control techniques worked. Then, after 10 years' search in South America, entomologists found the answer: another mite that ferociously pursued and ate the cassava mite. Released first in Benin in 1993, the predator mite quickly checked the havoc caused by the cassava mite, raising cassava yields in West Africa by about a third.

Los Angeles

Photochemical smog made its debut in human consciousness in Los Angeles in the early 1940s, when it was at first mistaken for a Japanese gas attack; its intensity was heightened by LA's setting between the mountains and the sea. The cheap energy characteristic of the fossil fuel age, along with cheap water, allowed the growth of the American Southwest, swelling LA's population from 100,000 in 1900 to 6 million by 1960. In the 1940s, like other American cities, LA dismantled its system of public trains to make way for cars. LA's auto population quadrupled between 1950 and 1990 to 11 million, in a setting made for smog. By 1950 the city had created air quality boards that regulated refineries, factories, and finally cars. By the 1960s, smog stunted tree growth 50 miles away. After more stringent regulations in the early 1970s, ozone and smog decreased by about half in the LA basin, despite an increase in cars and driving. Yet in the mid-1970s, LA's air still reached officially unhealthy levels 3 days out of 4. In the 1990s LA smog remained a regular health hazard, the most serious urban air pollution problem in the US.

Pittsburgh - dirty to clean

Pittsburgh had pollution problems from coal, like other American cities, and from around 1870 forward it enacted smoke abatement laws. But the city remained smoky and sulfurous until 1940. Coal took off around 1760; steel took off around 1880, using 3 million tons, or 5%, of the nation's coal. Around 1890, natural gas drove down coal use and the skies cleared, but when gas supplies ran out, smoke returned with a vengeance. Military orders boosted production around 1940, but, thanks to the example of St. Louis, Pittsburgh enacted effective smoke prevention laws, which took effect for industry and homes closer to 1950, despite objections from coal interests, the United Mine Workers, and railroads. Alternative energy sources were found, and by around 1950 Pittsburgh's air was significantly cleaner. The steel industry collapsed after the mid-1970s, the population declined, the air got even cleaner, and in 1985 Pittsburgh was rated America's most livable city.

Rwanda

Rwanda is a zone of highland terraces in east-central Africa, with rich volcanic soils, abundant rain, and a comparatively mild disease regime that supported unusually dense rural populations in modern times. Before 1800 these slopes carried forest and minimal human population, but gradually pioneer cultivators made their way up into the lower hills. In the twentieth century, migrants pushed farther west and farther up due to population pressure and politics. Belgian authorities obliged Hutu peasants to cultivate larger areas of land and to adopt crop rotations that left soils without plant cover during rainy seasons. Anti-erosion practices familiar to Hutu farmers were lost as the Belgians sought to maximize food production. As peasants cleared new lands and shortened fallow periods, erosion problems mounted. Belgian officials began to take note of rapid erosion in the 1920s and 1930s and imposed forced-labor soil conservation schemes. Erosion in Rwanda was declared a matter of life and death around 1950, when the spread of bananas as a cash crop, which provide good soil cover, helped mitigate erosion. Nonetheless, rapid soil erosion continued in many parts of Rwanda. Population grew after independence around 1960, providing sufficient labor to attend conscientiously to soil conservation, and by the 1980s some slopes had stabilized while others eroded even faster. The civil war and its aftermath in the mid-1990s reduced the rural population sharply, but Rwanda still has one of the highest rural population densities in Africa.

Chernobyl

Scores of mishaps beginning in the late 1950s climaxed at Chernobyl (in Soviet Ukraine) in 1986, by far the most serious nuclear accident. There, human error led to an electrical fire and explosions that nearly destroyed one reactor. Over 30 people died quickly, while untold numbers have died (and will yet die) from Chernobyl-related cancers, primarily among the 800,000 workers and soldiers pressed into cleanup operations, but also among local children whose thyroid glands absorbed excessive radiation. About 140,000 people had to leave their homes indefinitely, though some later returned. The total release of radiation, officially put at 90 million curies, was hundreds of times greater than that given off by the bombs at Hiroshima and Nagasaki, which themselves continued to cause health problems for decades after detonation. Everyone in the Northern Hemisphere received at least a tiny dose of Chernobyl's radiation. The accident, and the initial denial and cover-up, knocked one of the last props out from under the Soviet Union. It completely changed the public perception of nuclear power plants around the world, especially in Europe; only France, Belgium, Japan, South Korea, and Taiwan showed continued interest in nuclear power after Chernobyl. Some nuclear wastes, and part of Chernobyl's fallout, will be lethal for 24,000 years, easily the most lasting human mark of the twentieth century and the longest debt on the future that any generation of humanity has yet imposed.

Lake Washington/Seattle

Seattle's raw sewage caused eutrophication (unusually high quantities of nutrients such as nitrogen and phosphorus, resulting in excessive growth of aquatic plants and bacteria and the depletion of dissolved oxygen), and small algal blooms appeared in Lake Washington in the 1930s. The problem abated when the city diverted its sewage to Puget Sound, an inlet of the Pacific Ocean, in the mid-1930s, but in the late 1940s suburban growth renewed the problem, and by the mid-1950s algal blooms decorated the lake. Political tussles ensued, but by the mid-1960s the suburbs too sent their sewage to Puget Sound, and Lake Washington cleared up again. Puget Sound's size, regulations on phosphate additives, and improved sewage treatment prevented the Sound from suffering Lake Washington's fate.

Japanese Environmental Miracle

Several convergent forces allowed Japan to change environmental course without derailing the engine of economic growth. The most important were a system of local government that responded to local concerns, an energy shift away from coal, widespread prosperity that encouraged citizens to question the necessity of pollution, and an extraordinary rate of capital accumulation that allowed industry to spend on pollution control when obliged to. Ube took effective action first, in the early 1950s, once the health consequences of its foul air could be demonstrated academically and a leading industrialist convinced his peers that Ube could clear its air. Bureaucrats composed new regulations on emissions, chiefly on dust and smoke, and by the 1960s Ube's air was a transparent shadow of its former self. Elite-driven reform worked, and Ube's regulations became the basis for a national smoke-and-soot law, which, however, did nothing to curtail emissions of sulfur dioxide or heavy metals. Air pollution levels in Japan by the late 1960s were worse than ever before, and citizen action drove the next phase of reform. Around 1970, asthma sufferers sued a giant petrochemical complex and won large damages amid great publicity, resulting in tighter sulfur emission standards. Women also pressured authorities into pollution abatement, and planned industrial complexes of the later 1960s were canceled over pollution and health concerns, an extreme departure from previous mentality and practice in Japan. Pollution control entered the main stage of national politics by 1965, well-publicized lawsuits continued, and people and prefectural authorities learned more and more about pollution and its control. A spate of antipollution laws passed after a particularly bad smog episode in Tokyo in 1970, creating a new agency to monitor environmental affairs. Thanks to regulatory pressure, new Japanese cars in 1980 emitted only about 10% as much pollution as new cars in 1970.

Soil Pollution in Japan: Kosaka Mine, Jinzu River Valley

Surges of mining, smelting, refining, and metal use in Japan in the late nineteenth century brought acute heavy-metal pollution in the early twentieth century, particularly copper contamination. The heavy metals contaminated rice paddies in Japanese river basins, reducing rice yields and often provoking farmer protests; aggrieved farmers surrounded the smelter of the Kosaka mine in armed force, forcing it to surrender. In the Jinzu River valley, hundreds of cases of a bone disease whose name translates as "ouch-ouch" (itai-itai) were a consequence of cadmium poisoning. After 1950, the Korean War jump-started heavy-metal production and pollution, and by the 1970s Japan led the world in zinc and cadmium production, which contaminated irrigation water used on rice paddies; cadmium in particular is easily absorbed by rice plants. As a result, by 1980 about 10% of Japanese rice was unsuitable for human consumption due to soil pollution. All of this killed hundreds of Japanese and sickened thousands in the twentieth century, making heavy-metal soil pollution more serious in Japan than anywhere else.

The Green Run

The American weapons complex involved some 3,000 sites in all. The US built tens of thousands of nuclear warheads and tested more than a thousand of them. Around 1950, shortly after the Soviets exploded their first atomic bomb, the Americans conducted a secret experiment at Hanford, in Washington State, where the plutonium for the bomb that destroyed Nagasaki had been made. Wondering how quickly the Soviets were able to process plutonium, American officials decided to run "green" uranium, out of the reactor for less than 20 days, to test their hypothesis. The Green Run, as it was known to those in on the secret, doused the downwind region with radioactive iodine at levels varying between 100 and 1,000 times the limit then thought tolerable. The local populace learned about these events around 1985, when Hanford became the first of the US nuclear weapons complexes to release documents concerning the environmental effects of weapons production. The Green Run shows the environmental liberties the Americans took under the influence of Cold War security anxiety. But it was just the tip of the iceberg, because a half century of weapons production around the US left a big mess. Partial cleanup is projected to take 75 years and cost up to $1 trillion, the largest environmental remediation project in history. Full cleanup is impossible. More than half a ton of plutonium is buried around Hanford alone.

Atlantic Forests of Brazil and US

The Atlantic coastlands of the Americas in 1500 supported great forests, but after 1500 in Brazil's coastal forest, and after about 1600 in North America's eastern woodlands, a long siege by colonial settlers began. By 1910 North Americans had occupied all the promising farmland of the eastern part of the continent and cut all the good timberlands. A quarter of the timber cut in 1900 became railroad ties, which before 1920 had to be replaced every few years. In all, more than half of these eastern forests disappeared between about 1600 and 1920. After about 1920, however, the eastern woodlands returned, because after 1840 midwestern and Ontario farms had driven older eastern ones out of business, and farm abandonment in the East proceeded rapidly as railroads spanned the continent. Then, from the 1930s, more marginal cropland was abandoned to forest regrowth as farming yields rose, while total use of forest products declined: fuelwood use plummeted after the mid-1930s, iron, steel, and plastic replaced wood in many uses, and fire suppression lowered losses to forest fires by about 90% between 1930 and 1960. So the eastern woodlands regenerated, and total forest area in North America stabilized after about 1920, as eastern regrowth roughly equaled clearances in the West. In Brazil things worked out differently. Agriculture, especially sugar and coffee, slowly chipped away at the coastal forests, as did railroads, mines, fuelwood removals, and, by the late 1920s, the timber trade, so the coastal forests shrank faster and faster. The land tenure system, paired with population growth, assured a steady supply of landless peasants. Brazil had almost no fossil fuels, so fuelwood and charcoal consumption stayed high, and after 1950 dam building flooded out additional forest. By 1990, only 8% of Brazil's Atlantic coastal forests remained. The technological and social changes that halted forest destruction in North America did not happen in Brazil, or happened too faintly.
Brazil's government systematically opened up Amazonia to colonization in the 1960s. Between 1960 and 2000, roughly 10% of Amazonia, the world's largest rainforest and its most botanically diverse province, became pasture, farmland, or scrubland.

The Ganges

The Ganges River Basin drains a quarter of India. It contained about 100 million people in 1900, of whom perhaps 10 million dumped their waste directly into the river. The Ganges' smelly condition gave rise to one of the world's first antipollution societies in the 1880s. By the 1990s, 450 million people lived in the basin, and some 70 million discharged their wastes, untreated, into the Ganges, where fish suffocated from lack of oxygen. Because of population growth, the Ganges suffered 5 to 10 times more from biological pollution at the end of the century than at the beginning. The same could be said of hundreds of other rivers, but the Ganges is unusual in that it acquired pollution for sacred as well as profane reasons. Hindus believed that the Ganges was a divine gift to wash away sins, which attracted the elderly and the sick, and that cremation at crematoria along the river ensures liberation of the soul, so millions of tons of ash were deposited into the Ganges every month, along with bodies and animal carcasses. Government cleanup efforts in this bacteriological nightmare, from the 1960s to the 1980s, had little effect.

Hanshin Region (Kobe, Osaka, Kyoto)

The Hanshin Region hosted more heavy industry than anywhere else in Japan except possibly greater Tokyo. After 1880 the region bristled with new iron, steel, cement, and chemical plants, and its population grew significantly. Victory in the Russo-Japanese War around 1905 seemed to justify Japan's industrial strategy and redoubled government commitment to it. Hanshin's industries spilled out onto residential and farming districts, causing acute social stress. Smoke and sulfur dioxide poured onto millions of people, and, as in the Ruhr, national interest justified intense pollution. Osaka, a coal city like London or Pittsburgh, began monitoring air pollution around 1910. Hanshin boomed after Japan acquired German colonies in WWI, and despite a 1925 law that required smoke-prevention equipment on urban buildings, air pollution worsened. Osaka doubled in population and expanded in area in the 1920s, and a 1930 law designed to increase combustion efficiency and reduce smoke proved ineffective, since only three smoke inspectors covered 35,000 smokestacks. Smoke levels doubled in the 1930s, and airplanes crashed on account of reduced visibility, but still no pollution restrictions carried weight, as Japan's overriding priority became war production. Smoke, ash, dust, and sulfur bathed the region until American air power pounded Hanshin's industry to rubble in the mid-1940s. As in the Ruhr, American geopolitical anxieties required the resurrection of the destroyed industry. Dustfall in Osaka, in 1945 only a quarter of 1935 levels, surpassed prewar heights by 1955. Automobiles added to the dark clouds, especially after 1970 as the region nearly fused into a single motorized urban area. The Hanshin region was only one of several places severely polluted during Japan's economic miracle, but by the late 1960s the air all over Japan was beginning to clear.

Ogallala Aquifer

The Ogallala (or High Plains) Aquifer is a body of water roughly equal in volume to Lake Huron or Lake Ontario, stretching from Texas to South Dakota. It is in effect a very slow underground river, dripping a few inches per day southeastward through a gravelly bed at a depth of about 300 feet, and has been doing so for 10,000 to 25,000 years. As the searing drought of the 1930s sharpened the thirst of the High Plains and postwar opportunity deepened it further, the newly accessible and reliably constant Ogallala seemed an answer to the prayers of Dust Bowl farmers. Use quadrupled between 1950 and 1980, spurred by further droughts. By the late 1970s, Ogallala water accounted for a fifth of the irrigated area in the US. A good chunk of the country's wheat, corn, alfalfa, and even cotton depended on it, and nearly 40% of the nation's cattle drank Ogallala water and ate grain produced with it. Water was drawn out about 10 times faster than nature replaced it, draining the aquifer at a little under 1% a year, and speculators capitalized on and profited from the previously unsought dry land above it. Farmers soon had to drill deeper and deeper to reach water, and many found the costs did not justify the results, so irrigation growth slowed in the mid-1970s and the irrigated area contracted in the mid-1980s. Since the late 1970s the states have come to agreements as to who gets how much Ogallala water, and extraction rates stabilized, but did not decline. Across the High Plains, 150,000 pumps work day and night during the growing season. A 1970 prediction of 300 years' worth of water had turned into less than 30 by 1990. Half of the accessible water was gone by then; while it took many millennia to fill, humankind will almost surely drain it in less than a century.

Palliser Triangle Region

The Palliser Triangle of western Canada is a semiarid wheat belt in the prairie provinces. It belonged to nomadic Indians and the buffalo before the Canadian Pacific Railway went through in the 1880s, bringing a few hopeful settlers. After a run of rainy years, population grew about 15-fold between 1900 and 1915. High wheat prices around the world then inspired still more railroads, towns, and settlers from eastern North America and Europe. They sought to preserve soil moisture by leaving fields unsown in summer, but this exposed bare soil to the wind, and in dry years serious wind erosion resulted. Droughts hit, dust storms darkened the skies, and about 15,000 square miles of farmland were essentially destroyed. Dust blew east into Ontario and out over the Atlantic in the mid-1930s. Social and economic distress matched that of the better-known American Dust Bowl, spurred the success of unorthodox politics in the form of socialist and populist movements, and caused many to flee. The sad tale of farming in the Palliser Triangle was one of boom, erosion, and bust.

Philippines

The Philippine Islands are for the most part steep and subject to heavy monsoon downpours, and thus prone to erosion even without human intervention. Cash cropping in the northernmost region of the islands encouraged forest clearance and cultivation from about 1880; after 1898, when American forces drove Spain from the Philippines, the U.S. army generated regular demand and good prices for food crops. On remoter islands in the central Philippines, subsistence needs and international politics, rather than the market, inspired land clearing, as American occupation favored population growth and plantation development. Peasants cleared forests, and rains stripped the soils, which eroded especially quickly after about 1920. By 1950, in many areas, there was no soil left to erode. Contour plowing and agroforestry helped stem erosion from the 1970s, but when timber companies arrived, mainly after 1960, erosion accelerated to the point where around 1990 the World Bank considered it the most acute environmental problem in the country.

The Rhine River Basin -Sandoz Disaster

The Rhine flows about 800 miles through Europe and was filled with life before urban wastes made it smell of filth. After 1880, mounting chemical pollution added to the mix, thanks in part to the proximity of coal and iron deposits in the Ruhr valley. By the 1910s the Rhine's pollution load was heavy and fish had nearly disappeared. The river got a brief respite in the 1940s, but the 1950s and 1960s worsened its condition. By 1980, about 20% of the world's chemical production took place in the Rhine basin. Between 1900 and 1980 the concentrations of heavy metals in Rhine sediments increased 5-fold for chromium, 2-fold for nickel, 7-fold for copper, 4-fold for zinc, 27-fold for cadmium, and 5-fold for lead, while salt concentrations rose 6-fold. Together with nutrients that stimulated algae growth, and with additions of DDT and PCBs, this pollution clogged pipes and depleted the river of oxygen and thus of life. With a dense population, densely packed heavy industry, and chemically dependent agriculture in its basin, the Rhine bore most of the pollution burdens a river can have. Cleanup efforts included sewage treatment after WWII, a German requirement that detergents be biodegradable, and pollution restrictions in the 1970s. Most heavy-metal concentrations in the river, though not in its sediments, declined sharply after 1975, and fish populations then rose. More effective action followed the disastrous fire at the Sandoz chemical warehouse in Switzerland in 1986. Firefighters sprayed water on the warehouse, washing pesticides, herbicides, and fungicides into the Rhine and killing virtually everything for 100 miles downstream. Although most aquatic life recovered within 2 years, the affair concentrated the attention of ministers and captains of industry as never before. Regulations, incentives, and enforcement of all sorts followed.

The Ruhr Region - dirty to clean

The Ruhr Region in Germany is small, but beneath it run some of the world's biggest coal seams. In 1850 it was an agricultural area; by 1910 it produced 110 million tons of high-sulfur coal, employed 400,000 miners, and sustained the giant iron- and steelworks crucial to the German military-industrial complex. Industry so important to the German state escaped almost all regulation, so air pollution from smoke, soot, and sulfur dioxide attained gigantic proportions. In 1900 the Ruhr was the biggest industrial region in Europe and probably the most polluted; without it, Germany could scarcely have fought World War I. When French and Belgian troops occupied the Ruhr in the 1920s after Germany failed to pay war reparations, strikes shut down industry and the skies suddenly cleared. Negotiations ended the strikes and pollution resumed, but official inquiries into the episode concluded that pollution was inevitable and that the Ruhr must adapt to it rather than try to limit it. New combustion technology permitted the use of lower-quality coal, pushing pollution to new heights. One new plant covered its surroundings in white ash within hours of firing up its boilers; protests brought the installation of metal filters, but acidic gases disintegrated them within days, and schools had to shut down for 18 months to accommodate the industry. Attitude, policy, and law followed this accommodation because of industrial domination: profits and jobs mattered more than pollution, and those whose interests suffered could not compete. Pollution increased rapidly during WWII, until the Allies leveled a large share of Ruhr industry. German defeat once again brought a lull in pollution in the mid-to-late 1940s, but recovery of Ruhr industry became essential as Europe needed German coal, iron, and steel for reconstruction. By the late 1950s Ruhr coal was subsidized, and by 1960 the Ruhr's air quality had become a national political issue.
Tall smokestacks went up and distributed pollution over wider areas in less concentrated doses. Politically this often soothed matters and delayed effective emissions reduction. In the Ruhr it took until the early 1980s before the accumulated evidence of sulfuric rain generated the political will to clamp down on emissions.

Mayak Complex, Ob River Basin

The Soviet nuclear program began under Stalin, who wanted atomic weapons as fast as possible, whatever the human and environmental cost. The Soviets had only one center for processing used nuclear fuel, at the Mayak complex in the upper Ob River basin in western Siberia, now the most radioactive place on earth. It accumulated 50 times more plutonium than Hanford. For about 8 years, from the early to mid-1950s, the Mayak complex dumped radioactive waste into a tributary of the Ob that was the sole source of drinking water for 10,000 to 20,000 people. After 1950 storage tanks held some of Mayak's most dangerous wastes, but in 1957 one exploded, raining about 40% as much radiation as Chernobyl later released down onto the neighborhood. After this, liquid wastes were stored in Lake Karachay; about 10 years later, a drought exposed the lakebed's radioactive sediments to the steppe winds, sprinkling dust carrying 3,000 times the radioactivity released at Hiroshima over an area the size of Belgium and onto half a million unsuspecting people. By the 1980s, anyone standing at the lakeshore for an hour received a lethal dose of radiation. The situation at Mayak was likened to 100 Chernobyls, but no one in the USSR knew its full extent because the nuclear complex was so large and so secret. Much of the complex was shut down in the last years of the USSR, but the mess remained, and Russia cannot afford much in the way of cleanup.

Mediterranean Sea

The growth of modern industry in many Mediterranean countries, the emergence of chemicalized agriculture, and the rise of human and animal populations sharply increased the basin's pollution load after about 1950, and a great deal of it ended up in the sea itself. The Mediterranean is the world's largest inland sea; in 1995 its catchment was home to about 200 million people divided among 18 countries. On average it takes about 80 years for the salty Mediterranean to flush itself out fully. It is rich in species diversity, home to about 10,000 animals and plants, but its total biomass and biological productivity are low because its waters are thin in nutrients. It is possible that the amount of pollutants dumped directly into the Mediterranean is now less than it was a century ago, but pollution also reaches the sea via rivers and through the air, and in greater quantity than ever before. The main pollutants in the Mediterranean were and are much the same as elsewhere in the aquatic world: microbes, synthetic organic compounds such as DDT or PCBs, oil, litter, and excess nutrients topped the list, with heavy metals and radionuclides less important. By the end of the twentieth century, only about 30% of the sewage entering the Mediterranean received treatment, and the total quantity had tripled or quadrupled since 1900, so the risks of gastrointestinal ailments, typhoid, or hepatitis to people bathing or eating seafood increased significantly. By the late 1980s beach closings had become routine, and by the 1990s about 10% of Mediterranean European beaches failed European Union standards, although many did not close. A 1980 estimate calculated that 800,000 tons of oil leaked into the Mediterranean each year, of which typically a third washed up on shore as tar, plaguing its beaches more than any others in the world. Oil slicks sometimes covered 10% of the sea's surface.
Around 1980 the Mediterranean absorbed a sixth of the world's oil pollution, largely through routine loading and tank cleaning, but industry contributed more to sullying the Mediterranean than oil did. Eutrophication in the Mediterranean derived less from the enormous amount of industry surrounding the sea than from agriculture and municipal sewage; the resulting algal blooms played havoc with fish populations, seabed life in general, and the tourist trade. Yet long stretches of coastline retained clean waters, and overall the Mediterranean remained cleaner than other seas because of its size, its lively mixing of deep and surface waters, and currents that helped dilute its pollution load.

Whaling in Southern Ocean

Whales enjoyed unusual peace for their first 50 million years, as they had few predators, until prehistoric humans brought death by spear and harpoon. Dutch and English whalers brought certain bowhead whale populations near extinction between 1610 and 1840. The Industrial Revolution quickened the hunt as sperm oil became especially useful for lubricating machinery and whalebone became the plastic of the nineteenth century. Americans carried the hunt to the wide Pacific and dominated it between 1820 and 1860, by which time most of the easily hunted whales (sperm and right whales) were gone; bowhead whales followed in the 1890s, bringing famine to native peoples of the Bering Sea coast. The survivors were mainly large baleen whales that swam too fast to be caught and sank when killed, including blue whales, the largest creatures in the history of life on Earth. The harpoon cannon allowed Norwegians to dominate modern whaling until 1950, along with the factory ship, or "seagoing slaughterhouse," which could take a 100-ton blue whale on board and render it into oil and bonemeal within an hour. The first factory ship sailed in 1925, eliminating the need for licenses from, and duties to, British authorities. Britain, Argentina, the US, Denmark, Germany, and Japan joined the hunt by the 1930s, the peak decade for catching blue whales; then the USSR displaced Norway in the Antarctic hunt. The profit derived mainly from whale oil, as millions of whales became, through ingenious chemistry, margarine, soap, and explosives. Sonar and aircraft added to the scarcity of blue whales and smaller whales alike. In the twentieth century whalers found the mother lode of whales in the Southern Ocean and depleted it species by species. By 1935 the clear decline in blue whales led to regulations overseen by the League of Nations, which had scant effect. The International Whaling Commission (IWC) followed around 1945; its members protected the price of whale oil, not whales, by allotting quotas among themselves.
In the 1960s, whale scarcity and conservationists' agitation drove most of the whaling fleets out of business, leaving the Japanese and the Soviets dominant. They, together with the Norwegians and Icelanders, proved ingenious at circumventing the moratorium enforced after the mid-1980s; a few thousand whales killed "for scientific purposes," and thus exempt from the moratorium, ended up in sushi bars. Nonetheless, populations of most whale species appeared to be growing after 1990, giving hope that whales might escape their brush with extinction. But even if the problems of an open-access resource are resolved, whales will never be far from extinction whenever pure economic logic takes precedence.

The Sulfuric Triangle

The triangle bounded by parts of Germany, Poland, and the Czech Republic sits on rich seams of brown coal, high in sulfur and ash. Early industrialization took advantage of these deposits, and of water power from the surrounding areas. By 1900 the region supported a considerable industrial establishment producing coal, iron, and steel, second in Europe only to the Ruhr. Czech forests showed the ill effects of a high-sulfur diet from the 1920s, and by 1940 the region contributed heavily to the Nazi war effort, until American bombing and Soviet artillery flattened most of its industry by the mid-1940s. But the coal remained, and heavy industrial development suited the interests and ideologies of the communist parties in charge by 1950, so heavy industry returned, expanded, and polluted the skies like never before, with no worry of resistance, in pursuit of economic growth. By the 1970s pollution attained gargantuan proportions, causing adverse health effects, genetic mutations, childhood developmental disabilities, and death. Unlike in the Ruhr, air pollution continued to climb in the 1980s, as coal subsidies and fuel inefficiency remained standard. East Germany generated more sulfur dioxide per capita than anywhere else in the world. Authorities in Poland initially denied that pollution could exist in a socialist economy, then found it necessary to make environmental information a state secret. Because the state owned the polluting firms, outrage at pollution was directed at the state itself rather than at individual enterprises, contributing to the demise of the communist states around 1990. In East Germany, reunification then brought investment, technological improvements, and quick gains in energy efficiency and pollution control, much of it achieved through greater use of natural gas. In the Polish and Czech lands improvement came from plant closings rather than new technology.

Zebra Mussel

The zebra mussel is a striped mollusc native to the Black and Caspian Seas. It hitched a ride to the Great Lakes in ballast water (carried to stabilize ships) in the mid-1980s. Around 1990 it was discovered in Lake St. Clair, between Lakes Huron and Erie. By the mid-1990s it had colonized all the Great Lakes, the St. Lawrence, and the Illinois, Ohio, Tennessee, and Arkansas Rivers, as well as most of the Mississippi. The zebra mussel filters water to feed, removing numerous pollutants and algae and leaving cleaner, clearer water wherever it goes. Preferring hard, smooth surfaces, it delighted in the industrial infrastructure of the Great Lakes region, building colonies so thick that they sank navigational buoys and clogged water intakes at factories, power plants, and municipal water filtration systems. By the early 1990s it had temporarily shut down a Ford Motor plant and a Michigan town's water supply, and it cost the US about a billion dollars a year. In dollar terms it threatened to become the most costly invader in US history, a distinction previously held by the boll weevil. The saga of the zebra mussel was but one episode in a costly biotic exchange between America and the Soviet Union.

Ozone Hole -Montreal Protocol

Up in the stratosphere, sunlight and oxygen react to form ozone, which absorbs some 99% of the UV radiation entering the atmosphere. This thin shield, which took eons to form, protected life on earth for about a billion years, until the invention of Freon, the first CFC. CFCs proved useful as refrigerants, solvents, and spray propellants, among other things, and made practical refrigeration and air-conditioning possible. When they drift into the stratosphere, however, UV radiation breaks them up, releasing agents that rupture ozone molecules. With emissions of CFCs reaching about 750,000 tons by 1970, an unnoticed assault on the ozone layer was in full career; in 1985 ozone depletion was confirmed over Antarctica, with mini-holes over Chile and Australia. The initial discoveries prompted unusually quick political reaction, largely because UV-B radiation kills phytoplankton, the basis of oceanic food chains, affects photosynthesis in green plants, causes cataracts and other eye ailments, suppresses immune response, and causes skin cancer. The US, Canada, and the Scandinavian countries banned CFCs in aerosol sprays in the late 1970s, but global CFC releases continued to climb. The United Nations Environment Programme then organized the 1985 Vienna Convention on ozone depletion, leading to the 1987 Montreal Protocol and amendments throughout the 1990s. Altogether this was an extraordinary international response to an extraordinary problem, sharply curtailing CFC production. Chemical manufacturers who had argued and complained quickly found substitutes, such as water or lemon juice as solvents. Worldwide CFC use declined roughly 80%, but because CFCs are so stable, those released before the Montreal Protocol will still be destroying ozone in the 2080s. The "ultraviolet century" in human history should prove to span roughly 1970 to 2070.
In the first 25 years of that ultraviolet century, about 1 to 2 million excess cases of skin cancer derived from stratospheric ozone loss, resulting in about 10,000 to 20,000 deaths, a toll far smaller than that of respiratory ailments from air pollution, although the full effects of excess UV radiation remain unclear.

Prickly Pear Cactus and Moth (Australia)

Up to the mid-1970s there were over 50 more or less successful cases of biological control of plants (along with many failures), one of which was the use of a moth against the prickly pear cactus in Australia. Prickly pear cactus arrived in Australia from the Americas as an ornamental garden plant but soon escaped, and by the 1920s it covered an area the size of Colorado or Italy. Australian scientists scoured the Americas for cactus eaters, and in 1925 found a moth, Cactoblastis cactorum, that rolled back the cactus frontier. Few biocontrol programs work this well, because few creatures are choosy enough in their eating habits, although as the science of biocontrol grew more refined, the probability of favorable results perhaps improved.

Ankara - Dirty to Clean

When Ankara became the capital of Turkey in the 1920s, its population began doubling every decade, and did so until 1980. Although it lies in a shallow bowl and has frequent wintertime temperature inversions, it had no notable air pollution problem until the 1960s. By 1970, when its population reached about a million, Ankara's emissions surpassed a threshold, and the city developed growing sulfur dioxide, smoke, and soot problems, derived mainly from power stations and household burning of lignite, a low-grade coal. High oil prices drove Turkey to develop its lignite, which is high in sulfur and ash, and Ankara's air became the worst in Turkey. By 1990 the city had 4 million people, half a million cars, and, in winter at least, air quality among the worst in the world. In the early 1990s, in hopes of joining the European Union, Ankara began to tighten pollution controls and convert to natural gas. Its air improved dramatically, despite continued urban growth.

Saudi Arabia and Wheat

When the oil booms of the 1970s sent billions of dollars toward the Arabian peninsula (and Libya), the Saudis invested some of the money in schemes to exploit their aquifers. By 2000 they drew 70 to 90% of their freshwater from underground. Although it takes a thousand tons of water (assuming not a drop is wasted) to raise a single ton of wheat, Saudi policy after 1975 was to grow wheat in the desert, at 5 times the international market price, in order to be self-sufficient in food. By the mid-1980s, Saudi Arabia exported wheat regularly. The Arabian aquifers scarcely recharge at all, so the Saudis hope that desalination of seawater will become a practical alternative and relax the historical constraint of water supply in Arabia, a bold gamble.
