Chapter 28


Chapter 28, Section 1

"The Economic Miracle" Among the most striking features of American society in the 1950s and early 1960s was a booming economic growth that made even the heady 1920s seem pale by comparison. It was a better-balanced and more widely distributed prosperity than that of thirty years earlier, but it was not as universal as some Americans liked to believe. Sources of Economic Growth Between 1945 and 1960, the gross national product grew by 250 percent, from $200 billion to over $500 billion. Unemployment, which during the Depression had averaged between 15 and 25 percent, remained throughout the 1950s and early 1960s at about 5 percent or lower. Inflation, in the meantime, hovered around 3 percent a year or less. Government Spending The causes of this growth and stability were varied. Government spending, which had ended the Depression in the 1940s, continued to stimulate growth through public funding of schools, housing, veterans' benefits, welfare, the $100 billion interstate highway program, which began in 1956, and above all, military spending. Economic growth was at its peak (averaging 4.7 percent a year) during the first half of the 1950s, when military spending was highest because of the Korean War. In the late 1950s, with spending on armaments in decline, the annual rate of growth declined by more than half, to 2.25 percent. The national birth rate reversed a long pattern of decline with the so-called baby boom, which had begun during the war and peaked in 1957. The nation's population rose almost 20 percent in the decade, from 150 million in 1950 to 179 million in 1960. The baby boom contributed to increased consumer demand and expanding economic growth. Suburban Growth The rapid expansion of suburbs—the suburban population grew 47 percent in the 1950s, more than twice as fast as the population as a whole—helped stimulate growth in several important sectors Page 755of the economy. The number of privately owned cars (essential for most suburban living) more than doubled in a decade, sparking a great boom in the automobile industry. Demand for new homes helped sustain a vigorous housing industry. The construction of roads and highways stimulated the economy as well. The American Birth Rate, 1940-1960 This chart shows how the American birth rate grew rapidly during and after World War II (after a long period of decline in the 1930s) to produce what became known as the "baby boom." At the peak of the baby boom, during the 1950s, the nation's population grew by 20 percent. (© Archive Photos/Getty Images) • What impact did the baby boom have on the nation's economy? Because of this unprecedented growth, the economy grew nearly ten times as fast as the population in the thirty years after the war. And while that growth was far from equally distributed, it affected most of society. The average American in 1960 had over 20 percent more purchasing power than in 1945, and more than twice as much as during the prosperous 1920s. By 1960, per capita income was over $1,800, $500 more than it had been in 1945. The American people had achieved the highest standard of living of any society in the history of the world. The Rise of the Modern West No region of the country profited more from economic growth than the American West. Its population expanded dramatically; its cities boomed; its industrial economy flourished. Before World War II, most of the West had been, economically at least, an appendage of the great industrial economy of the East. 
The West provided the East with raw materials and agricultural goods. By the 1960s, however, some parts of the West had become among the most important (and populous) industrial and cultural centers of the nation in their own right.

As during World War II, much of the growth of the West was a result of federal spending and investment—on the dams, power stations, highways, and other infrastructure projects that made economic development possible; and on the military contracts that continued to flow disproportionately to factories in California and Texas, many of them built with government funds during the war. But other factors played a role as well. The enormous increase in automobile use after World War II—a result, among other things, of suburbanization and improved highway systems—gave a large stimulus to the petroleum industry and contributed to the rapid growth of oil fields in Texas and Colorado, and also to the metropolitan centers serving them: Houston, Dallas, and Denver. State governments in the West invested heavily in their universities. The University of Texas and University of California systems, in particular, became among the nation's largest and best; as centers of research, they helped attract technology-intensive industries to the region.

Favorable Climate

Climate also contributed to economic growth. California, Nevada, and Arizona, in particular, attracted many migrants from the East because of their warm, dry climates. The growth of Los Angeles after World War II was a particularly remarkable phenomenon. More than 10 percent of all new businesses in the United States between 1945 and 1950 began in Los Angeles. Its population rose by over 50 percent between 1940 and 1960.

The New Economics

The discovery of the power of the American economic system was a major cause of the confident tone of much American political life in the 1950s. During the Depression, politicians, intellectuals, and others had often questioned the viability of capitalism. In the 1950s, such doubt vanished. Two features in particular made the postwar economy a source of national confidence.

Keynesian Economics

First was the belief that Keynesian economics made it possible for government to regulate and stabilize the economy without intruding directly into the private sector. The British economist John Maynard Keynes had argued as early as the 1920s that by varying the flow of government spending and taxation (fiscal policy) and managing the supply of currency (monetary policy), the government could stimulate the economy to cure recession and dampen growth to prevent inflation. The experience of the last years of the Depression and the first years of the war had seemed to confirm this argument. By the mid-1950s, Keynesian theory was rapidly becoming a fundamental article of faith—not only among professional economists but also among much of the public.

The "new economics," as its supporters came to call it, finally won official acceptance in 1963, when John Kennedy proposed a tax cut to stimulate economic growth. Although it took Kennedy's death and the political skills of Lyndon Johnson to win passage of the measure in 1964, the result seemed to confirm all that the Keynesians had predicted: an increase in private demand, which stimulated economic growth and reduced unemployment.

Ending Poverty through Economic Growth

As the economy continued to expand far beyond what anyone had predicted only a few years before, many Americans assumed that such growth was now without bounds.
By the mid-1950s, reformers concerned about poverty were arguing that the solution lay not in redistribution but in economic growth. The affluent would not have to sacrifice in order to eliminate poverty; the nation would simply have to produce more abundance, thus raising the quality of life of even the poorest citizens to a level of comfort and decency.

Patterns of Popular Culture: On the Road

People have traveled across the Americas for thousands of years, often with considerable difficulty. Indians had to make paths over rough terrain. Immigrants from England, Scotland, Mexico, and many other places often moved from town to town on log roads. Later, travelers used stones and pebbles to create new byways. Wagons and carriages often rode on bumpy or muddy surfaces. Of course, many years later more-efficient, comfortable roads were built across the American continent, a development that changed American life dramatically.

Jack Kerouac. Beat author Jack Kerouac's On the Road captured in literature the energy of a country on the move. The book's emphasis on jazz, beat poetry, and drugs, however, made it a notorious expression of an emerging counterculture. (© Hulton Archive/Getty Images)

In the mid-twentieth century, automobiles traveled more slowly than they do today. But cars allowed travelers to discover interesting places and meet new people as they moved from town to town. Jack Kerouac, in his famous book On the Road (1957), was one of the first to capture in writing the new spirit of automobile travel. "It was drizzling and mysterious at the beginning of our journey," Kerouac wrote as he got his battered car ready. "I could see that it was all going to be one big saga of the mist. . . . We were all delighted, when we all realized we were leaving confusion and nonsense behind and performing our one and noble function of the time, move. And we moved!"

In the 1950s, Route 66 was one of the first highways to cross most of the United States, from the West Coast to Chicago. But Route 66 became more than a road; it became a popular symbol of a country on the move. Travelers could drive along the famous highway while listening to the hit song "(Get Your Kicks On) Route 66" on the radio. A television series called Route 66 depicted characters becoming involved in all sorts of drama as they traveled the storied route.

People traveling long distances, perhaps even across the continent, needed restaurants, motels, and shops, a development that encouraged the creation of fast-food chains, many of which began with drive-in restaurants where customers could be served and eat in their cars. The first drive-in restaurant, Royce Hailey's Pig Stand, opened in Dallas in 1921, followed later in the decade by the White Tower. Ray Kroc's McDonald's opened its first outlets in Des Plaines, Illinois, and southern California in 1955. Five years later, there were 228 outlets. In time, with thousands of restaurants, McDonald's became the most recognizable symbol of food in the world. Large supermarket chains—catering to customers with automobiles—replaced smaller, family-owned markets in town centers. Large shopping centers and malls moved the center of retailing out of cities and into separate, sprawling complexes surrounded by large parking lots.

Route 66. The television show Route 66, starring George Maharis and Martin Milner, conveyed in popular culture the optimistic energy of a driving nation.
(© CBS/Photofest)

Eventually, with President Eisenhower's encouragement, the Federal Aid Highway Act of 1956 provided money to build the interstate highway system. This network of highways made it possible for people to travel long distances quickly and efficiently.

• Understand, Analyze, and Evaluate: How did television programs like Route 66 of the early 1960s reflect much older American myths about the American West? In what other ways than those described in this feature did the interstate highway system help the American economy? What military advantages did the highway system provide?

Capital and Labor

Corporate Consolidation

Over 4,000 corporate mergers took place in the 1950s; and more than ever before, a relatively small number of large-scale organizations controlled an enormous proportion of the nation's economic activity. This was particularly true in industries benefiting from government defense spending. As during World War II, the federal government tended to award military contracts to a few large corporations. In 1959, for example, half of all defense contracts went to only twenty firms. By the end of the decade, half the net corporate income in the nation was going to only slightly more than 500 firms, or one-tenth of 1 percent of the total number of corporations.

A similar consolidation was occurring in the agricultural economy. As increasing mechanization reduced the need for farm labor, the agricultural workforce declined by more than half in the two decades after the war. Mechanization also endangered one of the most cherished American institutions: the family farm. By the 1960s, relatively few individuals could any longer afford to buy and equip a modern farm, and much of the nation's most productive land had been purchased by financial institutions and corporations.

Corporations enjoying booming growth were reluctant to allow strikes to interfere with their operations. As a result, business leaders made important concessions to unions. As early as 1948, Walter Reuther, president of the United Automobile Workers, obtained a contract from General Motors that included a built-in "escalator clause"—an automatic cost-of-living increase pegged to the consumer price index. In 1955, Reuther won a guarantee from the Ford Motor Company of continuing wages for autoworkers even during layoffs. By the mid-1950s, factory wages in all industries had risen substantially.

The "Postwar Contract"

By the early 1950s, large labor unions had developed a new kind of relationship with employers, a relationship sometimes known as the "postwar contract." Workers in steel, automobiles, and other large unionized industries were receiving generous increases in wages and benefits; in return, the unions tacitly agreed to refrain from raising other issues—issues involving control of the workplace and a voice for workers in the planning of production. Strikes became far less frequent.

AFL-CIO

The economic successes of the 1950s helped pave the way for a reunification of the labor movement. In December 1955, the American Federation of Labor and the Congress of Industrial Organizations ended their twenty-year rivalry and merged to create the AFL-CIO, under the leadership of George Meany. Relations between the leaders of the former AFL and the former CIO were not always comfortable. CIO leaders believed (correctly) that the AFL hierarchy was dominating the relationship. AFL leaders were suspicious of what they considered the radical past of the CIO leadership.
Even so, the union of the two great labor movements of the 1930s survived, and gradually tensions subsided.

Success bred corruption in some union bureaucracies. In 1957, the powerful Teamsters Union became the subject of a congressional investigation, and its president, David Beck, was charged with misappropriation of union funds. Beck ultimately stepped down and was replaced by Jimmy Hoffa, whom government investigators pursued for nearly a decade before finally winning convictions against him that sent him to prison in 1967. The United Mine Workers, the union that had spearheaded the industrial union movement in the 1930s, similarly became tainted by suspicions of corruption and by violence. John L. Lewis's last years as head of the union were plagued with scandals and dissent within the organization. His successor, Tony Boyle, was ultimately convicted of complicity in the 1969 murder of the leader of a dissident faction within the union.

Limited Gains for Unorganized Workers

While the labor movement enjoyed significant success in winning better wages and benefits for workers already organized in strong unions, the majority of laborers who were as yet unorganized made fewer advances. Total union membership remained relatively stable throughout the 1950s, at about 16 million; and while this was in part a result of a shift in the workforce from blue-collar to white-collar jobs, it was also a result of new obstacles to organization. The 1947 Taft-Hartley Act and the state "right-to-work" laws that it permitted made it more difficult to form (or even sustain) many unions. The CIO launched a major organizing drive in the South shortly after World War II, targeting the poorly paid workers in textile mills in particular. But "Operation Dixie," as it was called, was a failure—as were most other organizing drives for at least thirty years after World War II.

Workers Represented by Unions, 1920-2001. This chart shows the number of workers represented by unions over an eighty-year period. Note the dramatic rise in the unionized workforce during the 1930s and 1940s, the slower but still-significant rise in the 1960s and 1970s, and the steady decline that began in the 1980s. The chart, in fact, understates the decline of unionized labor in the postwar era, since it shows union membership in absolute numbers and not as a percentage of the rapidly growing workforce.
• Why did unions cease recruiting new members successfully in the 1970s, and why did they begin losing members in the 1980s?

Chapter 28, Section 6

Eisenhower Republicanism

Dwight D. Eisenhower was one of the least experienced politicians to serve in the White House in the twentieth century. He was also among the most popular and politically successful presidents of the postwar era. At home, he pursued essentially moderate policies, avoiding most new initiatives but accepting the work of earlier reformers. Abroad, he continued and even intensified American commitments to oppose communism but brought to some of those commitments a measure of restraint that his successors did not always match.

"What Was Good for . . . General Motors"

Business Leaders' New Outlook

The first Republican administration in twenty years staffed itself with men drawn from the same quarter as those who had staffed Republican administrations in the 1920s: the business community. But by the 1950s, many business leaders had acquired a social and political outlook very different from that of their predecessors. Above all, many had reconciled themselves to at least the broad outlines of the Keynesian welfare state the New Deal had launched. Indeed, some corporate leaders had come to see it as something that actually benefited them—by helping maintain social order, by increasing mass purchasing power, and by stabilizing labor relations.

To his cabinet, Eisenhower appointed wealthy corporate lawyers and business executives who were not apologetic about their backgrounds. Charles Wilson, president of General Motors, assured senators considering his nomination for secretary of defense that he foresaw no conflict of interest because he was certain that "what was good for our country was good for General Motors, and vice versa."

Eisenhower's consistent inclination was to limit federal activities and encourage private enterprise. He supported the private rather than public development of natural resources. To the chagrin of farmers, he lowered federal support for farm prices. He also removed the last limited wage and price controls maintained by the Truman administration. He opposed the creation of new social service programs such as national health insurance. He strove constantly to reduce federal expenditures (even during the recession of 1958) and balance the budget. He ended 1960, his last full year in office, with a $1 billion budget surplus.

The Survival of the Welfare State

Federal Highway Act of 1956

The president took few new initiatives in domestic policy, but he resisted pressure from the right wing of his party to dismantle those welfare policies of the New Deal that had survived the conservative assaults of the war years and after. He agreed to extend the Social Security system to an additional 10 million people and unemployment compensation to an additional 4 million, and he agreed to increase the legal minimum hourly wage from 75 cents to $1. Perhaps the most significant legislative accomplishment of the Eisenhower administration was the Federal Highway Act of 1956, which authorized $25 billion for a ten-year project that built over 40,000 miles of interstate highways—the largest public works project in American history. The program was to be funded through a highway "trust fund," whose revenues would come from new taxes on the purchase of fuel, automobiles, trucks, and tires.

The "Highway Bill." President Dwight Eisenhower signs the Federal Highway Act of 1956 into law from the Oval Office at the White House. This initiative authorized $25 billion to build more than 40,000 miles of interstate highway.
(© Corbis)

In 1956, Eisenhower ran for a second term, even though he had suffered a serious heart attack the previous year. With Adlai Stevenson opposing him once again, he won by another, even greater landslide, receiving nearly 57 percent of the popular vote and 457 electoral votes to Stevenson's 73. Democrats retained control of both houses of Congress, which they had won back in 1954. And in 1958—during a serious recession—they increased that control by substantial margins.

The Decline of McCarthyism

The Eisenhower administration did little in its first years in office to discourage the anticommunist furor that had gripped the nation. By 1954, however, the crusade against subversion was beginning to produce significant popular opposition—an indication that the anticommunist passion of several years earlier was beginning to abate. The clearest signal of that change was the political demise of Senator Joseph McCarthy.

Army-McCarthy Hearings

During the first year of the Eisenhower administration, McCarthy continued to operate with impunity. But in January 1954 he overreached when he attacked Secretary of the Army Robert Stevens and the armed services in general. At that point, the administration and influential members of Congress organized a special investigation of the charges, which became known as the Army-McCarthy hearings. They were among the first congressional hearings to be nationally televised. The result was devastating to McCarthy. Watching McCarthy in action—bullying witnesses, hurling groundless (and often cruel) accusations, evading issues—much of the public began to see him as a villain, and even a buffoon. In December 1954, the Senate voted 67 to 22 to condemn him for "conduct unbecoming a senator." Three years later, with little public support left, he died—a victim, apparently, of complications arising from alcoholism.

Chapter 28, Section 2

The Explosion of Science and Technology

In 1961, Time magazine selected as its "man of the year" not a specific person but "the American Scientist." The choice was an indication of the widespread fascination with which Americans viewed science and technology. But it was also a sign of the remarkable, and remarkably rapid, scientific and technological advances in many areas during the postwar years.

Medical Breakthroughs

A particularly important advance in medical science was the development of new antibacterial drugs capable of fighting infections that in the past had been all but untreatable.

Antibiotics

The development of antibiotics had its origins in the discoveries of Louis Pasteur and Jules-François Joubert. Working in France in the 1870s, they produced the first conclusive evidence that virulent bacterial infections could be defeated by other, more ordinary bacteria. Drawing on their discoveries, the English physician Joseph Lister demonstrated the value of antiseptic solutions in preventing infection during surgery. But the practical use of antibacterial agents to combat disease did not begin until many decades later. In the 1930s, scientists in Germany, France, and England demonstrated the power of so-called sulfa drugs—drugs derived from an antibacterial agent known as sulfanilamide—which could be used effectively to treat streptococcal blood infections. New sulfa drugs were soon being developed at an astonishing rate, and were steadily improved, with dramatic results in treating what had once been a major cause of death.

Penicillin

In 1928, in the meantime, Alexander Fleming, an English medical researcher, accidentally discovered the antibacterial properties of an organism that he named penicillin. There was little progress in using penicillin to treat human illness, however, until a group of researchers at Oxford University, directed by Howard Florey and Ernst Chain, learned how to produce stable, potent penicillin in quantities large enough to make it a practical weapon against bacterial disease. The first human trials of the new drug, in 1941, were dramatically successful, but progress toward the mass availability of penicillin stalled in England because of World War II. American laboratories took the next crucial steps in developing methods for the mass production and commercial distribution of penicillin, which became widely available to doctors and hospitals around the world by 1948. Since then, a wide range of new antibiotics of highly specific character has been developed, so that bacterial infections are now among the most successfully treated of all human illnesses.

There was also dramatic progress in immunization. The first great triumph was the development of the smallpox vaccine by the English researcher Edward Jenner in the late eighteenth century. A vaccine effective against typhoid was developed by an English bacteriologist, Almroth Wright, in 1897, and was in wide use by World War I. Vaccination against tetanus became widespread in many countries just before and during World War II. Medical scientists had also developed a vaccine, BCG, against another major killer, tuberculosis, in the 1920s; but controversy over its safety stalled its adoption, especially in the United States, for many years. It did not come into wide use in the United States until after World War II, when tuberculosis was at last brought under control.
Viruses are much more difficult to prevent and treat than bacterial infections, and progress toward vaccines against viral infections—except for smallpox—was relatively slow. Not until the 1930s, when scientists discovered how to grow viruses in tissue cultures in laboratories, could researchers study them with any real effectiveness. Gradually, they discovered how to produce forms of a virus incapable of causing a disease but capable of triggering antibodies in vaccinated people that would protect them from contracting the disease. An effective vaccine against yellow fever was developed in the late 1930s, and one against some forms of influenza—one of the great killers of the first half of the twentieth century—appeared in 1945.

Salk Vaccine

A particularly dramatic postwar triumph was the development of a vaccine against polio. In 1954, the American scientist Jonas Salk introduced an effective vaccine against the virus that had killed and crippled thousands of children and adults (among them Franklin Roosevelt). It was provided free to the public by the federal government beginning in 1955. After 1960, an oral vaccine developed by Albert Sabin—usually administered in a sugar cube—made widespread vaccination even easier. By the early 1960s, these vaccines had virtually eliminated polio from American life and from much of the rest of the world.

As a result of these and many other medical advances, both infant mortality and the death rate among young children declined significantly in the first twenty-five years after the war (although not by as much as in Western Europe). Average life expectancy in that same period rose by five years, to seventy-one.

Pesticides

DDT

At the same time that medical researchers were finding cures and vaccines for infectious diseases, other scientists were developing new kinds of chemical pesticides, which they hoped would protect crops from destruction by insects and protect humans from such insect-carried diseases as typhus and malaria. The most famous of the new pesticides was dichlorodiphenyltrichloroethane, generally known as DDT, a compound whose insecticidal properties were discovered in 1939 by the Swiss chemist Paul Müller. He had found that although DDT seemed harmless to human beings and other mammals, it was extremely toxic to insects. American scientists learned of Müller's discovery in 1942, just as the army was grappling with the insect-borne tropical diseases—especially malaria and typhus—that threatened American soldiers overseas during World War II. Under these circumstances, DDT seemed a godsend. It was first used on a large scale in Italy in 1943-1944 during a typhus outbreak, which it quickly helped end. Soon it was being sprayed in mosquito-infested areas of Pacific islands where American troops were fighting the Japanese. No soldiers suffered any apparent ill effects from the sprayings, and the incidence of malaria dropped precipitously. DDT quickly gained a reputation as a miraculous tool for controlling insects, and it undoubtedly saved thousands of lives. Only later did scientists recognize that DDT had long-term toxic effects on animals and humans.

Postwar Electronic Research

Invention of Television

The 1940s and 1950s saw dramatic new developments in electronic technology. Researchers in the 1940s produced the first commercially viable televisions and created a technology that made it possible to broadcast programming over large areas.
Later, in the late 1950s, scientists at RCA's David Sarnoff Laboratories in New Jersey developed the technology for color television, which first became widely available in the early 1960s.

In 1948 Bell Labs, the research arm of AT&T, produced the first transistor, a small solid-state device capable of amplifying electrical signals, which was much smaller and more efficient than the cumbersome vacuum tubes that had powered most electronic equipment in the past. Transistors made possible the miniaturization of many devices (radios, televisions, audio equipment, hearing aids) and were also important in aviation, weaponry, and satellites. They contributed as well to another major breakthrough in electronics: the development of integrated circuitry in the late 1950s. Integrated circuits combined a number of once-separate electronic elements (transistors, resistors, diodes, and others) and embedded them in a single, microscopically small device. They made it possible to create increasingly complex electronic devices requiring complicated circuitry that would have been impractical to produce through other means. Most of all, integrated circuits helped advance the development of the computer.

The Salk Vaccine. Dr. Jonas Salk, a medical researcher at the University of Pittsburgh, developed in the mid-1950s the first vaccine that proved effective in preventing polio. In its aftermath, scenes similar to this one—a mass inoculation of families in a municipal stadium in Evansville, Indiana—repeated themselves all over the country. A few years later, Dr. Albert Sabin of the University of Cincinnati created a vaccine that could be administered more easily, through sugar cubes. (© AP Images)

Postwar Computer Technology

Prior to the 1950s, computers had been constructed mainly to perform complicated mathematical tasks, such as those required to break military codes. In the 1950s, they began to perform commercial functions for the first time, as data-processing devices used by businesses and other organizations.

The Dawn of the Computer Age. This massive computer, powered by vacuum tubes, was part of the first generation of mainframes developed after World War II. Such machines served mostly government agencies and large corporations. By the 1990s, a small desktop computer could perform all the functions of this huge computer at much greater speed. (© Time & Life Pictures/Getty Images)

UNIVAC

The first significant computer of the 1950s was the Universal Automatic Computer (or UNIVAC), which was developed initially for the U.S. Bureau of the Census by the Remington Rand Company. It was the first computer able to handle both alphabetical and numerical information easily. It used tape storage and could perform calculations and other functions much faster than its predecessor, the ENIAC, developed in 1946 by the same researchers at the University of Pennsylvania who were responsible for the UNIVAC. Searching for a larger market than the census for their very expensive new device, Remington Rand arranged to use a UNIVAC to predict the results of the 1952 election for CBS television news. It would, the company believed, produce valuable publicity for the machine. Analyzing early voting results, the UNIVAC accurately predicted an enormous landslide victory for Eisenhower over Stevenson. Few Americans had ever heard of a computer before that night, and the UNIVAC's television debut became, therefore, a critical breakthrough in public awareness of computer technology.
Remington Rand had limited success in marketing the UNIVAC, but in the mid-1950s the International Business Machines Company (IBM) introduced its first major data-processing computers and began to find a wide market for them among businesses in the United States and abroad. These early successes, combined with the enormous amount of money IBM invested in research and development, made the company the worldwide leader in computers for many years.

Bombs, Rockets, and Missiles

The Hydrogen Bomb

In 1952, the United States successfully detonated the first hydrogen bomb. (The Soviet Union tested its first H-bomb a year later.) Unlike the plutonium and uranium bombs developed during World War II, the hydrogen bomb derives its power not from fission (the splitting of atoms) but from fusion (the joining of light atomic nuclei to form heavier ones). It is capable of producing explosions of vastly greater power than the earlier, fission bombs.

The development of the hydrogen bomb gave considerable impetus to a stalled scientific project in both the United States and the Soviet Union—the effort to develop unmanned rockets and missiles capable of carrying the new weapons, which were not suitable for delivery by airplanes, to their targets. Both nations began to put tremendous resources into their development. The United States, in particular, benefited from the emigration to America of some of the German scientists who had helped develop rocketry for Germany during World War II.

In the United States, early missile research was conducted almost entirely by the Air Force. There were significant early successes in developing rockets capable of traveling several hundred miles. But American and Soviet leaders were both struggling to build longer-range missiles that could cross oceans and continents—intercontinental ballistic missiles, or ICBMs, capable of traveling through space to distant targets. American scientists experimented in the 1950s with first the Atlas and then the Titan ICBM. There were some early successes, but there were also many setbacks, particularly because of the difficulty of producing enough stable fuel to provide the tremendous power needed to launch missiles beyond the atmosphere. By 1958, scientists had created a solid fuel to replace the volatile liquid fuels of the early missiles, and they had also produced miniaturized guidance systems capable of ensuring that missiles could travel to reasonably precise destinations. Within a few years, a new generation of missile, known as the Minuteman, with a range of several thousand miles, became the basis of the American atomic weapons arsenal. American scientists also developed a nuclear missile capable of being carried and fired by submarines—the Polaris, which could be launched from below the surface of the ocean by compressed air. A Polaris was first successfully fired from underwater in 1960.

The Space Program

The Shock of Sputnik

The origins of the American space program can be traced most directly to a dramatic event in 1957, when the Soviet Union announced that it had launched an earth-orbiting satellite—Sputnik—into outer space. The United States had yet to perform any similar feat, and the American government (and much of American society) reacted to the announcement with alarm, as if the Soviet achievement were also a massive American failure.
Federal policy began encouraging (and funding) strenuous efforts to improve scientific education in the schools, to create more research laboratories, and, above all, to speed the development of America's own exploration of outer space. The United States launched its first satellite, Explorer I, in January 1958.

Launching a Satellite, 1961. Four years after the successful Russian launching of the satellite Sputnik in 1957 threw Americans into something close to a panic, a Thor-Able Star rocket takes off from Cape Canaveral, Florida, carrying an American satellite. The satellite contained a nuclear generator capable of providing it with extended continuous power for its radio transmitters. (National Archives and Records Administration)

The centerpiece of space exploration, however, soon became the manned space program, established in 1958 through the creation of a new agency, the National Aeronautics and Space Administration (NASA), and through the selection of the first American space pilots, or "astronauts." They quickly became among the nation's most revered heroes. NASA's initial effort, the Mercury Project, was designed to launch manned vehicles into space to orbit the earth. On May 5, 1961, Alan Shepard became the first American launched into space. But his short, suborbital flight came several weeks after a Soviet "cosmonaut," Yuri Gagarin, had made a flight in which he orbited the earth. On February 20, 1962, John Glenn (later a U.S. senator) became the first American to orbit the globe. NASA later introduced the Gemini program, whose spacecraft could carry two astronauts at once.

The Apollo Program

Mercury and Gemini were followed by the Apollo program, whose purpose was to land men on the moon. It suffered some catastrophic setbacks, most notably a fire in January 1967 that killed three astronauts. But on July 20, 1969, Neil Armstrong, Edwin Aldrin, and Michael Collins successfully traveled in a space capsule into orbit around the moon. Armstrong and Aldrin then detached a smaller craft from the capsule, landed on the surface of the moon, and became the first humans to walk on a body other than earth. Six more lunar missions followed, the last in 1972. Not long after that, however, the government began to cut funding for the missions, and popular enthusiasm for the program began to wane.

Apollo 11. Edwin ("Buzz") Aldrin is photographed by his fellow astronaut Neil Armstrong in July 1969, when they became the first humans to set foot on the surface of the moon. They traveled into orbit around the moon in the spaceship Apollo 11 and then traveled from the spaceship to the moon in a "lunar module," which they then used to return to the ship for the journey home. (NASA)

The future of the manned space program did not lie primarily in efforts to reach distant planets, as originally envisioned. Instead, the program became a more modest effort to make travel in near space easier and more practical through the development of the "space shuttle," an airplane-like craft launched by rocket but capable of both navigating in space and landing on earth much like a conventional aircraft. The first space shuttle was successfully launched in 1981. The explosion of the shuttle Challenger in January 1986 shortly after takeoff, which killed all seven astronauts aboard, stalled the program for more than two years. Missions resumed in the late 1980s, driven in part by commercial purposes.
The space shuttles launched and repaired communications satellites and carried the Hubble Space Telescope into orbit in 1990 (astronauts later corrected its flawed mirror). But problems continued to plague the program into the early twenty-first century. The space program, like the military development of missiles, gave a tremendous boost to the American aeronautics industry and was responsible for the development of many technologies that proved valuable in other areas.

Chapter 28, Section 7

Eisenhower, Dulles, and the Cold War

The threat of nuclear war with the Soviet Union created a sense of high anxiety in international relations in the 1950s. But the nuclear threat had another effect as well. With the potential devastation of an atomic war so enormous, both superpowers began to edge away from direct confrontations. The attention of both the United States and the Soviet Union began to turn to the rapidly escalating instability in the poor and developing nations of the Third World.

Dulles and "Massive Retaliation"

Eisenhower's secretary of state, and (except for the president himself) the dominant figure in the nation's foreign policy in the 1950s, was John Foster Dulles, an aristocratic corporate lawyer with a stern moral revulsion to communism. He entered office denouncing the containment policies of the Truman years as excessively passive, arguing that the United States should pursue an active program of "liberation," which would lead to a "rollback" of communist expansion. Once in power, however, he had to defer to the more moderate views of the president himself.

Economic Benefits of "Massive Retaliation"

The most prominent of Dulles's innovations was the policy of "massive retaliation," which Dulles announced early in 1954. The United States would, he explained, respond to communist threats to its allies not by using conventional forces in local conflicts (a policy that had led to so much frustration in Korea) but by relying on "the deterrent of massive retaliatory power" (by which he meant nuclear weapons). In part, the new doctrine reflected Dulles's inclination for tense confrontations, an approach he once defined as "brinksmanship"—pushing the Soviet Union to the brink of war in order to exact concessions. But the real force behind the massive-retaliation policy was economics. With pressure growing both in and out of government for a reduction in American military expenditures, an increasing reliance on atomic weapons seemed to promise, as some advocates put it, "more bang for the buck."

France, America, and Vietnam

What had been the most troubling foreign policy concern of the Truman years—the war in Korea—plagued the Eisenhower administration only briefly. On July 27, 1953, negotiators at Panmunjom finally signed an agreement ending the hostilities. Each antagonist was to withdraw its troops a mile and a half from the existing battle line, which ran roughly along the 38th parallel, the prewar border between North and South Korea. A conference in Geneva was to consider means by which to reunite the nation peacefully—although in fact the 1954 meeting produced no agreement and left the cease-fire line as the apparently permanent border between the two countries.

Eisenhower and Dulles. Although President Eisenhower was a somewhat colorless television personality, his was the first administration to make extensive use of the new medium to promote its policies and dramatize its actions. The president's press conferences were frequently televised, and on several occasions Secretary of State John Foster Dulles reported to the president in front of the cameras. Dulles is shown here in the Oval Office on May 17, 1955, reporting after his return from Europe, where he had signed the treaty restoring sovereignty to Austria. (© AP Images)

Almost simultaneously, however, the United States was being drawn into a long, bitter struggle in Southeast Asia. Vietnam, a colony of France, was facing strong opposition from nationalists, led by Ho Chi Minh, a communist.
Dien Bien Phu

When French troops became surrounded in a disastrous siege at Dien Bien Phu in northern Vietnam in 1954, it was clear that only American intervention could prevent the total collapse of the French military effort. Yet despite the urgings of Secretary of State Dulles, Vice President Nixon, and others, Eisenhower refused to permit direct American military intervention in Vietnam, claiming that neither Congress nor America's other allies would support such action.

Cold War Crises

American foreign policy in the 1950s rested on a reasonably consistent foundation: the containment policy, as revised by the Truman and Eisenhower administrations. But the nation's leaders spent much of their time reacting to both real and imagined crises in far-flung areas of the world. Among the Cold War challenges the Eisenhower administration confronted were a series of crises in the Middle East, a region in which the United States had been little involved until after World War II.

Recognizing Israel

On May 14, 1948, after years of Zionist efforts and a dramatic decision by the new United Nations, Israel proclaimed its independence. President Truman recognized the new Jewish state almost immediately. But the creation of Israel, while it resolved some conflicts, created others. Palestinian Arabs, unwilling to accept being displaced from what they considered their own country, joined with Israel's Arab neighbors and fought determinedly against the new state in 1948—the first of several Arab-Israeli wars.

The State of Israel. The prime minister of Israel, David Ben-Gurion (in suit and open-collar shirt), watches the departure of the last British troops from Palestine shortly after the United Nations approved (and the United States recognized) in 1948 the existence of a new Jewish state in part of the region. (© Bettmann/Corbis)

Committed as the American government was to Israel, it was also concerned about the stability and friendliness of the Arab regimes in the oil-rich Middle East, in which American petroleum companies had major investments. Thus the United States reacted with alarm as it watched Mohammad Mossadegh, the nationalist prime minister of Iran, begin to resist the presence of Western corporations in his nation in the early 1950s. In 1953, the American CIA joined forces with conservative Iranian military leaders to engineer a coup that drove Mossadegh from office. To replace him, the CIA helped elevate the young shah of Iran, Mohammad Reza Pahlevi, from his position as token constitutional monarch to that of virtually absolute ruler. The shah remained closely tied to the United States for the next twenty-five years.

Suez Crisis

American policy was less effective in dealing with the nationalist government of Egypt, under the leadership of General Gamal Abdel Nasser, which began to develop a trade relationship with the Soviet Union in the early 1950s. In 1956, to punish Nasser for his friendliness toward the communists, Dulles withdrew American offers to assist in building the great Aswan Dam across the Nile. A week later, Nasser retaliated by seizing control of the Suez Canal from the British, saying that he would use the income from it to build the dam himself. On October 29, 1956, Israeli forces attacked Egypt. Two days later the British and French began their own military intervention to drive the Egyptians from the canal. Dulles and Eisenhower feared that the Suez crisis would drive the Arab states toward the Soviet Union and precipitate a new world war.
By refusing to support the invasion, and by joining in a United Nations denunciation of it, the United States helped pressure the French and British to withdraw and helped persuade Israel to agree to a truce with Egypt.

Cold War concerns affected American relations in Latin America as well. In 1954, the Eisenhower administration ordered the CIA to help topple the new, leftist government of Jacobo Arbenz Guzmán in Guatemala, a regime that Dulles (responding to the entreaties of the United Fruit Company, a major investor in Guatemala fearful of Arbenz) argued was potentially communist.

Fidel Castro

No nation in the region had been more closely tied to America than Cuba. Its leader, Fulgencio Batista, had ruled as a military dictator since 1952, when with American assistance he had toppled a more moderate government. Cuba's relatively prosperous economy had become a virtual fiefdom of American corporations, which controlled almost all the island's natural resources and had cornered over half the vital sugar crop. American organized-crime syndicates controlled much of Havana's lucrative hotel and nightlife business. In 1957, a popular movement of resistance to the Batista regime began to gather strength under the leadership of Fidel Castro. On January 1, 1959, with Batista having fled into exile, Castro marched into Havana and established a new government.

Castro soon began implementing radical policies of land reform and expropriating foreign-owned businesses and resources. Cuban-American relations deteriorated rapidly as a result. When Castro began accepting assistance from the Soviet Union in 1960, the United States cut back the "quota" by which Cuba could export sugar to America at a favored price. Early in 1961, as one of its last acts, the Eisenhower administration severed diplomatic relations with Castro. Isolated by the United States, Castro soon cemented an alliance with the Soviet Union.

The Cuban Revolution. Fidel Castro is shown here in the Cuban jungle in 1957 with a small group of his staff and their revolutionary forces. Kneeling in the foreground is Castro's brother Raúl. Two years later, Castro's forces toppled the existing government and elevated Fidel to the nation's leadership, where he remained for almost fifty years. (© Bettmann/Corbis)

Europe and the Soviet Union

Hungarian Revolution of 1956

Although the problems of the Third World were moving slowly toward the center of American foreign policy, the direct relationship with the Soviet Union and the effort to resist communist expansion in Europe remained the principal concerns of the Eisenhower administration. In 1955, Eisenhower and other NATO leaders met with the Soviet premier, Nikolai Bulganin, at a cordial summit conference in Geneva. But when a subsequent conference of foreign ministers met to try to resolve specific issues, they could find no basis for agreement. Relations between the Soviet Union and the West soured further in 1956 in response to the Hungarian Revolution. Hungarian dissidents had launched a popular uprising in late October to demand democratic reforms. In early November, Soviet tanks and troops entered Budapest to crush the uprising and restore an orthodox, pro-Soviet regime. The Eisenhower administration refused to intervene.

The U-2 Crisis

In November 1958, Nikita Khrushchev, who had succeeded Bulganin as Soviet premier earlier that year (he had led the Communist Party since 1953), renewed the demands of his predecessors that the NATO powers abandon West Berlin.
When the United States and its allies predictably refused, Khrushchev suggested that he and Eisenhower discuss the issue personally, both in visits to each other's countries and at a summit meeting in Paris in 1960. The United States agreed. Khrushchev's 1959 visit to America produced a cool but mostly polite public response. Plans proceeded for the summit conference and for Eisenhower's visit to Moscow shortly thereafter. Only days before the scheduled beginning of the Paris meeting, however, the Soviet Union announced that it had shot down an American U-2, a high-altitude spy plane, over Russian territory. Its pilot, Francis Gary Powers, was in captivity. Khrushchev lashed out angrily at the American incursion into Soviet airspace, breaking up the Paris summit almost before it could begin and withdrawing his invitation to Eisenhower to visit the Soviet Union.

Eisenhower's Restraint

After eight years in office, Eisenhower had failed to eliminate, and in some respects had actually increased, the tensions between the United States and the Soviet Union. Yet Eisenhower had brought to the Cold War his own sense of the limits of American power. He had resisted military intervention in Vietnam. And he had placed a measure of restraint on those who urged the creation of an enormous American military establishment. In his Farewell Address in January 1961, he warned of the "unwarranted influence" of a vast "military-industrial complex." His caution, in both domestic and international affairs, stood in marked contrast to the attitudes of his successors, who argued that the United States must act more boldly and aggressively on behalf of its goals at home and abroad.

Chapter 28, Section 3

People of Plenty

Among the most striking social developments of the postwar era was the rapid expansion of a middle-class lifestyle and outlook. The new prosperity of social groups that had previously lived on the margins; the growing availability of consumer products at affordable prices and the rising public fascination with such products; and the massive population movement from the cities to the suburbs—all helped make the American middle class a larger, more powerful, more homogeneous, and more dominant force than ever before.

The Consumer Culture

At the center of middle-class culture in the 1950s, as it had been for many decades before, was a growing absorption with consumer goods. That was a result of increased prosperity, of the increasing variety and availability of products, and of advertisers' adeptness in creating a demand for those products. It was also a result of the growth of consumer credit, which increased by 800 percent between 1945 and 1957 through the development of credit cards, revolving charge accounts, and easy-payment plans. Prosperity fueled the automobile industry, and Detroit responded to the boom with ever-flashier styling and accessories. Consumers also responded eagerly to the development of such new products as dishwashers, garbage disposals, televisions, hi-fis, and stereos. To a large degree, the prosperity of the 1950s and 1960s was consumer driven (as opposed to investment driven).

Consumer Crazes

Because consumer goods were so often marketed (and advertised) nationally, the 1950s were notable for the rapid spread of great national consumer crazes. For example, children, adolescents, and even some adults became entranced in the late 1950s with the hula hoop—a large plastic ring kept spinning around the waist. The popularity of the Walt Disney-produced children's television show The Mickey Mouse Club created a national demand for related products such as Mickey Mouse watches and hats. It also helped produce the stunning success of Disneyland, an amusement park near Los Angeles that re-created many of the characters and events of Disney entertainment programs.

The Davy Crockett Craze. In the 1950s Walt Disney introduced a television show about American folk hero Davy Crockett. This boy wears to school his hero's famous "coonskin" (that is, raccoon-skin) cap and Davy Crockett T-shirt as he reads about Crockett. In addition to merchandising products associated with Crockett, Disney believed that it was time for Americans "to get acquainted, or renew acquaintance with, the robust, cheerful, energetic and representative folk heroes" from the American past. (© Bettmann/Corbis)

The Landscape and the Automobile

The success of Disneyland depended largely on the ease of highway access from the dense urban areas around it, as well as the vast parking lots that surrounded the park. It was, in short, a symbol of the overwhelming influence of automobiles on American life and on the American landscape in the postwar era. Between 1950 and 1980, the nation's population increased by 50 percent, but the number of automobiles owned by Americans increased by 400 percent.

Interstate Highways

The Federal Highway Act of 1956, which appropriated $25 billion for highway construction, was one of the most important alterations of the national landscape in modern history. Great ribbons of concrete—40,000 miles of them—spread across the nation, spanning rivers and valleys, traversing every state, and providing links to every major city (and between cities and their suburbs).
These highways dramatically reduced the time necessary to travel from one place to another. They also made trucking a more economical means than railroads of transporting goods to market. They made travel by automobile, truck, and bus as fast as or faster than travel by train, contributing to the long, steady decline of the railroads. Highways also encouraged the movement of economic activities—manufacturing in particular—out of cities and into suburban and rural areas where land was cheaper. The decline of many traditional downtowns soon followed, as many workers moved outside the urban core. There was rapid growth of what eventually became known as "edge cities" and other new centers of industry and commerce outside traditional city centers.

The proliferation of automobiles and the spread of highways also made it easier for families to move into homes that were far away from where they worked. This enabled many people to live in larger houses with larger lots than they could have afforded previously. Garages began to be built onto houses in great numbers after World War II, and such suburban amenities as swing sets, barbecues, and private swimming pools became more common as backyards became more the focus of family life.

The shift of travel from train to automobile helped launch a tremendous proliferation of motels—26,000 by 1948, 60,000 by 1960, well over 100,000 by 1970. The first Holiday Inn (the start of what would soon become the largest motel chain in America) opened along a highway connecting Memphis and Nashville, Tennessee, in 1952. Drive-in theaters—a distinctively American phenomenon that had begun to appear in the 1930s—spread rapidly after the war. There were 4,000 drive-ins by 1958.

The Suburban Nation

"Levittown"

By 1960, a third of the nation's population was living in suburbs. Suburbanization was partly a result of important innovations in home-building, which made single-family houses affordable to millions of people. The most famous of the postwar suburban developers, William Levitt, used new mass-production techniques to construct a large housing development on Long Island, near New York City. This first "Levittown" (there would later be others in New Jersey and Pennsylvania) consisted of several thousand two-bedroom Cape Cod-style houses, with identical interiors and only slightly varied façades, each perched on its own concrete slab (to eliminate excavation costs), facing curving, treeless streets. Levittown houses, and other similarly low-priced homes, sold for under $10,000, and they helped meet an enormous and growing demand for housing. Young couples—often newly married, the husband frequently a war veteran, eager to start a family, and assisted by low-cost, government-subsidized mortgages provided by the GI Bill (see p. 741)—rushed to purchase the inexpensive homes.

Why did so many Americans want to move to the suburbs? One reason was the enormous importance postwar Americans placed on family life after five years of disruptive war. Suburbs provided families with larger homes than they could find (or afford) in the cities. Many people were attracted by the idea of living in a community populated largely by people of similar age and background and found it easier to form friendships and social circles there than in the city. Women in particular often valued the presence of other nonworking mothers living nearby to share the tasks of child raising. Another factor motivating white Americans to move to the suburbs was race.
There were some African American suburbs, but most suburbs were restricted to whites—both because relatively few blacks could afford to live in them and because formal and informal barriers kept out even prosperous African Americans. In an era when the black population of most cities was rapidly growing, many white families moved to the suburbs to escape the integration of urban neighborhoods and schools.

Interstates The interstate highway system changed the physical landscape of the United States. Its great, sprawling ribbons of concrete—such as the Hollywood Freeway, Harbor Freeway, and Arroyo Seco Freeway, shown in this photograph intersecting on multiple levels—sliced through cities, towns, and rural areas. But its biggest impact was in facilitating the movement of urban populations out of cities and into increasingly distant suburbs. (Photo by J. R. Eyerman/© Time & Life Pictures/Getty Images)

Suburban neighborhoods had many things in common with one another. But they were not uniform. Levittowns and inexpensive developments like them ultimately became the homes of mainly lower-middle-class people one step removed from the inner city. Other, more affluent suburbs became enclaves of wealthy families. In virtually every city, a clear hierarchy emerged of upper-class suburban neighborhoods and more modest ones, just as such gradations had emerged years earlier among urban neighborhoods.

The Suburban Family "Prevailing Gender Roles Reinforced" For professional men (many of whom worked in cities, at some distance from their homes), suburban life generally meant a rigid division between their working and personal worlds. For many middle-class, married women, it meant increased isolation from the workplace. The enormous cultural emphasis on family life in the 1950s strengthened popular prejudices against women entering the professions, or occupying any paid job at all. Many middle-class husbands considered it demeaning for their wives to be employed. And many women shied away from the workplace when they could afford to stay at home full-time with their children.

Dr. Benjamin Spock One of the most influential books in postwar American life was a famous guide to child rearing: Dr. Benjamin Spock's Baby and Child Care, first published in 1946 and reissued (and revised) repeatedly for decades thereafter. Dr. Spock's approach to raising babies was child-centered, as opposed to parent-centered. The purpose of motherhood, he taught, was to help children learn and grow and realize their potential. All other considerations, including the mother's own physical and emotional requirements, should be subordinated to the needs of the child. Dr. Spock at first envisioned only a very modest role for fathers in the process of child rearing, although he changed his views on this (and on many other issues) over time.

Women who could afford not to work faced pressures to remain in the home and concentrate on raising their children. But as expectations of material comfort rose, many middle-class families needed a second income to maintain the standard of living they desired. As a result, the number of married women working outside the home actually increased in the postwar years—even as the social pressure for them to stay out of the workplace grew. By 1960, nearly a third of all married women were part of the paid workforce.

The Birth of Television Television, the most powerful medium of mass communication in the second half of the twentieth century, was central to the culture of the postwar era.
Experiments in broadcasting pictures (along with sound) had begun as early as the 1920s, but commercial television began only shortly after World War II. Its growth was phenomenally rapid. In 1946, there were only 17,000 sets in the country; by 1957, there were 40 million television sets in use—almost as many sets as there were families. More people had television sets, according to one report, than had refrigerators.

The television industry emerged directly out of the radio industry, and all three major networks—the National Broadcasting Company, the Columbia Broadcasting System, and the American Broadcasting Company—had started as radio companies. Like radio, the television business was driven by advertising. The need to attract advertisers determined most programming decisions; and in the early days of television, sponsors often played a direct, powerful, and continuing role in determining the content of the programs they chose to sponsor. Many early television shows bore the names of the corporations that were paying for them: the GE Television Theater, the Chrysler Playhouse, the Camel News Caravan, and others. Some daytime serials were actually written and produced by Procter & Gamble and other companies.

Social Consequences of Television The impact of television on American life was rapid, pervasive, and profound. By the late 1950s, television news had replaced newspapers, magazines, and radio as the nation's most important vehicle of information. Television advertising helped create a vast market for new fashions and products. Televised athletic events gradually made professional and college sports one of the most important sources of entertainment (and one of the biggest businesses) in America. Television entertainment programming—almost all of it controlled by the three national networks and their corporate sponsors—replaced movies and radio as the principal source of diversion for American families.

Television's Homogenizing Message Much of the programming of the 1950s and early 1960s created a common image of American life—an image that was predominantly white, middle-class, and suburban, and that was epitomized by such popular situation comedies as Ozzie and Harriet and Leave It to Beaver. Programming also reinforced the concept of gender roles that most men (and many women) unthinkingly embraced. Most situation comedies, in particular, showed families in which, as the title of one of the most popular put it, Father Knows Best, and in which most women were mothers and housewives striving to serve their children and please their husbands.

Ozzie and Harriet In addition to depicting American life as predominantly white, middle class, and suburban, television in the 1950s and 1960s often portrayed Americans as optimistic and upwardly mobile. Here the Nelson family, characters from the show The Adventures of Ozzie and Harriet, enjoys time together in their backyard pool. (© ABC/Photofest)

But television also conveyed other images: gritty, urban, working-class families in Jackie Gleason's The Honeymooners; the childless show-business family of the early I Love Lucy; unmarried professional women in Our Miss Brooks and My Little Margie; hapless African Americans in Amos 'n' Andy. Television not only sought to create an idealized image of a homogeneous suburban America. It also sought to convey experiences at odds with that image—but to convey them in warm, unthreatening terms. Yet television also, inadvertently, created conditions that could accentuate social conflict.
Even those unable to share in the affluence of the era could, through television news and other venues, acquire a vivid picture of how the rest of their society lived. And at the same time that television was reinforcing the homogeneity of the white middle class, it was also contributing to the sense of alienation and powerlessness among groups excluded from the world it portrayed.

Travel, Outdoor Recreation, and Environmentalism The idea of a paid vacation for American workers, and the association of that idea with travel, entered American culture beginning in the 1920s. But it was not until the postwar years that vacation travel became truly widespread among middle-income Americans. The construction of the interstate highway system contributed dramatically to the growth of travel. So did the increasing affluence of workers, which made it possible for them to buy cars.

Echo Park Nowhere was this surge in travel and recreation more visible than in America's national parks, which experienced the beginnings of what became a permanent surge in attendance in the 1950s. People who traveled to national parks did so for many reasons—some to hike and camp; some to fish and hunt (activities that themselves grew dramatically in the 1950s and spawned a large number of clubs); some simply to look in awe at the landscape. Many visitors to national parks came in search less of conventional recreation than of wilderness. The importance of that search became clear in the early 1950s in the first of many battles over development of wilderness areas: the fight to preserve Echo Park.

Chicago's Annexations and the Suburban Noose This map uses Chicago as an example of two important processes in the growth of American cities—municipal consolidation and suburbanization. In 1837, Chicago consisted of a small area on the shore of Lake Michigan (represented by the small dark orange area on the right center of the map). Over the next fifty years, Chicago annexed an enormous amount of additional land around its original borders, followed by a few smaller annexations in the twentieth century. At the same time, however, many of the areas around Chicago were separating themselves from the city by incorporating as independent communities—suburbs—with a particular wave of such incorporations in the first decades of the twentieth century, continuing into the 1990s. A map of New York, and of many other cities, would reveal a similar pattern. • What were the consequences for the city of its legal and financial separation from so many suburban communities?

Echo Park is a spectacular valley in the Dinosaur National Monument, on the border between Utah and Colorado, near the southern border of Wyoming. In the early 1950s, the federal government's Bureau of Reclamation—created early in the century to encourage irrigation, develop electric power, and increase water supplies—proposed building a dam across the Green River, which runs through Echo Valley, so as to create a lake for recreation and a source of hydroelectric power. The American environmental movement had been relatively quiet since its searing defeat early in the century in its effort to stop a similar dam in the Hetch Hetchy Valley at Yosemite National Park (see p. 575). But the Echo Park proposal helped rouse it from its slumber.

Sierra Club Reborn In 1950, Bernard DeVoto—a well-known writer and a great champion of the American West—published an essay in The Saturday Evening Post titled "Shall We Let Them Ruin Our National Parks?"
It had a sensational impact, arousing opposition to the Echo Valley dam from many areas of the country. The Sierra Club, relatively quiet in previous decades, moved into action; the controversy helped elevate a new and aggressive leader, David Brower, who eventually transformed the club into the nation's leading environmental organization. By the mid-1950s, a large coalition of environmentalists, naturalists, and wilderness vacationers had been mobilized in opposition to the dam, and in 1956 Congress—bowing to the public pressure—blocked the project and preserved Echo Park in its natural state. The controversy was a major victory for those who wished to preserve the sanctity of the national parks, and it was an important impetus to the dawning environmental consciousness that would become so important in later decades.

Organized Society and Its Detractors White-collar workers came to outnumber blue-collar laborers for the first time in the 1950s, and an increasing proportion of them worked in corporate settings with rigid hierarchical structures. Industrial workers also confronted large bureaucracies, both in the workplace and in their own unions. Consumers discovered the frustrations of bureaucracy in dealing with the large national companies from whom they bought goods and services. More and more Americans were becoming convinced that the key to a successful future lay in acquiring the specialized training and skills necessary for work in large organizations.

Growth of Specialized Education The American educational system responded to the demands of this increasingly organized society by experimenting with changes in curriculum and philosophy. Elementary and secondary schools gave increased attention to the teaching of science, mathematics, and foreign languages (particularly after the launching of the Soviet Union's Sputnik)—all of which educators considered important for the development of skilled, specialized professionals. Universities in the meantime were expanding their curricula to provide more opportunities for students to develop specialized skills. The idea of the "multiversity"—a phrase first coined by the chancellor of the University of California at Berkeley to describe his institution's diversity—represented a commitment to making higher education a training ground for specialists in a wide variety of fields.

The debilitating impact of bureaucratic life on the individual slowly became a central theme of popular and scholarly debate. William H. Whyte Jr. produced one of the most widely discussed books of the decade: The Organization Man (1956), which attempted to describe the special mentality of the worker in a large, bureaucratic setting. Self-reliance, Whyte claimed, was losing ground to the ability to "get along" and "work as a team" as the most valued trait in the modern character. Sociologist David Riesman had made similar observations in The Lonely Crowd (1950), in which he argued that the traditional "inner-directed" man, who judged himself on the basis of his own values and the esteem of his family, was giving way to a new "other-directed" man, more concerned with winning the approval of the larger organization or community.

Novelists, too, expressed misgivings in their work about the impersonality of modern society. Saul Bellow produced a series of novels—The Adventures of Augie March (1953), Seize the Day (1956), Herzog (1964), and many others—that chronicled the difficulties American Jewish men had in finding fulfillment in modern urban America. J. D.
Salinger wrote in The Catcher in the Rye (1951) of a prep-school student, Holden Caulfield, who was unable to find any area of society—school, family, friends, city—in which he could feel secure or committed.

The Beats and the Restless Culture of Youth The Beat Generation's Critiques The most caustic critics of bureaucracy, and of middle-class society in general, were a group of young poets, writers, and artists generally known as the "beats" (or, derisively, as "beatniks"). They wrote harsh critiques of what they considered the sterility and conformity of American life, the meaninglessness of American politics, and the banality of popular culture. Allen Ginsberg's dark, bitter poem "Howl" (1955) decried the "Robot apartments! invincible suburbs! skeleton treasuries! blind capitals! demonic industries!" of modern life. Jack Kerouac produced the bible of much of the Beat Generation in his novel On the Road (1957)—an account of a cross-country automobile trip that depicted the rootless, iconoclastic lifestyle of Kerouac and his friends.

The beats were the most visible evidence of a widespread restlessness among young Americans in the 1950s. In part, that restlessness was a result of prosperity itself—of a growing sense among young people of limitless possibilities, and of the declining power of such traditional values as thrift, discipline, and self-restraint. Young middle-class Americans were growing up in a culture that encouraged them to expect rich and fulfilling lives; but they were living in a world in which almost all of them experienced obstacles to that fulfillment.

Tremendous public attention was directed at the phenomenon of "juvenile delinquency," and in both politics and popular culture there were dire warnings about the growing criminality of American youth. The 1955 film Blackboard Jungle, for example, was a frightening depiction of crime and violence in city schools. Scholarly studies, presidential commissions, and journalistic exposés all contributed to the sense of alarm about the spread of delinquency—although in fact youth crime did not dramatically increase in the 1950s.

The culture of alienation that the beats so vividly represented had counterparts even in ordinary middle-class behavior: teenage rebelliousness toward parents, youthful fascination with fast cars and motorcycles, and the increasing visibility of teenage sex, assisted by the greater availability of birth-control devices. The popularity of James Dean, in such movies as Rebel Without a Cause (1955), East of Eden (1955), and Giant (1956), conveyed a powerful image of youth culture in the 1950s. Both in the roles he played (moody, alienated teenagers and young men with a streak of self-destructive violence) and in the way he lived his own life (he died in 1955, at the age of 24, in a car accident), Dean became an icon of the unfocused rebelliousness of American youth in his time.

Patterns of Popular Culture Lucy and Desi The most popular show in the history of television began as an effort by a young comedian to strengthen her troubled marriage. In 1950, Lucille Ball was performing in a popular weekly CBS radio comedy, My Favorite Husband, in which she portrayed a slightly zany housewife who tangled frequently with her banker husband, played by Richard Denning. The network proposed to transfer the show from radio to television.
Lucy said she would do so only if she could replace Denning with her real-life husband of ten years, Desi Arnaz—a Cuban-born bandleader whose almost constant traveling was putting a strain on their marriage. Network officials tried in vain to talk her out of the idea. Arnaz had no acting experience, they told her. Lucy herself recognized another reason for their reluctance: the radicalism of portraying an ethnically mixed marriage on the air. But she held her ground.

On Monday, October 15, 1951, the first episode of I Love Lucy was broadcast over CBS. Desi Arnaz played Ricky Ricardo, a Cuban bandleader and singer who spoke, at times, with a comically exaggerated Latin accent. Lucille Ball was Lucy Ricardo, his stage-struck and slightly dizzy wife. Performing with them were William Frawley and Vivian Vance, who played their neighbors and close friends, Fred and Ethel Mertz. In the premiere episode, "The Girls Want to Go to a Nightclub," Ricky and Fred want to go to a boxing match on the night of Fred and Ethel's anniversary, while the wives are arranging an evening at a nightclub.

The opening episode contained many of the elements that characterized the show throughout its long run and ensured its extraordinary success: the remarkable chemistry among the four principal actors, the unexpected comedic talent of Desi Arnaz, and most of all the brilliance of Lucille Ball. She was a master of physical comedy, and many of her funniest moments involved scenes of absurdly incongruous situations (Lucy working an assembly line, Lucy stomping grapes in Italy). Her characteristic yowl of frustration became one of the most familiar sounds in American culture. She was a beautiful woman, but she never hesitated to make herself look ridiculous. "She was everywoman," her longtime writer Jess Oppenheimer once wrote; "her little expressions and inflections stimulated the shock of recognition in the audience."

But it was not just the great talents of its cast that made I Love Lucy such a phenomenon. It was the skill of its writers in evoking some of the most common experiences and desires of television viewers in the 1950s. Lucy, in particular, mined the frustrations of domestic life for all they were worth, constantly engaging in zany and hilarious schemes to break into show business or somehow expand her world. The husbands wanted calm and conventional domestic lives—and time to themselves for conspicuously male activities: boxing, fishing, baseball.

In the first seasons, the fictional couples lived as neighbors, without children, in a Manhattan apartment building. Later, Lucy had a child and they all moved to the suburbs. (The show used Lucy's real-life pregnancy on the air; and on January 19, 1953—only hours after Lucille Ball gave birth to her real son and second child—CBS aired a previously filmed episode of the fictional Lucy giving birth to a fictional son, "Little Ricky" Ricardo.)

Lucy at Home Although Lucy and Desi at first portrayed a childless, ethnically mixed couple living in a Manhattan apartment, many of the comic situations in the early years of the show were purely domestic. Here, Lucy, wearing an apron, deals with one of her many household predicaments with the extraordinary physical comedy that was part of her great success. Desi, watching skeptically, was a talented straight man to Lucy's zaniness. (© CBS/Photofest)

Vitameatavegamin One of the most popular episodes of I Love Lucy portrays Lucy at a trade show promoting a new health product called "Vitameatavegamin."
In the course of the show, she herself drinks a great deal of the concoction, which has a high alcohol content and leaves her hilariously drunk. (© CBS/Photofest)

Lucille Ball remained a major television star for nearly twenty years after I Love Lucy (and its successor, The Lucille Ball-Desi Arnaz Comedy Hour) left the air in 1960. Desi Arnaz, whom Lucy divorced in 1960, remained for a time one of Hollywood's most powerful and successful studio executives as the head of Desilu Productions. And nearly sixty-five years after the first episode of I Love Lucy aired, the series remains extraordinarily popular all over the world—shown so frequently in reruns that in some American cities it is sometimes possible to see six Lucy episodes in a single evening. "People identified with the Ricardos," Lucille Ball once said, "because we had the same problems they had. We just took ordinary situations and exaggerated them."

Promoting the Show The marriage of Lucille Ball and Desi Arnaz, which paralleled the television marriage of Lucy and Ricky Ricardo, was one of the most effective promotional devices for I Love Lucy. Here, Lucy and Desi pose for a promotional still—one of many they made for advertisements, magazine covers, and posters until their marriage (and the show) dissolved in 1960. (© Photofest)

Understand, Analyze, and Evaluate
• In what ways did I Love Lucy reflect American society and family life of the 1950s?
• How have television situation comedies since I Love Lucy copied the formula for success established by that program? Do you see elements of the I Love Lucy pattern in today's situation comedies?
• Why do you think I Love Lucy has continued to be so popular, both in the United States and throughout the world?

Rock 'n' Roll Elvis Presley One of the most powerful cultural forces for American youth was the enormous popularity of rock 'n' roll—and of the greatest early rock star, Elvis Presley. Presley became a symbol of a youthful determination to push at the borders of the conventional and acceptable. His sultry good looks; his self-conscious effort to dress in the vaguely rebellious style of urban gangs (motorcycle jackets and slicked-back hair); and most of all, the open sexuality of his music and his public performances made him wildly popular among young Americans in the 1950s. His first great hit, "Heartbreak Hotel," established him as a national phenomenon in 1956, and he remained a powerful figure in American popular culture until—and indeed beyond—his death in 1977.

Elvis Elvis Presley is almost certainly the most famous and influential rock musician of the twentieth century. Born in Mississippi and influenced by the African American music of the South, he became a singer in the mid-1950s and continued to be extraordinarily popular until his death in 1977. This photograph shows him very early in his career. (© AP Images)

Rock 'n' Roll's Black Roots Presley's music, like that of most early white rock musicians, drew heavily from black rhythm and blues traditions, which appealed to some white youths in the early 1950s because of their pulsing, sensual rhythms and their hard-edged lyrics. Sam Phillips, a Memphis record promoter who had recorded some of the important black rhythm and blues musicians of his time (among them B. B. King), reportedly said in the early 1950s: "If I could find a white man with a Negro sound, I could make a billion dollars." Soon after that, he found Presley. But there were others as well.
Among them were Buddy Holly and Bill Haley (whose 1955 song "Rock Around the Clock"—used in the film Blackboard Jungle—served to announce the arrival of rock 'n' roll to millions of young people), who were closely connected to African American musical traditions. Rock drew from other sources too: from country western music (another strong influence on Presley), from gospel music, even from jazz. But its most important influence was its roots in rhythm and blues.

The rise of such white rock musicians as Presley was a result in part of the limited willingness of white audiences to accept black musicians. But the 1950s did see a growth in the popularity of African American bands and singers among both black and white audiences. Chuck Berry, Little Richard, B. B. King, Chubby Checker, the Temptations, and others—many of them recorded by the African American producer Berry Gordy, the founder and president of Motown Records in Detroit—never rivaled Presley in their popularity among white youths. But they did develop a significant multiracial audience of their own.

The rapid rise and enormous popularity of rock owed a great deal to innovations in radio and television programming. By the 1950s, radio stations no longer felt obliged to present mostly live programming. Instead, many radio stations devoted themselves to playing recorded music. Early in the 1950s, a new breed of radio announcers, known now as "disc jockeys," began to create programming aimed specifically at young fans of rock music; and when those programs became wildly successful, other stations followed suit. American Bandstand, a televised showcase for rock 'n' roll hits that began in 1957, featured a live audience dancing to mostly recorded music. The show helped spread the popularity of rock—and made its host, Dick Clark, one of the best-known figures among young Americans.

"Payola" Scandals Radio and television were important to the recording industry, of course, because they encouraged the sale of records. Also important were jukeboxes, which played individual songs on 45s (records with one song on each side) and proliferated in soda fountains, diners, bars, and other places where young people were likely to congregate. Sales of records increased threefold—from $182 million to $521 million—between 1954 and 1960. So eager were record promoters to get their songs on the air that some routinely made secret payments to station owners and disc jockeys to encourage them to showcase their artists. These payments, which became known as "payola," produced a briefly sensational series of scandals when they were exposed in the late 1950s and early 1960s.

Chapter 28 section 4

The "Other America" It was relatively easy for white, middle-class Americans in the 1950s to believe that the world they knew—a world of economic growth, personal affluence, and cultural homogeneity—was the world all Americans experienced; that the values and assumptions they shared were ones that most other Americans shared too. But such assumptions were false. Even within the middle class, there was considerable restiveness—among women, intellectuals, young people, and others who found the middle-class consumer culture somehow unsatisfying, even stultifying. More importantly, large groups of Americans remained outside the circle of abundance and shared in neither the affluence of the middle class nor its values. On the Margins of the AffluentSociety The Other America In 1962, the socialist writer Michael Harrington created a sensation by publishing a book called The Other America, in which he chronicled the continuing existence of poverty in America. The conditions he described were not new. Only the attention he was bringing to them was. The great economic expansion of the postwar years reduced poverty significantly but did not eliminate it. In 1960, at any given moment, more than a fifth of all American families (over 30 million people) continued to live below what the government defined as the poverty line (down from a third of all families fifteen years before). Many millions more lived just above the official poverty line, but with incomes that gave them little comfort and no security. Persistent Poverty Most of the poor experienced poverty intermittently and temporarily. Eighty percent of those classified as poor at any particular moment were likely to have moved into poverty recently and might move out of it again as soon as they found a job—an indication of how unstable employment could be at the lower levels of the job market. But approximately 20 percent of the poor were people for whom poverty was a continuous, debilitating reality, from which there was no easy escape. That included approximately half the nation's elderly and a large proportion of African Americans and Hispanics. Native Americans constituted the single poorest group in the country, a result of government policies that undermined the economies of the reservations and drove many Indians into cities, where some lived in a poverty worse than that they had left. These were the people Harrington had written about in The Other America, people who suffered from what he called "a system designed to be impervious to hope." This "hard-core" poverty rebuked the assumptions of those who argued that economic growth would eventually lead everyone into prosperity; that, as many claimed, "a rising tide lifts all boats." It was a poverty that the growing prosperity of the postwar era seemed to affect hardly at all. Rural Poverty Declining Agricultural Prices Among those on the margins of the affluent society were many rural Americans. In 1948, farmers had received 8.9 percent of the national income; in 1956, they received only 4.1 percent. In part, this decline reflected the steadily shrinking farm population; in 1956 alone, nearly 10 percent of the rural population moved into or was absorbed by cities. But it also reflected declining farm prices. Because of enormous surpluses in basic staples, prices fell 33 percent in those years, even though national income as a whole rose 50 percent at the same time. 
Even most farmers who managed to survive experienced substantial losses of income at the same time that the prices of many consumer goods rose. Not all farmers were poor. Some substantial landowners weathered, and even managed to profit from, the changes in American agriculture. Others moved from considerable to only modest affluence. But the agrarian economy did produce substantial numbers of genuinely impoverished people.

Black sharecroppers and tenant farmers continued to live at or below subsistence level throughout the rural South—in part because of the mechanization of cotton picking beginning in 1944, in part because of the development of synthetic fibers that reduced demand for cotton. (Two-thirds of the cotton acreage of the South went out of production between 1930 and 1960.) Migrant farmworkers, a group concentrated especially in the West and Southwest and containing many Mexican American and Asian American workers, lived in similarly dire circumstances. In rural areas without much commercial agriculture—such as the Appalachian region in the East, where the decline of the coal economy undermined the region's one significant source of support—whole communities lived in desperate poverty, increasingly cut off from the market economy. All these groups were vulnerable to malnutrition and even starvation.

The Inner Cities Black Urban Migration As white families moved from cities to suburbs in vast numbers, many inner-city neighborhoods became vast repositories for the poor, "ghettos" from which there was no easy escape. The growth of these neighborhoods owed much to a vast migration of African Americans out of the countryside and into industrial cities. More than 3 million black men and women moved from the South to northern cities between 1940 and 1960. Chicago, Detroit, Cleveland, New York, and other eastern and midwestern industrial cities experienced a great expansion of their black populations—both in absolute numbers and, even more, as a percentage of the whole, since so many whites were leaving these cities at the same time.

Similar migrations from Mexico and Puerto Rico expanded poor Hispanic neighborhoods at the same time. Between 1940 and 1960, nearly a million Puerto Ricans moved into American cities (the largest group to New York). Mexican workers crossed the border in Texas and California and swelled the already substantial Latino communities of such cities as San Antonio, Houston, San Diego, and Los Angeles (which by 1960 had the largest Mexican American population of any city, approximately 500,000 people).

Why so many inner-city communities, populated largely by racial and ethnic minorities, remained so poor in the midst of growing affluence has been the subject of considerable debate. Some critics have argued that the new migrants were victims, in part, of their own pasts, that the work habits, values, and family structures they brought with them from their rural homes were poorly adapted to the needs of the modern industrial city. Others have argued that the inner city itself—its crippling poverty, its crime, its violence, its apparent hopelessness—created a "culture of poverty" that made it difficult for individuals to advance. Many others argue that a combination of declining blue-collar jobs, inadequate support for minority-dominated public schools, and barriers to advancement rooted in racism—not the culture and values of the poor themselves—was the source of inner-city poverty.
It is indisputable that inner cities were filling up with poor minority residents at the same time that the unskilled industrial jobs they were seeking were diminishing. Employers were relocating factories and mills from old industrial cities to new locations in suburbs, smaller cities, and even abroad—places where the cost of labor was lower. Even in the factories that remained, automation was reducing the number of unskilled jobs. The economic opportunities that had helped earlier immigrant groups to rise up from poverty were unavailable to most of the postwar migrants. Nor can there be any doubt that historic patterns of racial discrimination in hiring, education, and housing doomed many members of these communities to continuing, and in some cases increasing, poverty.

"Urban Renewal" For many years, the principal policy response to the poverty of inner cities was "urban renewal": the effort to tear down buildings in the poorest and most degraded areas. In the twenty years after World War II, urban renewal projects destroyed over 400,000 buildings, among them the homes of nearly 1.5 million people. In some cases, urban renewal provided new public housing for poor city residents. Some of it was considerably better than the housing they left; some of it was poorly designed and constructed, and deteriorated rapidly into dismal and dangerous slums.

African American Migration, 1950-1980 Although there had been a substantial migration of African Americans out of the South and into northern industrial cities around the time of World War I and again during World War II, that process accelerated in the thirty years after 1950. By 1980, fewer southern states had black populations that accounted for 25 percent or more of their total population than in 1950. In the rest of the country, the number of states whose black populations exceeded 5 and 10 percent (the states shaded orange and purple) greatly increased. • What were some of the factors that produced the African American migration in this period?

Chapter 28 section 5

The Rise of the Civil Rights Movement After decades of skirmishes, an open battle began in the 1950s against racial segregation and discrimination. Although white Americans played an important role in the civil rights movement, pressure from African Americans themselves was the crucial element in raising the issue of race to prominence.

The Brown Decision and "Massive Resistance" Brown v. Board of Education On May 17, 1954, the Supreme Court announced its decision in the case of Brown v. Board of Education of Topeka. In considering the legal segregation of a Kansas public school system, the Court rejected its own 1896 Plessy v. Ferguson decision, which had ruled that communities could provide blacks with separate facilities as long as the facilities were equal to those of whites.

The Brown decision was the culmination of many decades of effort by black opponents of segregation, and particularly by a group of talented NAACP lawyers, many of them trained at Howard University in Washington, D.C., by the great legal educator Charles Houston. Thurgood Marshall, William Hastie, James Nabrit, and others spent years filing legal challenges to segregation in one state after another, nibbling at the edges of the system, and accumulating precedents to support their assault on the "separate but equal" doctrine itself. The same lawyers filed the suits against the school boards of Topeka, Kansas, and several other cities that became the basis for the Brown decision.

"Separate but Equal" Doctrine Overturned The Topeka suit involved the case of an African American girl who had to travel several miles to a segregated public school every day even though she lived virtually next door to a white elementary school. When the case arrived before the Supreme Court, the justices examined it not simply in terms of legal precedent but in terms of history, sociology, and psychology. They concluded that school segregation inflicted unacceptable damage on those it affected, regardless of the relative quality of the separate schools. Chief Justice Earl Warren explained the unanimous opinion of his colleagues: "We conclude that in the field of public education the doctrine of 'separate but equal' has no place. Separate educational facilities are inherently unequal." The following year, the Court issued another decision (known as "Brown II") to provide rules for implementing the 1954 order. It ruled that communities must work to desegregate their schools "with all deliberate speed," but it set no timetable and left specific decisions up to lower courts.

"Massive Resistance" In some communities—for example, Washington, D.C.—compliance came relatively quickly and quietly. More often, however, strong local opposition (what came to be known in the South as "massive resistance") produced long delays and bitter conflicts. Some school districts ignored the ruling altogether. Others attempted to circumvent it with purely token efforts to integrate. More than 100 southern members of Congress signed a "manifesto" in 1956 denouncing the Brown decision and urging their constituents to defy it. Southern governors, mayors, local school boards, and nongovernmental pressure groups (including hundreds of "White Citizens' Councils") all worked to obstruct desegregation. Many school districts enacted "pupil placement laws" allowing school officials to place students in schools according to their scholastic abilities and social behavior.
Such laws were transparent devices for maintaining segregation; but in 1958, the Supreme Court (in Shuttlesworth v. Birmingham Board of Education) refused to declare them unconstitutional. By the fall of 1957, only 684 of 3,000 affected school districts in the South had even begun to desegregate their schools. Many white parents simply withdrew their children from the public schools and enrolled them in all-white "segregation academies"; some state and local governments diverted money from newly integrated public schools and used it to fund the new, all-white academies. The Brown decision, far from ending segregation, had launched a prolonged battle between federal authority and state and local governments, and between those who believed in racial equality and those who did not.

Little Rock The Eisenhower administration was not eager to commit itself to that battle. The president himself had greeted the Brown decision with skepticism (and once said it had set back progress on race relations "at least fifteen years"). But in September 1957, he faced a case of direct state defiance of federal authority and felt compelled to act. Federal courts had ordered the desegregation of Central High School in Little Rock, Arkansas. An angry white mob tried to prevent implementation of the order by blockading the entrances to the school, and Governor Orval Faubus refused to do anything to stop the obstruction. President Eisenhower responded by federalizing the Arkansas National Guard and sending troops to Little Rock to restore order and ensure that the court orders would be obeyed. Only then did Central High School admit its first black students.

Little Rock, Arkansas African American student Elizabeth Eckford passes by jeering whites on her way to Little Rock Central High School, newly integrated by federal court order. The black students later admitted that they had been terrified during the first difficult weeks of integration. But in public, most of them acted with calm and dignity. (© Everett Collection/SuperStock)

The Expanding Movement The Brown decision helped spark a growing number of popular challenges to segregation in the South. On December 1, 1955, Rosa Parks, an African American woman, was arrested in Montgomery, Alabama, when she refused to give up her seat on a Montgomery bus to a white passenger. Parks, an active civil rights leader in the community, had apparently decided spontaneously to resist the order to move. Her feet were tired, she later explained. But black leaders in Montgomery had been waiting for such an incident, which they wanted to use to challenge the segregation of the buses. The arrest of this admired woman produced outrage in the city's African American community and helped local leaders organize a successful boycott of the bus system to demand an end to segregated seating.

Montgomery Bus Boycott The bus boycott owed much of its success to the prior existence of well-organized black citizens' groups. A black women's political caucus had, in fact, been developing plans for a boycott of the segregated buses for some time. They seized on Rosa Parks as a symbol of the movement. Once launched, the boycott was almost completely effective. Black workers who needed to commute to their jobs (of whom the largest group consisted of female domestic servants) formed car pools to ride back and forth to work, or simply walked, at times over long distances.
The boycott put economic pressure not only on the bus company (a private concern) but on many Montgomery merchants as well, because the boycotters found it difficult to get to downtown stores and tended to shop instead in their own neighborhoods. Still, the boycott might well have failed had it not been for a Supreme Court decision late in 1956, inspired in part by the protest, that declared segregation in public transportation to be illegal. The buses in Montgomery abandoned their discriminatory seating policies, and the boycott came to a close.

An important result of the Montgomery boycott was the rise to prominence of a new figure in the movement for civil rights. The man chosen to head the boycott movement was a local Baptist pastor, Martin Luther King Jr., the son of a prominent Atlanta minister, a powerful orator, and a gifted leader. At first King was reluctant to lead the movement. But once he accepted the role, he became consumed by it.

Martin Luther King's Strategy King's approach to black protest was based on the doctrine of nonviolence—that is, of passive resistance even in the face of direct attack. He drew from the teachings of Mahatma Gandhi, the Indian nationalist leader; from Henry David Thoreau and his doctrine of civil disobedience; and from Christian doctrine. And he produced an approach to racial struggle that captured the moral high ground for his supporters. He urged African Americans to engage in peaceful demonstrations; to allow themselves to be arrested, even beaten, if necessary; and to respond to hate with love. For the next thirteen years—as leader of the Southern Christian Leadership Conference, an interracial group he founded shortly after the bus boycott—he was the most influential and most widely admired black leader in the country. The popular movement he came to represent soon spread throughout the South and throughout the country.

Pressure from the courts, from northern liberals, and from African Americans themselves also speeded the pace of racial change in other areas. One important color line had been breached as early as 1947, when the Brooklyn Dodgers signed Jackie Robinson as the first African American to play Major League baseball. By the mid-1950s, blacks had established themselves as a powerful force in almost all professional sports. Within the government, President Eisenhower completed the integration of the armed forces, attempted to desegregate the federal workforce, and in 1957 signed a civil rights act (passed, without active support from the White House, by a Democratic Congress) providing federal protection for African Americans who wished to register to vote. It was a weak bill, with few mechanisms for enforcement, but it was the first civil rights bill of any kind to win passage since the end of Reconstruction, and it served as a signal that the executive and legislative branches were beginning to join the judiciary in the federal commitment to the "Second Reconstruction."

Causes of the Civil Rights Movement Why did a civil rights movement begin to emerge at this particular moment? The injustices it challenged and the goals it promoted were hardly new; in theory, African Americans could have launched the same movement fifty or a hundred years earlier, or decades later. Why did they do so in the 1950s and 1960s?

Legacy of World War II Several factors contributed to the rise of African American protest in these years. The legacy of World War II was one of the most important.
Millions of black men and women had served in the military or worked in war plants during the war and had derived from the experience a broader view of the world, and of their place in it.

Urban Black Middle Class Another factor was the growth of an urban black middle class, which had been developing for decades but which began to flourish after the war. Much of the impetus for the civil rights movement came from the leaders of urban black communities—ministers, educators, professionals—and much of it came as well from students at black colleges and universities, which had expanded significantly in the previous decades. Men and women with education and a stake in society were often more aware of the obstacles to their advancement than poorer and more oppressed people, to whom the possibility of advancement may have seemed too remote even to consider. And urban blacks had considerably more freedom to associate with one another and to develop independent institutions than did rural blacks, who were often under the very direct supervision of white landowners.

Television and other forms of popular culture were another factor in the rising consciousness of racism among blacks. More than any previous generation, postwar African Americans had constant, vivid reminders of how the white majority lived—of the world from which they were effectively excluded. Television also conveyed the activities of demonstrators to a national audience, ensuring that activism in one community would inspire similar protests in others.

In addition to the forces that were inspiring African Americans to mobilize, other forces were at work mobilizing many white Americans to support the movement once it began. One was the Cold War, which made racial injustice an embarrassment to Americans trying to present their nation as a model to the world. Another was the political mobilization of northern blacks, who were now a substantial voting bloc within the Democratic Party; politicians from northern industrial states could not ignore their views. Labor unions with substantial black memberships also played an important part in supporting (and funding) the civil rights movement.

