Volume 2, Chapter 11
The Cold War Boom, 1946-1960
“America at this moment stands at the summit of the world,” announced Winston Churchill in August 1945. That same month, Walter Lippmann wrote, “What Rome was to the ancient world, what Great Britain has been to the modern world, America is to be to the world of tomorrow.” The former British prime minister and America’s most respected newspaper columnist stood in awe for good reason. During World War II, the United States had mobilized an army of twelve million and had bankrolled or equipped an equal number of Allied troops. It had assembled the world’s largest navy and air force and had built the atomic bombs that wiped out two Japanese cities. But friend and foe alike found the great size and vibrancy of the U.S. economy most impressive. In the four wartime years, U.S. income, wealth, and industrial production all doubled or more than doubled. By 1947, the United States produced roughly half the world’s manufactures: 57 percent of the steel, 43 percent of the electricity, and 62 percent of the oil. America now dominated in precisely those industries—aviation, chemical engineering, and electronics—that spelled victory in modern war.
This enormous military and industrial power enabled the United States to become the guardian of a postwar Pax Americana, a restructuring of international politics and finance in a way that made them more responsive to U.S. interests. The United States now indissolubly linked its interests to a global order that depended on American economic strength and political will. This melding of foreign and economic policy created tensions with the other great world power, the Soviet Union. The resulting “cold war” pitted East against West and served as a backdrop to the impressive midcentury U.S. economic growth. In the immediate postwar years, more than ever before, international developments shaped the everyday lives of American women and men while New Deal pension, wage, and home loan guarantees provided the economic security that enabled many Americans to enjoy the fruits of the postwar economic boom.
The Cold War in a Global Context
Economic growth and political stability in the United States stood in stark contrast to the radical transformations that swept the rest of the globe in the years immediately following World War II. In 1947, Churchill described Europe as “a rubble-heap, a charnel house, a breeding ground of pestilence and hate.” There, depression and war so discredited the old elites that many people doubted that capitalism could long survive. Although the Soviet dictator Joseph Stalin had betrayed the ideals of the Russian Revolution to establish a brutal and despotic regime, millions of people throughout the world still looked to the Soviet Union as an alternative to capitalism, which they identified with the chaos of the Great Depression and World War II. In France, the Communists and Socialists, who had led the anti-Nazi resistance, were far more popular than business leaders were, and even many conservatives there supported the postwar nationalization of the country’s most important banks and manufacturing firms. In Britain, the Labour Party swept Churchill and the Conservatives out of power, raising the possibility that America’s closest ally might embrace socialism. In Greece, Italy, and Yugoslavia, Communist parties seemed on the verge of assuming power. Meanwhile, powerful independence movements emerged in Asia and Africa, shaking the colonial empires that France, Britain, and the Netherlands had built. After the 1949 revolution in China, American policymakers became more hostile to nationalist and reform movements in the colonial world and turned the Cold War “hot” by responding militarily to the North Korean invasion of South Korea and by supporting the French armies in Indochina.
Origins of the Cold War
Nothing is inevitable in history or politics, but the Cold War antagonism that soon divided “East” and “West” would have been exceedingly hard to avoid. The Soviets sought to dominate a buffer zone of satellite states along Russia’s historic borders to prevent invasion from the west while at the same time probing for political and social weaknesses in Western Europe. In wartime conferences, American policymakers seemed to accept the idea of a Soviet sphere of influence in Eastern Europe. At a meeting in the Crimean resort city of Yalta in February 1945, Roosevelt and Churchill thought it fruitless to oppose Soviet dominance in Poland once the Red Army had fought its way across that nation. After mid-1945, however, President Harry Truman grew uncomfortable with these arrangements. Although Stalin showed no interest in an invasion of Western Europe, the Soviet Union’s suppression of nationalist aspirations in Eastern Europe, combined with its encouragement of anticapitalist and anticolonial movements worldwide, led Western leaders to view the Soviet Union as an inherently expansionist power.
The intensity of the Cold War—the costly and dangerous militarization of the rivalry; the all-encompassing, global character of the antagonism; and its protracted, half-century length—might have been mitigated through careful statesmanship. U.S. leaders believed that they had a duty to create a new economic order. Henry Luce, an influential spokesman for American internationalism and the publisher of Time and Life, urged readers of those magazines to “go over the earth, as investors and managers and engineers, as makers of mutual prosperity, as missionaries of capitalism and democracy.” Although the Soviets desperately needed assistance in rebuilding their country, President Truman canceled lend-lease aid to the Soviet Union almost immediately after Germany’s surrender, convinced that the United States had to “stop babying the Soviets.” In Poland, where Americans wanted elections, the Soviets suppressed opposition parties to keep that buffer state firmly under their control. And much haggling took place over the international control of atomic energy. Although some in the United States were willing to give the newly formed United Nations a role in this area, President Truman and military leaders insisted on a system of mandatory international inspections that would preserve U.S. atomic supremacy. The Soviets, rejecting this U.S. plan, forged ahead with a secret program to build their own atom bomb, which they would test in 1949.
Containing the Soviets and Dividing Europe
Two influential individuals, George F. Kennan and Winston Churchill, helped to codify and globalize the meaning of these growing tensions, which ultimately led to a divided Europe. Early in 1946, Kennan, the highest-ranking U.S. diplomat in Moscow, cabled an explosive 8,000-word assessment of Soviet intentions to State Department officials in Washington. Kennan’s “long telegram” quickly circulated throughout the federal government. The influential journal Foreign Affairs published an even more ideological, anti-Communist version, under the pseudonym “Mr. X.” Both traditional Russian insecurities and Marxist-Leninist dogma motivated Soviet leaders, wrote Kennan. These leaders based their totalitarian rule in the Soviet Union on an irrational fear of capitalist encirclement and of foreign hostility. To counter the Soviets, Kennan wrote, the United States should pursue a policy of “firm containment, designed to confront the Russians with unalterable counterforce at every point.” Kennan’s strategy of containment came to characterize the Cold War posture that the West adopted toward the Soviets for more than a generation. Although Kennan construed containment largely in economic and diplomatic terms, the strategy soon took on a decidedly military cast as Cold War antagonisms deepened.
In March 1946, Winston Churchill gave the idea of containment a powerful rhetorical flourish. In one of the most famous speeches of the twentieth century, Churchill warned that an “Iron Curtain” of Soviet domination had descended across central Europe, and he called for a new British-American military alliance to oppose it. Many Americans, including some within the policymaking elite, thought Churchill’s harsh speech unnecessarily provocative, given that in early 1946, the Soviets tolerated some pluralism and multiparty competition in Finland, Austria, Hungary, and Czechoslovakia. But Churchill’s Iron Curtain speech bolstered arguments that compromise or negotiation with the Soviet Union would prove fruitless.
Thus, the ideological confrontation with the Soviets soon turned into a military and economic projection of U.S. power, first in the Eastern Mediterranean and then throughout all of Western Europe. In Greece, President Truman faced a crisis early in 1947. There, as in so many other Eastern European societies, business and landowning elites had been discredited by their collaboration with the fascists during World War II. Local Communists, emerging from the anti-Nazi resistance, took up arms; soon, civil war engulfed the nation. The British government backed conservative Greek monarchists with troops and financial aid, but in February 1947, officials of Britain’s new Labour government informed Truman that Britain could no longer afford to assist anti-Communist forces.
President Truman and his advisers decided the United States should fill the political and military breach left by the British. However, they faced a skeptical American public and a fiscally conservative Republican Congress. Truman resolved this difficulty by announcing what came to be known as the Truman Doctrine. Requesting $400 million in economic and military aid for Greece and Turkey, America’s new ally in Asia Minor, the president framed his goals in sweeping terms: “At the present moment in world history, nearly every nation must choose between alternative ways of life.” To win popular support for this open-ended initiative, Truman took the advice of Republican senator Arthur Vandenberg “to scare [the] hell out of the country.” Whatever the merits of the particular conflict, Truman argued, “I believe that it must be the policy of the United States to support free peoples who are resisting attempted subjugation by armed minorities or by outside pressures.” Truman defined political upheaval as inherently undemocratic. With money and arms, America would guarantee the political status quo in Greece and defend friendly regimes elsewhere in the world.
The administration followed the Truman Doctrine within a few months with an even more ambitious program, the $13 billion Marshall Plan for the reconstruction of Europe (roughly $100 billion in current U.S. dollars). Secretary of State George Marshall proposed the plan in a June 1947 speech. It offered aid even to the Communist regimes of Eastern Europe, but only under conditions that would link their economies to the West, thus threatening their role as buffer states for the Soviet Union. When the Soviet Union forced those nations to reject Marshall Plan assistance, the economic partition of Europe was confirmed. And when Czech Communists seized control of Prague’s coalition government in what became known as the “coup” of February 1948, Congress overwhelmingly approved funding for the Marshall Plan.
A tangible symbol of American wealth and generosity, the Marshall Plan scored a twofold victory in much of Europe. It strengthened the hand of conservatives in countries such as France, Italy, Greece, and Belgium, where Communist movements had strong support. And it sparked a powerful economic recovery in countries with an educated work force, a well-built infrastructure, and a social democratic tradition, such as Germany, Norway, the Netherlands, Great Britain, and France. Those countries were able to take advantage of Marshall Plan aid and to share its fruits relatively equitably. But in other parts of Europe—such as Italy, Portugal, Greece, and Spain—the Marshall Plan proved less successful. There, foreign aid merely reinforced existing inequalities and inefficiencies, lending support to corrupt or authoritarian governments for more than two decades.
The economic and political division of Europe led inexorably to a military divide as well. The central issue remained the revival of German power, feared both by the Soviets and by many in the West. In June 1948, U.S.-led efforts to link the currency in Berlin’s western sector to that of West Germany alarmed the Soviets: Berlin lay deep within the Soviet zone of occupation. They responded by blockading all Western road, rail, and canal traffic into the former German capital. President Truman ordered a spectacular airlift, which triumphantly supplied the city’s residents with coal, food, and clothing for nearly a year. By May 1949, when Stalin lifted the blockade, Berlin had become an international symbol of resistance to Soviet intimidation.
Against this contentious backdrop, the United States pushed for the creation of a Western European military alliance, called the North Atlantic Treaty Organization (NATO), and the eventual rearmament of West Germany. Policymakers and analysts saw NATO as part of a strategy of “double containment,” in which the revival of Germany, contained by a military alliance that was sensitive to French fear of German economic and military power, would in turn help to contain the Soviet Union. The Soviets replied with their own military alliance, the Warsaw Pact, thereby polarizing Europe further into two mutually hostile camps. In Eastern Europe, Communist governments suppressed all opposition political parties and institutions; in Western Europe, NATO froze the Communists out of the governments of France and Italy, halting the political and economic experimentation that was under way there.
Cold War Showdowns Outside Europe
While the Cold War solidified European political allegiances, upheaval continued in much of the rest of the world. From Africa to Iran, from India to Southeast Asia and China, World War II had undermined Western colonial powers, unleashing a great wave of nationalism. Burma, Indonesia, India, and the Philippines soon achieved independence, followed a decade later by most of the Western colonies in Africa. Anticolonial nationalists often allied themselves with revolutionary social movements. In Vietnam and China, Communists under the leadership of Ho Chi Minh and Mao Zedong created powerful military insurgencies that championed the peasants’ need for land as well as the nationalism of most urban workers and intellectuals. In India, the British imprisoned Mohandas Gandhi, Jawaharlal Nehru, and other nationalists during World War II, but jail merely enhanced their political and moral stature. In 1947, India was partitioned into two independent nations: Muslim Pakistan and predominantly Hindu India.
During the war, the United States looked with some favor on anticolonial movements in Asia, especially those that fought against the Japanese occupying forces. American foreign service officers who met with Chinese Communists during the war praised them as patriots and “land reformers,” even as U.S. military advisers grew increasingly frustrated with the corrupt rule of the pro-Western Chinese dictator Chiang Kai-shek, who preferred to deploy his army against the Communists rather than the Japanese. In September 1945, before honored guests from the U.S. intelligence services, the Vietnamese Communist Ho Chi Minh declared his nation’s freedom from France in a speech that borrowed much language from the American Declaration of Independence.
But Cold War tensions transformed U.S. attitudes toward these left-wing, anticolonial movements. Although the United States provided Chiang Kai-shek with billions in military aid, Mao’s Communist army took power in October 1949, forcing the Nationalists to flee to the island of Taiwan. Many Americans soon asked, “Who lost China?” Truman’s new secretary of state, Dean Acheson, argued that the civil war there “was the product of internal Chinese forces, forces which this government tried to influence but could not.” But Republicans and other conservatives, many with prewar missionary experience on the Asian mainland, accused the State Department of being “soft” on the Red Chinese. American policymakers became more hostile to nationalist and reform movements in the colonial world. In 1950, the United States began to send military aid to the French, who were fighting Ho Chi Minh’s forces in Indochina.
The Soviet explosion of an atomic bomb in August 1949, combined with the “loss” of China, accelerated both the arms race and the militarization of U.S. diplomacy. In January 1950, President Truman gave the go-ahead for the development of a controversial new “super” bomb, a thermonuclear weapon that was hundreds of times more powerful than the atomic bomb that destroyed Hiroshima. First tested late in 1952, the new hydrogen bomb threw off a fireball—five miles high and four miles wide—that vaporized an island in Eniwetok Atoll in the Marshall Islands. Within a year, the Soviets began their own H-bomb tests, eventually exploding a monster weapon with the power of fifty million tons of TNT.
The United States accompanied this new stage in the nuclear arms race with a plan for tripling the nation’s arms budget. A secret report by the National Security Council, NSC-68, assumed that the Soviets were preparing a military assault on one or more of the “Free World’s” outposts. “We must realize that we are now in a mortal conflict; that we are now in a war worse than any we have experienced,” argued Robert Lovett, one of the report’s influential authors. “It is not a cold war; it is a hot war. The only difference between this and previous wars is that death comes more slowly.”
The “hot war” fears of NSC-68 seemed fulfilled on June 25, 1950, when a Communist North Korean army crossed the 38th parallel into South Korea, where American troops had once been stationed. Although Stalin approved the invasion, its driving force was the North Korean revolutionary Kim Il Sung, who maneuvered Stalin into backing his bid to reunify the peninsula and oust the American-backed conservatives who ruled the South. American policymakers instantly imposed a global, strategic template on what was essentially a regional struggle. Truman quickly won United Nations backing for the dispatch of U.S. Army troops to South Korea. (The Soviets, who might have vetoed this U.N. action, had absented themselves from meetings of the U.N. Security Council in protest over the United Nations’ failure to admit Communist-led China.) Despite early victories under General Douglas MacArthur, the invasion of Chinese troops across the Korean-Chinese border turned the war into a bloody, three-year stalemate. Korean casualties mounted to more than a million killed and wounded, and the United States lost 34,000 men before signing a July 1953 truce that left the Communists still in control of North Korea.
The Korean War made it politically possible for the Truman administration to triple defense spending to nearly $50 billion a year. By 1955, the United States had hundreds of military bases in thirty-six countries. New alliance systems, such as the Southeast Asia Treaty Organization (SEATO) and the Rio Pact, kept Asian and Latin American states friendly to the United States through a combination of diplomacy, foreign aid, and covert manipulation of their newspapers, politicians, and trade unions. In Korea, Spain, and the Philippines, the mere presence of large U.S. military bases bolstered authoritarian governments. But clandestine operations, often orchestrated by the newly formed Central Intelligence Agency, sometimes supported pro-U.S. military coups d’état. In Iran (1953) and Guatemala (1954) and later in Brazil (1964) and Chile (1973), the CIA cooperated closely with right-wing military officers in removing popularly elected liberal or leftist leaders who threatened U.S. business interests. U.S. troops also intervened directly in Vietnam and the Dominican Republic, while covert American operations reached into the Congo.
The New Deal Under Attack
Although the Great Depression had discredited business leaders, the successful wartime production effort seemed to prove that American capitalism worked and that corporate managers, in government and outside it, should once again be trusted public figures. As H. W. Prentis, a prominent spokesman for the National Association of Manufacturers, asserted during the war, “it is not government that has wrought the miracle that is being accomplished today in the production of war materials but the initiative, ingenuity and organizing genius of private enterprise.” Business leaders knew that they might have to pay higher wages to organized workers, but they rejected the idea of democratic power sharing in shops and offices. Industrial unions, executives complained, deprived them of the power to assign work as they saw fit, to fire unsuitable employees, and to speed up production. A larger conservative effort to dismantle the Roosevelt electoral coalition accompanied this corporate counterattack against labor. Together, they sought to repulse the Truman administration’s efforts to advance New Deal-style legislation, including more public housing and a system of national health insurance. Southern Democrats became increasingly reactionary at the same time that Republicans discovered the usefulness of anti-Communism as an issue they could use to discredit New Deal liberals and trade union officials. In the 1950s, President Dwight D. Eisenhower’s “Modern Republicanism” marginalized the most virulent forms of anti-Communism and, for a time, created the appearance of an accord between labor and management.
Capitalism Regains the Initiative
In their postwar campaign to regain “the right to manage,” employers enlisted the state as an ally. By 1947, more than seventy antilabor bills had been introduced in the House of Representatives. Companies in construction, transport, and retail sales wanted to outlaw the secondary boycott, which the Teamsters and Longshoremen used to pressure antiunion employers whose goods they handled. Employers in the South and West simply wanted to stop the spread of unionism, and Republicans especially feared the CIO, whose Political Action Committee had helped to ensure Roosevelt’s reelection in 1944. Finally, conservatives and some liberals wanted to force the Communists out of the union movement. The Republican-dominated Congress that took power in 1947 lent a sympathetic ear to such antiunion interests; by June, a coalition of Republicans and southern Democrats had achieved a landmark revision of the New Deal labor law by overriding Truman’s veto of the Taft-Hartley Act.
The Taft-Hartley Act, named for its two principal sponsors, Representative Fred Hartley of New Jersey and the GOP stalwart Senator Robert Taft of Ohio, signaled a major shift in the tenor of class relations in the United States. The new law deprived foremen of the protection that the Wagner Act afforded workers, made sympathy strikes and boycotts more difficult to carry out, and allowed states (in practice, those in the South and mountain West with weak union movements) to ban the union shop. The act also legalized employer “free speech” during union organizing campaigns, which gave managers a greater opportunity to intimidate workers before a National Labor Relations Board (NLRB) election. And it gave the federal government considerable veto power over union politics and strikes. Labor leaders had to declare themselves to be non-Communist if they wanted their unions to participate in NLRB elections. And if they headed large unions, they had to bargain knowing that the president could postpone for eighty days any strike that was deemed a “national emergency”—a power that Truman used thirty-seven times in the remainder of his term. None of these restrictions made the Taft-Hartley Act the “slave labor law” unionists called it, but together they helped to deradicalize the union movement, curb interunion solidarity, and keep the movement from organizing new regions and new workers.
The Failure of Interracial Solidarity
Though the tides of public sentiment, congressional votes, and administration policy were all shifting against unions, the labor movement and its liberal allies were not without resources to mount a counterattack. The unions’ postwar political strategy was two-pronged: (1) a concerted campaign to organize the South, called Operation Dixie, and (2) a political comeback in 1948, based on the reform and realignment of the Democratic Party.
In Operation Dixie, the CIO sought to build a counterweight to the political power of the racist landlords and reactionary employers who dominated the postwar South. During World War II, labor shortages and government wage guidelines pushed wages upward more rapidly in the South than in any other region, and the unions had organized more than 800,000 southern workers, one-third of them Black. In the Deep South, Black veterans, many in uniform, marched boldly into rural courthouses to demand the right to register and vote. Such patriotic, labor-based civil rights initiatives helped to double African American voter registration during the 1940s. Starting in 1946, northern unions, especially those in the CIO, hired hundreds of organizers, opened scores of offices, and began vigorous organizing campaigns in textiles, lumber, tobacco processing, and other southern industries. “When Georgia is organized,” predicted a leader of the union drive, “you will find our old friend Gene Talmadge [the conservative governor of Georgia] trying to break into the doors of the CIO conventions and tell our people that he has always been misunderstood.”
But Operation Dixie failed. Resistance from the political and industrial leadership of the white South proved overwhelming: during the next few years, the proportion of unionized southern workers actually declined. City officials, churches, and police bitterly opposed “outside” organizers in the employer-dominated textile towns and lumber mill villages across the South. And although Black workers proved exceptionally union-conscious, many white southerners rejected interracial solidarity. Union organizers found that “mixed” meetings could be held only outdoors, that interracial handshakes were taboo, and that African American participation had to be downplayed. Facing physical intimidation from vigilantes and lawmen alike, Operation Dixie organizers were often beaten and run out of town. Organizing the South in the late 1940s would have required a massive interracial campaign by a militant CIO leadership that enjoyed backing in the northern Democratic Party and throughout the labor movement. But President Truman was ambivalent, and labor’s ranks were increasingly divided over the role of Communists and other radicals.
The 1948 Election
The failure of Operation Dixie meant that unions would not transform southern politics. But labor leaders still hoped to “realign” the American political system, either by building the power of labor, small farmers, and African Americans within the Democratic Party (and thereby pushing most of the conservatives into the GOP) or by creating an entirely new third party based on a liberal-labor coalition. Until the spring of 1948, most union leaders therefore opposed Harry Truman as the Democratic presidential candidate. Many, including more than half of all CIO officials, expressed interest in forming a third party.
In 1948, the Communists, supported by a slice of the old New Deal coalition, formed a new Progressive Party, nominating former vice president Henry Wallace for president. Wallace’s vision of an expanded New Deal, of racial egalitarianism, and of peaceful coexistence with the Soviet Union differed sharply from the outlook of policymakers in the Truman administration: late in 1946, Truman had dismissed him as secretary of commerce after Wallace made a speech critical of the administration’s tough line toward the Soviets.
But rather than realigning American politics, the Wallace effort put an end to political experimentation and wed labor even more closely to the Democrats. Anti-Communist liberals, including Eleanor Roosevelt and Walter Reuther, denounced Wallace as politically naive, called for the elimination of Communist influence in all liberal and labor organizations, and supported Truman’s tough stance toward the Soviet Union. The CIO and the AFL rejected the Progressive Party and endorsed Harry Truman’s candidacy.
In the 1948 election, most observers assumed that voters would put Republican Thomas Dewey, the financially well-connected governor of New York, into the White House. But Truman surprised everyone. A Missouri politician with traditional racial attitudes, Truman knew that his reelection would hinge, in the words of his adviser Clark Clifford, on winning the support of “labor and the urban minorities.” His administration therefore shifted leftward. Fearful that Wallace’s Progressive Party would appeal to liberal Democrats in the North, Truman made civil rights a major presidential priority for the first time in seventy-five years. He called on Congress to pass a new Fair Employment Practices Act that would end job discrimination, and in July 1948, Truman signed an executive order that desegregated the armed forces. He thereby capitulated to the antidiscrimination protest campaign led by the African American union leader A. Philip Randolph. Truman pushed for national health insurance and a big public housing program and promised to work with a new Democratic Congress to repeal the Taft-Hartley Act. Denouncing “Wall Street Republicans” on a frenetic whistle-stop campaign across the country, he defeated Dewey and marginalized Wallace by galvanizing midwestern farmers and urban workers who had been part of the old Roosevelt coalition. When Truman won in November, he excitedly told the press, “Labor did it!”
Truman’s victory in 1948 put the Democrats back in control of Congress, but conservatives retained a working majority there. Many Southern Democrats, who mounted their own “States’ Rights” presidential campaign in 1948, with Strom Thurmond of South Carolina heading the ticket, no longer saw the Democratic Party as a reliable bulwark of white supremacy. Over the next third of a century, they would abandon their old party and shift to either the Republicans or an even more conservative third party. When Truman sought to pass his liberal “Fair Deal” legislative program, a coalition of Republicans and Southern “Dixiecrats” blocked his initiatives at every turn. Congress did pass a National Housing Act in 1949, but it led primarily to the building of low-cost urban housing projects, which soon turned into slums.
The drive toward civil rights stalled as well. After Truman partisans marginalized the Progressive Party, most Democratic leaders sought to regain the loyalty of the white South. Truman’s commitment to Black equality flagged during his second term, and in 1952, the Democratic presidential nominee, Adlai Stevenson, downplayed civil rights even further, choosing for his running mate an Alabama senator who stood for the maintenance of the status quo. Southern politicians and business leaders soon mobilized against the extension of federal power, the spread of unions, and the push for civil rights. The growth in African American voting strength came to a halt in the early 1950s. Thereafter, xenophobic anti-Communism and outright appeals to racism increasingly characterized southern election campaigns. “Northern political labor leaders have recently ordered that all doors be opened to Negroes on union property,” declared one election flyer. “This will lead to whites and Negroes working and living together. . . . Do you want that?”
The Weapon of Anti-Communism
America’s encounter with the specter of domestic Communism contributed crucially to the political stalemate and cultural conservatism of the early postwar years. Scores of Americans, perhaps as many as 300, did indeed provide information to Soviet agents, largely during the era of the Popular Front and World War II, when the politics of a fervent antifascism generated an apparent common bond among liberals, Communists, and the Soviet Union. Some of the people who provided information held high office, including Harry Dexter White, an assistant secretary of the Treasury, and Alger Hiss, a State Department official who participated in the Yalta conference with Roosevelt. But post–World War II anti-Communism was motivated far less by the actions of a few than by a pervasive antiradicalism that now merged with postwar hostility to the New Deal and its partisans.
Anti-Communism was both a partisan strategy exploited by top politicians and a popular grassroots movement. In 1947, fearing that he might be outflanked on “internal security” issues by the Republicans, Truman boosted funding for the FBI, set up a loyalty program for federal employees, and asked the attorney general to draw up a list of subversive organizations. And between 1945 and 1952, congressional committees conducted eighty-four hearings on Communist subversion; the House Committee on Un-American Activities (HUAC) held the most infamous of them, investigating Hollywood, higher education, unions, and the federal government. In the well-publicized HUAC hearings, investigators demanded that witnesses not only affirm their loyalty to the government but also prove it by naming former Communist associates. “Don’t present me with the choice of either being in contempt of this committee and going to jail,” pleaded the Hollywood actor Larry Parks, “or forcing me to really crawl through the mud to be an informer.” Though Parks did name his former left-wing friends and associates, his career, like those of many others, was ruined when film studios and other large employers blacklisted suspect employees. By one estimate, 13.5 million Americans fell within the scope of various federal, state, and private loyalty programs. Roughly one of every five working people had to take an oath or receive a security clearance as a condition of employment.
The most relentless interrogator, Republican senator Joseph McCarthy of Wisconsin, cast himself as the ultimate patriot, directing his inquisitorial skills against “respectable” targets in the State Department, Ivy League universities, and the U.S. Army. McCarthy achieved national stature early in 1950 by exploiting public frustration over the “loss” of China and the consolidation of Soviet power in Eastern Europe. To McCarthy and his followers, these events represented more than just diplomatic setbacks. At one point, McCarthy claimed to have a list of 205 Communists employed in the State Department. In his eyes, Secretary of State Dean Acheson was the “Red Dean . . . Russian as to heart, British as to manner”; the years of Roosevelt and Truman he called “twenty years of treason.” Using his chairmanship of a minor Senate subcommittee to launch wide-ranging and often crudely partisan investigations, McCarthy charged that Communist sympathizers in the highest reaches of government shielded Soviet spies. Although such charges were nonsense, McCarthy’s manipulation of the press and the new medium of television (which broadcast many of the hearings) proved so masterful that he became one of the most feared political figures of the early 1950s.
Liberals, non-Communist radicals, labor activists, and others who questioned the direction of postwar society were often the real target of anti-Communist probes. As the head of one government loyalty board noted, “The fact that a person believes in racial equality doesn’t prove he’s a Communist, but it certainly makes you look twice, doesn’t it?” Many businessmen found Communist subversion to be a convenient explanation for labor conflict. “Whoever stirs up needless strife in American trade unions advances the cause of Communism,” asserted the Nation’s Business late in 1946. Such employers worked closely with congressional investigators and state officials, who were happy to “Red-bait” union officials when a strike or NLRB certification election was imminent. Nongovernmental groups such as the American Legion, with 17,000 posts, also played a powerful role in the anti-Communist movement. Feature articles in the Legion’s magazine asked such questions as “Does Your Movie Money Go to Commies?” and “Do Colleges Have to Hire Red Professors?”
Not surprisingly, anti-Communism polarized the labor movement. In the 1930s and 1940s, Communists and other radicals played an indispensable role in building the new unions; now anti-Communists accused them of “infiltrating” these organizations. Ironically, the Communists had been among the most zealous patriots of World War II. But after the war, this resurgent nationalism turned against the Communists. Millions of workers were still first- or second-generation immigrants, whose sense of “Americanism” had only recently been affirmed by the patriotism that surged through their communities during the New Deal and the wartime years. Although many immigrants had once sought to combine socialist politics with patriotic Americanism, the Cold War forced them to choose. To be a radical, let alone a Communist, seemed now to be un-American.
Religious, ethnic, and racial loyalties often determined the ways in which working Americans responded to the anti-Communist impulse. Catholics, especially those of Irish or eastern European extraction, who had themselves been the subject of nativist prejudice in the 1920s, were among the most ardent anti-Communists. The Red Army’s occupation of Eastern Europe had an electrifying impact on millions of Americans of Slavic and Hungarian origin, who constituted perhaps half of the CIO’s membership. When the Soviets arrested Church leaders in Poland, the Catholic Church in America mobilized tens of thousands of adherents to protest the “Satan-inspired Communist crimes.” Priests from working-class parishes played an aggressive part in the effort to oust Communists from the leadership of unions such as the United Electrical Workers.
In contrast, relatively few African Americans joined in the anti-Communist crusade. Their lack of enthusiasm reflected the Communist Party’s long commitment to civil rights and its active recruitment of Black workers as members. But more important, the discrimination that African Americans endured made them skeptical of white efforts to define “Americanism.” Coleman Young, who would later become the first African American mayor of Detroit, denounced a congressional investigating committee that questioned his loyalty:
I fought in the last war and . . . I am now in process of fighting against what I consider to be attacks and discrimination against my people. I am fighting against un-American activities such as lynchings and denial of the vote. I am dedicated to that fight, and I don’t have to apologize or explain it to anybody.
McCarthyism divided American Jews more sharply than any of the nation’s other ethnic groups. By the late 1940s, anti-Semitism had begun to wane in the United States, prospects for assimilation seemed good, and second- and third-generation Jewish Americans were entering the middle class more rapidly than any other ethnic group. Yet American Jews, who had been solid Roosevelt partisans, were often victims of the anti-Communist impulse. In New York City, 90 percent of all teachers who were fired by the Board of Education were Jewish; in Detroit and Flint, Michigan, Communists who were “run out” of the auto plants were often taunted with anti-Semitic epithets.
A notorious espionage case accentuated Jewish fear of renewed anti-Semitism. Arrested in 1950, Julius and Ethel Rosenberg, both active Communists and the children of Jewish immigrants, were tried and convicted of conspiring to deliver atomic secrets to the Soviet Union. In June 1953, they were executed. Julius clearly had given the Soviets information provided to him by his brother-in-law, a machinist at the Los Alamos weapons laboratory, but little evidence implicated Ethel. They were the first American civilians ever to be executed for espionage in time of peace, and their deaths frightened Jewish progressives, causing many to abandon their longtime participation in radical causes.
Eisenhower’s “Modern Republicanism”
McCarthyism lost some of its hold on American politics once the Republicans regained control of the White House and both houses of Congress in the 1952 elections. For Republicans like the new president, Dwight D. Eisenhower, Joseph McCarthy’s strident brand of anti-Communism paid few political dividends; meanwhile, the death of Joseph Stalin in March 1953, followed that summer by an armistice in the Korean War, eased Cold War tensions and national anxieties. McCarthy still made headlines, but he became an object of growing criticism. In December 1954, the U.S. Senate voted to censure him. Thereafter he quickly lost influence, though the political and ideological constraints imposed by the anti-Communist impulse did not fully ebb until well into the 1960s.
Eisenhower called himself a “modern Republican”; he ran for president in 1952 as an internationalist who was determined to keep the right wing of his party in check. Born in 1890 and raised in Abilene, Kansas, Eisenhower had made the Army his career, rising through the ranks as the consummate planner and military diplomat. During World War II, he used these skills to organize the Normandy landings on D-Day and to coordinate the British-American push into Germany.
Known to the public as “Ike,” Eisenhower and his vice presidential candidate, Richard Nixon, won 55 percent of the popular vote. An admirer of businessmen, Eisenhower put eight millionaires in his first cabinet. He pushed for private development of offshore oil and hydroelectric power, which the old New Dealers wanted to keep in federal hands. And he favored balanced budgets, even when the two recessions that took place during his administration cut federal income and generated high levels of unemployment. The corporate spirit of his administration became apparent early in 1953 when a Democratic congressman asked Eisenhower’s nominee for secretary of defense, Charles E. Wilson, the former head of General Motors, whether he foresaw a conflict between his new governmental responsibilities and his old employer. Wilson confidently, and controversially, responded, “What is good for our country is good for General Motors, and vice versa.”
Eisenhower was hardly a reactionary. Although most of the leading figures in his administration had fought New Deal labor and social welfare reforms, once in power they did little to undermine them. Indeed, the Eisenhower administration locked the New Deal in place, creating a new cabinet office (Health, Education, and Welfare), raising the minimum wage, and broadening Social Security coverage. After the Soviet Union launched the first earth-orbiting satellite, Sputnik, in 1957, Eisenhower endorsed the liberal view that federal funds should be used to improve American education in science, technology, and languages. Eisenhower had no enthusiasm for school integration, but neither did he pander to white racism in the South. Indeed, African Americans gave him more of their votes in 1956 than any Republican presidential candidate had received in twenty years.
On international policy, Eisenhower and his secretary of state, John Foster Dulles, were Cold Warriors who believed the Communist bloc an implacable foe of Western civilization. But as fiscal conservatives who feared high taxes and an intrusive “garrison” state, they also sought to limit the size of the military. Eisenhower and Dulles therefore relied on relatively inexpensive nuclear weapons and a worldwide aerospace delivery system. Dulles, a master of provocative speech and imagery, condemned the Truman era’s containment doctrine as “appeasement” and instead declared that the United States would employ “massive retaliation” to protect Free World interests. Critics called Dulles’s policy nuclear “brinkmanship” because it involved repeatedly taking the nation to the verge of war. But nuclear brinkmanship proved virtually useless in the real world. When Vietnamese nationalists came close to defeating the French in May 1954, Eisenhower refused to order a nuclear air strike to stop them. When Hungarians revolted against Communist rule two years later, the United States failed to intervene, fearing a larger confrontation with the Soviets. By the time he left office in 1961, Eisenhower himself criticized what he called the “military-industrial complex” and advocated vigorous U.S. diplomacy as a means of reducing Cold War tensions.
The Affluent Society and Its Discontents
From 1947 until the early 1970s, the United States enjoyed an unprecedented era of sustained economic growth. Even with five short recessions, real wages and the overall production of goods and services doubled, while unemployment and inflation remained low. This newfound material affluence deeply affected Americans on the job and at home, reshaping the way they thought about themselves and their society. In the 1930s, many New Deal theorists assumed that the U.S. economy was permanently crippled. Its strength, they believed, could be restored only by breaking up big corporations and redistributing wealth and income. But the postwar boom made that political agenda irrelevant. It eased class and ethnic tensions and seemed to create a kind of truce between capital and labor that limited the appeal of unionism and fostered a workforce that identified with the middle class. Inequalities of race, gender, and skill remained stark. In the 1950s, segregation by race remained pervasive; employers assigned women, white and Black, to clerical or domestic tasks; and only those with a college degree could obtain many of the better jobs. Despite these problems, one working-class youth likened the new prosperity to “a teenager’s first kiss. Not much, but never to be forgotten.”
The Postwar Economic Boom
Postwar affluence was no accident; it arose out of New Deal politics, the experience of World War II, and America’s new role in the world. From World War II, many Americans learned the great lesson that federal money and political will could vanquish unemployment. Congress passed the Employment Act of 1946, which committed the federal government to promote “maximum employment, production, and purchasing power” and set up a Council of Economic Advisers that was charged with developing “national economic policies.” During the next quarter-century, government fiscal policymakers based their work largely on theories developed by the British economist John Maynard Keynes. Keynes argued that in modern capitalist societies, governments could combat business slumps by using their power to tax and spend in order to regulate consumer demand for goods and services. Liberals wanted to sustain consumer purchasing power through government spending on public works, schools, housing, Social Security, and unemployment insurance. Conservatives, who feared such programs would erode market incentives and open the door to government planning, favored tax reductions for business instead. Throughout the 1950s, liberals and conservatives fought to a standoff on these issues. Though business taxes remained at the high levels established during World War II, government spending on social programs grew slowly.
Two new forces dominated the nation’s economy in this era: a strong union movement and an enormous peacetime military establishment. Though the Taft-Hartley Act (1947) and the failure of Operation Dixie signaled the end of labor’s expansive phase, large unions still negotiated higher wages and pushed for increased government spending. The labor movement enrolled more workers than at any other time in U.S. history, reaching a high point of nearly 35 percent of the labor force in 1953. Sophisticated corporate executives realized that disruptive strikes and contentious wage negotiations—especially if they were part of a broad offensive against corporate power—would embitter relations on the shop floor and hamper the company’s long-range planning. In 1948, therefore, General Motors offered the UAW a contract that included two pillars of the postwar accord: an automatic Cost of Living Adjustment (COLA) clause keyed to the consumer price index and an unconditional 2 percent “annual improvement factor,” designed to give workers a share of GM’s substantial productivity gains. The GM idea soon spread, and by the end of the 1950s, more than 50 percent of all major union contracts included the COLA principle. Nonunion firms such as IBM and DuPont, anxious to keep unions out, copied the pattern, paying top wages, matching union benefits, and establishing employee grievance systems.
The second pillar of the postwar boom was military spending. By 1950, about half of the federal budget, or more than 10 percent of all goods and services consumed in the United States, went to the armed services. Because of the Cold War, most citizens accepted massive government spending for the military. Arms production helped to fuel the growth of key sectors of the economy, such as aircraft manufacturing and electronics. It fostered economic growth and urbanization in the South, where the military built many new bases, and in southern California, Seattle, and Long Island, where the growing aviation industry contributed to a vibrant postwar sprawl. The political consensus on military spending allowed the federal government to fund social programs that otherwise would have been controversial. Educational, medical, housing, and pension benefits for veterans grew rapidly, and in 1956, Congress voted to fund a multiyear, multibillion-dollar Interstate and Defense Highway Program, the largest public works project in the nation’s history.
The nation’s prosperity put real money into the average citizen’s pocket. Between 1941 and 1969, family income almost doubled. Because Americans felt more secure economically, they went ahead with marriages and pregnancies they had postponed during the Depression and World War II. The birth rate leaped by 25 percent after 1945 and remained high throughout the 1950s. “It seems to me,” wrote a British visitor in 1958, “that every other young housewife I see is pregnant.” Americans ate better, lived in more spacious homes, and could afford to see the doctor more often. During the full-employment 1940s, the life expectancy of white Americans rose from sixty-three years to sixty-seven years, while that of African Americans increased from fifty-three to sixty-one.
The whole system seemed a never-ending spiral of growth and abundance, prompting contemporary observers to conclude that American capitalism had found the solution to all economic problems. “The world revolution of our times is ‘Made in the USA,’” wrote the business consultant Peter Drucker in 1949. “The true revolutionary principle is the idea of mass production.”
The Service Economy
In the quarter-century after the war, corporations added staff, governments hired more teachers and policemen, and unions upgraded many jobs that had once been casual or low paid, such as telephone repair, warehousing, and seafaring. For the first time in U.S. history, white-collar workers outnumbered blue-collar workers. Workers who sat behind a desk, stood behind a counter, or presided over a classroom had composed less than one-third of the work force in 1940 but swelled to almost half of all those employed by 1970. In the two decades after 1950, the growing demand for consumer goods spurred the creation of new department stores and supermarkets, staffed by three million additional employees. And in almost every large corporation, thousands of managers and supervisors manipulated paper, people, and products. The sociologist C. Wright Mills captured the new world of work in his influential study White Collar (1951): “What must be grasped is the picture of society as a great salesroom, an enormous file, an incorporated brain, a new universe of management and manipulation.”
The growth of the service industry reinforced the popular, but misleading, notion that the United States had become a classless, “postindustrial” society. Despite the deployment of much highly touted automated machinery, factory work still required an army of manual workers. Although blue-collar employment declined steadily as a proportion of the work force, to 35 percent by 1970, manufacturing workers actually increased in number from twenty-two million to twenty-six million between 1950 and 1970. Moreover, half of all service jobs involved manual labor—trash collection, maintenance, and food preparation, for example. The greatest growth in so-called white-collar employment came in sales and clerical work—jobs that might be considered white-collar, in the sense that a typist did not spot-weld body joints, but that involved little creativity or autonomy. “My job doesn’t have prestige,” bank teller Nancy Rogers noted. “It’s a service job. Whether you’re a waitress, salesperson, anything like that . . . you are there to serve them. They are not there to serve you.”
The postwar shift to service and clerical work would have been impossible without the influx of twenty million women into the workforce. In 1950, 31 percent of all women were employed outside the home; twenty years later, the figure was 42 percent. Unlike in the war years, the postwar job market confined women to a female labor ghetto. Ninety-five percent of them worked in just four job categories: light manufacturing (home appliances and clothing), the retail trade, clerical work, and health and education. And within those categories, men took the high-status work while the routine jobs went to women. In 1960, for example, males held 90 percent of high school principal positions, while 85 percent of elementary school teachers were women.
Such job segregation helped to keep women’s work low-paid and dead-end. In Baltimore, for example, employers kept women who were clerical employees on the job after World War II but forced women who worked in high-wage aircraft assembly to take lower-paying jobs as waitresses or service workers. Women’s average weekly wages in that city fell sharply as a result. The same process of exclusion took place among professionals. Because many graduate schools discouraged women from enrolling and because World War II veterans took so many seats at the nation’s colleges and universities, there were actually fewer women doctors and lawyers in the 1950s than there had been two decades before.
The New Sexual Orthodoxy
The sexual ideology of the early postwar years relegated women to the secondary labor market. All too quickly the wartime self-confidence of “Rosie the Riveter” gave way to a rigid definition of gender roles that was reminiscent of the mid-nineteenth-century doctrine of separate spheres. Experts celebrated women’s submissiveness and domesticity, branding sexual freedom as potentially subversive, even pro-Communist. The popular media portrayed women largely as incompetent and vulnerable, fulfilled only in the context of a stable and secure marriage. By the late 1940s, women’s magazines such as Ladies’ Home Journal and Redbook filled their pages with articles such as “What’s Wrong with American Women?” and “Isn’t a Woman’s Place in the Home?” Author Betty Friedan labeled all this the “feminine mystique” in her 1963 best-selling book of the same name.
The new sexual orthodoxy also applied to American men and equated masculinity with rationality and control over one’s emotions. Men, too, were expected to marry. To attain maturity and respectability required being a “family man,” even in late adolescence. Employers shunned men who were still single in their thirties. Homosexuality was a criminal offense that was thought to sap the moral fiber of both the individual and the nation. Indeed, the anti-Communist movement engendered a wave of homophobia, intensifying the persecution of male and female “perverts.” When the FBI mounted an all-out effort to discover the sexual habits of those who were suspected of subversive political behavior, gay-baiting rivaled Red-baiting in its ferocity, destroying careers, encouraging harassment, and forcing those who “confessed their guilt” to name their lovers and friends.
Despite the heavy emphasis on marriage and the family, or perhaps because of it, Americans indulged an appetite for vicarious sex. The postwar era gave birth to the cosmopolitan, sexually permissive, and hugely successful Playboy magazine in 1953. Filled with advertisements for liquor, stereo equipment, and cars, the magazine demonstrated that lust itself was a consumer commodity that was eminently suitable to the upwardly mobile. Although Playboy had a predominantly male audience, publishers marketed sexuality to women, too. Peyton Place, the steamy story of a town in rural New England, became the best-selling novel of the century. Published in 1956, it sold ten million copies, largely to women.
These rigid gender roles made the lives of midcentury working women particularly difficult. Most American men saw cooking, cleaning, and changing the baby’s diapers as “women’s work.” Consequently, women who did work outside the home carried the burden of housework and child rearing as well. And though American families acquired dishwashers, vacuum cleaners, and other labor-saving appliances, housework still demanded as much of the average woman’s time as it had thirty years earlier. According to one survey taken during the 1950s, half of all working women said that they had no leisure time at all.
A CLOSER LOOK: The Lavender Scare—Creating the Closet
Race and Ethnicity at Work
Millions of other workers—perhaps as many as 40 percent—found work outside the unionized industrial core as poorly paid farm laborers, cabdrivers, cannery workers, and dime store clerks with little job security. Such jobs are often thought of as belonging to a distinct “secondary” labor market—segregated from the more secure work in the core economy, but still essential to the functioning of the system. Although casual employment has long been part of the working-class experience in America, in the postwar era this secondary labor market, once comprising mostly manual labor, evolved to include clerical and service employment. Workers in those fields tended to come from marginal groups: racial minorities, teenagers, and women.
The flight from farming expanded the ranks of casual laborers. Mining and agriculture together supported one in four American families before the war, but consolidation and mechanization eliminated fifteen million of these jobs in one generation. The nation’s agricultural population fell from one in five to one in twenty. In Midwestern states such as Iowa and Ohio, the factories and offices of the region’s cities readily absorbed rural migrants, mainly white people with some education. But in the rural South, Puerto Rico, and the Mexican borderlands, the process of depopulation was far more traumatic.
In the South, mechanization of the plowing, weeding, and picking of cotton displaced more than four million farmers and farm laborers. Mechanization hit African Americans the hardest; by 1960, fewer than 10 percent worked on the land. Between 1940 and 1970, more than five million African Americans moved from the South to the North, most of them to the largest cities. At one point in the 1950s, the Black population of Chicago swelled by more than 2,200 new arrivals each week. The South Side of America’s second city now rivaled Harlem as the cultural capital of Black America.
The same economic transformation that drove African Americans north also drove Puerto Ricans from their island farms and into the mainland cities. During the 1940s, a U.S. government program, “Operation Bootstrap,” encouraged the mechanization of the island’s sugarcane plantations and the growth of low-wage, tax-free industries in cities such as San Juan. Rural employment plunged, and the island’s urban population tripled. Yet Puerto Rico’s unemployment rate remained among the highest in the Caribbean.
These conditions prompted a large migration to the mainland. Since the 1920s, a small but steady stream of Puerto Ricans had moved to New York and other eastern cities. In the postwar years, cheap airfares, family connections, and the hope of steady employment lured 40 percent of all islanders to make the move. By the end of the 1960s, New York City had a larger Puerto Rican population than San Juan did. El Barrio in East Harlem became the center of Puerto Rican life in New York, the home of salsa music, scores of social clubs, and hundreds of small Puerto Rican grocery stores, or bodegas, which served as the center of social and economic life for many immigrants.
Mexican immigrants fleeing the poverty of their homeland also poured into American cities. Though many crossed the border illegally, some 4.5 million Mexicans came to the Southwest between 1942 and 1964 through a government-sponsored bracero work program, although such workers were still often cheated out of wages and savings. Some of the new immigrants found employment on the vast factory farms that dominated California and Arizona agriculture, but many went to the growing cities of California and the Southwest. From 1950 to 1960, the Chicano (Mexican American) population in Los Angeles County more than doubled, growing from 300,000 to more than 600,000. By 1968, the Mexican American community in East Los Angeles approached one million. In that year, 85 percent of the nation’s Latinx population lived in urban areas.
This economic revolution also marked a watershed in the history of American cities. The arrival of millions of Mexican Americans, African Americans, and Puerto Ricans reshaped the character of the nation’s urban centers. In some ways, the lives of these postwar migrants and immigrants resembled those of earlier immigrants, such as the Irish in the 1850s or the Jews, Italians, and Eastern Europeans of the early twentieth century. They, too, had to make the change from rural to urban life, figure out the city’s ways, and in some cases learn a new language. Much like earlier immigrants, Mexican Americans, Puerto Ricans, and African Americans drew on their traditional cultures while adapting to the new world of the city.
In other ways, however, the postwar arrivals confronted a very different economic landscape. Although the U.S. economy boomed, it no longer generated the urban construction and manufacturing jobs that had provided employment for so many early-twentieth-century immigrants. In Chicago and Detroit, a decline in meatpacking, auto parts manufacturing, and steelmaking permanently eliminated hundreds of thousands of high-wage, unionized jobs, forcing many of the new migrants and immigrants into insecure, low-wage positions at the bottom of the job ladder.
Puerto Ricans, Mexican Americans, and African Americans faced discrimination not only in hiring, but also in housing, schooling, and social services. When they tried to move into traditionally white neighborhoods, they faced discrimination by realtors and landlords and, in some instances, violence at the hands of angry white homeowners. Urban police forces, virtually all white in the 1950s, clashed repeatedly with the new urban migrants. The trouble was a product not only of white racism, but also of the aggressive professionalism with which urban police departments now did their job. Police in Oakland and Los Angeles were then among the most innovative and corruption-free in the nation, but they were also among the most brutal and insensitive when it came to arrests and interrogations.
The new arrivals could not easily achieve political remedies for such problems. Because many Mexican immigrants were not U.S. citizens and could not vote, Mexican American communities had few political representatives. And though Puerto Ricans had been granted full U.S. citizenship in 1917, most found mainland literacy tests an obstacle to voter registration. Urbanized African Americans outside of the South could and did vote in the 1950s, which gave them some political leverage. But urban political machines, such as the one headed by Richard Daley in Chicago, managed to contain and channel African American demands. Throughout the 1950s, then, these urban minorities lived in a segregated and culturally isolated world, even in the most cosmopolitan of cities. Symbolic changes, such as the integration of major league baseball after Jackie Robinson joined the Brooklyn Dodgers in 1947, did not change the realities of everyday life. Although many African Americans organized protests against job discrimination and police brutality, such campaigns did not generate mass involvement or much media attention until the dramatic boycotts and sit-ins in the South captured the national spotlight (Chapter 12). For the most part, these minority communities in the North remained inwardly focused, sustained by their own institutions and cultural traditions.
Unity and Division Within the Working Class
Although postwar trade unions helped to create the economic conditions that greatly reduced divisions within the industrial working class, they had increasingly limited capacity to transform the work lives of racial minorities and those who labored in the service economy. Since the 1930s, unions had greatly reduced wage differences between skilled and unskilled blue-collar workers; by 1958, tool and die makers in an auto plant made only 20 percent more than unskilled assembly-line workers. Through grievance and seniority systems, organized labor had also reduced the influence of personal or ethnic favoritism in the workplace. In many steel mills, Catholic workers of Eastern European extraction, who had labored for three generations at heavy, sweaty jobs, finally got a chance to do skilled work. And with a definite set of rules to govern the authority of supervisors to assign work, the old saying “It’s not what you know, but who you know” became obsolete in many workplaces.
During the 1950s and 1960s, workers wanted both higher pay and the workplace rights and dignity that they believed their unions had been established to defend. They rejected the idea, favored by many managers and some union officials, that one could be traded off for the other. One-third of all strikes in the 1950s were wildcat stoppages that arose when workers balked at speedups or contested management efforts to erode what they saw as their hard-earned work rights. In 1959, rank-and-file steelworkers pressured union officials into calling a 119-day strike that forestalled the elimination of work rules and safety standards that labor had won in the 1930s and 1940s.
Such massive stoppages did not have the political impact of the strikes that had made headlines in 1937 or 1946, however. After 1950, most companywide strikes were aimed at adjusting wage and benefit packages, not at changing the distribution of power in the workplace. And wildcat strikes had little long-range effect, even when they were temporarily successful. “We’re moving rapidly away from the crusading spirit of the thirties,” admitted a shop steward in an aircraft union at the end of the decade. “In 1953 we had one of the most militant unions in the labor movement. We had wildcat strikes, direct job action. . . . Today there is much less of this. People no longer file grievances because they think it is no use.”
The merger of the AFL and the CIO in 1955 ratified these changes in the union movement. With the CIO’s expulsion of Communist-dominated unions, few substantial differences remained between the two federations. The AFL, dominated by the construction trades and other “business” unionists, was almost twice the size of the CIO. It was therefore fitting that the AFL-CIO chose as its new leader not the CIO’s visionary president, Walter Reuther, but George Meany, a Bronx plumber who had risen to leadership in the AFL during the 1930s and 1940s. Meany, who would later boast that he had never led a strike, won high wages for union members by adapting the labor movement to the contours of American capitalism. In the years after the 1955 merger, the amount of energy and money unions devoted to organizing new workers declined, as did the unions’ relative independence from the Democratic Party. “We do not seek to recast American society in any particular doctrinaire or ideological image,” Meany asserted. “We seek an ever rising standard of living.”
Because both labor law and management practice encouraged an insular, depoliticized form of collective bargaining, some trade union leaders became little better than corrupt businessmen who undercut the price of the labor they “sold” in exchange for kickbacks and payoffs from employers. Union corruption, exposed in a series of congressional hearings presided over by Senator John McClellan in 1957 and 1958, proved especially prevalent in decentralized, highly competitive industries, such as trucking, the restaurant business, and dock work, in which autocratic union leaders could cut “sweetheart” deals with employers and ignore rank-and-file sentiment. Gangsters actually ran some locals; nepotistic families presided in many others. The moral standing of the union movement plummeted—one reason why the proportion of American workers who were AFL-CIO members declined from about 33 percent of the workforce in 1955 to little more than 20 percent two decades later.
The stagnation of the union movement also fostered increasingly sharp divisions within the working class. Race and gender prejudice had always separated American workers, but the inability of the unions to organize white women, African Americans, and others in the growing secondary labor force hardened these divisions. Meanwhile, union policy on two key issues of the period, automation and employee fringe benefits, further divided workers. In the 1930s, unions sought to spread the burden of unemployment by reducing the length of the workweek, even if that meant smaller paychecks for everyone. In the 1950s, with unemployment far less of a problem, workers faced rapid technological change—then called automation—that eliminated many of the best blue-collar jobs. But the unions had no effective response. On unionized West Coast docks, the “containerization” of most cargo generated a two-tier workforce: a small group of well-paid, steadily employed machine operators and a large group of casual workers who did the dangerous work of lifting and hauling.
An even more pervasive division within the working class emerged when unions focused their energy on bargaining over health and pension plans, which came to constitute a semiprivate welfare state for union members. In the late 1940s, the labor-liberal effort to expand Social Security and inaugurate national health insurance had stalled, so unionists turned to the bargaining table to secure pensions and medical care, so-called fringe benefits, for their members. By the end of the 1960s, almost all unionized workers had some sort of employer-paid health insurance, and two-thirds were covered by pensions. But the hefty benefit packages that unionized workers won made their lives and expectations very different from those of insecure workers who labored in poorly paid service and clerical occupations. The relatively egalitarian wage pattern of the mid-1940s eroded, and soon high-wage workers came to resent the taxes they paid for state-supported welfare. Thus, the weakness of the postwar welfare state and the resulting creation of a privatized substitute helped to split the American working class into two segments, one relatively secure and the other—predominantly young, minority, and female—left out in the cold.
The Ethos of a Classless Society
During the 1950s, many Americans thought that the nation’s “labor problem” had been solved. Although the distribution of income did not change in the postwar years—the top tenth of the population consistently took home almost 40 percent of the national income—the economic pie was getting bigger, so there seemed no need to redivide it. “The union,” wrote the editors of Fortune, “has made the worker, to an amazing degree, a middle class member of a middle class society.” Many Americans, especially those who enjoyed a measure of economic security, had always wanted to believe that the United States was a nation in which social class was unimportant, wealth was widely shared, and social conflict was muted. In the postwar era, however, this vision became pervasive, not only among conservatives, but also on college campuses and in union halls, newsrooms, and television studios. One fantasy of a classless society—achieving instantaneous fame and fortune—gained widespread popularity on television quiz shows in the 1950s. As even Philip Murray, the president of the CIO, asserted to a union audience, “We have no classes in this country.”
At the same time that many people denied the existence of class differences, ethnic and religious divisions were growing less important to white Americans. Almost two generations had passed since the end of mass European immigration. With only half as many foreign-born residents in the United States as there had been during the Depression, mass institutions such as the military, the public schools, and the big corporations downgraded ethnicity as a social marker. Although churches and synagogues benefited from increased attendance in the 1950s, many worshipers came to see their faith as part of a homogenized “civic religion” that validated the “American way of life.” Perhaps President Eisenhower put it best when he affirmed, “Our government makes no sense unless it is founded in a deeply religious faith—and I don’t care what it is.”
The growth of the comprehensive high school had the same homogenizing effect. After the war, secondary school enrollment rose to 80 percent of its potential constituency. Virtually all white Americans spent three or four formative years in a public institution that offered classless homogeneity as part of its official ideology. Elaborate sports contests and the emergence of a distinct teen culture in the 1950s soon eclipsed the ethnic antagonisms that had bitterly divided white youths since the late nineteenth century. The postwar draft, which lasted from 1948 until 1973, further diluted ethnic, religious, and regional parochialism; for millions of working-class young men, military service provided a rite of passage from adolescence to adulthood.
The explosive growth of college and university enrollments also contributed to the process. The enactment of the G.I. Bill (officially known as the Servicemen’s Readjustment Act) in 1944 had democratized higher education by making it broadly available to those who had served in the armed forces. World War II veterans took advantage of generous government payments. “Everybody went to college,” remembered a Sicilian-born architect who had spent his childhood in the Bronx. “Suddenly we looked up, we owned property. Italians could buy. The G.I. Bill, the American dream. Guys my age had really become Americanized.”
Still, class and racial divisions clearly structured upward mobility. The “tracking” of high school students into academic or vocational courses usually replicated social class divisions in the local community, and many of the white working-class youths who found higher education suddenly within reach enrolled in community colleges and technical schools rather than in the prestigious liberal arts colleges. Until the late 1960s, African Americans and Mexican Americans found managerial and professional jobs largely closed to them.
Suburban America
The New Deal, the new unions, and the new sense of citizenship held by ethnic Americans laid the basis for the proliferation of suburban housing tracts that spread outward from the urban fringe during the early postwar decades. New Deal planners believed in cheap credit, which they extended to farmers, hospitals, homeowners, and veterans. After the war, the government continued to guarantee low-interest loans through the Veterans Administration and other government agencies. For the first time in their lives, huge numbers of working people could afford better housing. And their demand was desperate. The Depression and war virtually halted residential construction; now millions of veterans and workers needed homes for their growing families. The demand was so great that in 1945, the city of Chicago had put 250 old streetcars up for sale as potential homes.
Before World War II, the suburbs had been reserved largely for the well-to-do. Working-class Americans lived near their work, often in apartments or cramped row houses in ethnic neighborhoods. Except in the Midwest, most workers rented, because purchasing a house required a down payment of 50 percent on a ten- or fifteen-year mortgage. In the postwar era, inexpensive single-family suburban homes, best symbolized by the three huge Levittowns that sprouted in potato fields outside New York City and Philadelphia, seemed to reverse this trend. William Levitt’s wartime experience in constructing family quarters on a navy base convinced him that if financing were available, contractors could make millions of dollars housing veterans and their families. With other builders, he prodded officials of the Veterans Administration and the Federal Housing Administration (FHA) to guarantee low-interest loans that would make suburban homes cheaper than rental apartments. Assured of a mass market, Levitt used assembly-line methods to erect thousands of identical homes, complete with white picket fence, green lawn, and a well-equipped kitchen. Buyers snapped up 1,400 houses in the first three hours after sales began in March 1949.
By 1960, three out of every five families owned their own dwelling. Some thought the new suburbs would transform feisty urban white ethnics into conservative homeowners concerned chiefly with keeping the crab grass at bay. “No man who owns his own house and lot can be a Communist,” Levitt asserted. “He has too much to do.” The sociologist David Riesman compared life in suburbia to “a fraternity house at a small college in which like-mindedness reverberates upon itself.”
But suburbanization itself did little to ameliorate social class divisions. Although workers might own homes that were nearly identical to those of their middle-class neighbors, they were unlikely to vote Republican, repudiate their union, or join the Rotary Club. Urban autoworkers who followed a relocated Ford plant to exurban Milpitas, California, liked the spaciousness of their new tract homes, but most did not believe that they had left the working class. More than mobility, blue-collar families valued security. In a suburban community, one sociologist reported, “the people of working-class culture stay close to home and make the house a haven against a hostile, outside world.”
Many women had mixed feelings about their new suburban surroundings. Suburban housewives loved the spaciousness and convenience of their new homes, the safety of the neighborhood, and the access to good public schools. But at a time when millions of married women were entering the labor market, most suburban housing developments were designed for families in which Mom stayed home, Dad worked in the city, and relatives remained at a distance. The early housing tracts contained few of the social institutions—the corner grocery store, the nearby grandparent, the convenient streetcar—on which women had long relied to ease their burden of shopping, housework, and child rearing. By making work outside the home more difficult for women and cutting them off from traditional support networks, the insular suburban neighborhood enforced conformity to postwar gender roles.
As the suburbs grew, government housing policies actually deepened racial and class divisions across metropolitan America. The FHA, which financed about 30 percent of all new homes in the 1950s, advised developers to concentrate on a particular housing market based on age, income, and race. To ensure neighborhood homogeneity and preserve property values, the agency endorsed “restrictive covenants” that barred Jews and African Americans from buying homes. (William Levitt permitted neither Black families nor single women to sign a mortgage.) And because federal housing agencies followed private lenders in “redlining”—refusing to write mortgage loans—in the inner city, housing stock there deteriorated in the 1950s. Such neglect quickly turned many neighborhoods into slums. In cities such as Chicago, Philadelphia, and Buffalo, white families typically moved to the suburbs a few years after the appearance of the first African Americans on their block. Northern housing soon became more rigidly segregated than it had been at any time since the Civil War.
The nation’s public housing failure exacerbated urban apartheid. Because of resistance from realtors, mortgage bankers, and home builders, the government funded only 320,000 units of public housing in the decade after Congress passed President Truman’s housing act in 1949. To minimize land costs, developers built most of the new “projects” in massive blocks. Unlike single-family suburban homes, which enjoyed tax subsidies, public housing was thought of as welfare, so local governments usually imposed income restrictions on project residents. Families with rising incomes had to leave, ensuring the economic segregation of those who remained. In the end, no one liked American-style public housing—not the taxpayers, not the housing industry, not the politicians, not even the people who lived there.
Massive expressways added insult to injury as they slashed through urban neighborhoods in the late 1950s. The new superhighways, which replaced more accessible trolleys and interurban trams, often disrupted stable working-class communities. When residents protested, policymakers told them, “You can’t stop progress.” Like low mortgage rates for single-family homes, the government-sponsored freeway boom represented a massive subsidy for suburban commuters and a tax, both fiscal and social, on city dwellers.
A CLOSER LOOK: Redlining in Richmond
The World of “Father Knows Best”
Television reinforced the family-oriented privatization of American society in the postwar era. By 1960, TV was a fixture in 90 percent of all homes, and TV programming mirrored the nation’s social and cultural landscape, if often in an exaggerated form. In the early days of television, radio-inspired situation comedies offered TV viewers a sympathetic glimpse of urban working-class families enmeshed in a world of tenements, street-corner stickball, and manual labor. During the 1950s, shows such as The Honeymooners, starring Jackie Gleason as a New York City bus driver, offered a comic but sometimes realistic portrait of working-class dreams and aspirations. But the television networks soon replaced urban farces of this sort with situation comedies and westerns that bleached ethnicity, class, and social commentary out of their story lines. Father Knows Best, introduced in 1954, exemplified the new world of suburban respectability: the Andersons lived in a large house on Maple Street in “Springfield,” U.S.A. Jim Anderson, the father (played by Robert Young), had no politics, held few strong opinions, and never had a bad day at the office. All the action in Father Knows Best took place at home, where the middle-class father exercised a benevolent despotism over three not particularly rebellious children. Jane Wyatt, who played the mother, maintained perfect order in her house and kept her opinions to herself.
Although small-town domesticity permeated the dominant cultural ideal, some unexpected cultural works challenged the placid world of Father Knows Best. In the mid-1950s, the “beat” writers, led by Allen Ginsberg and Jack Kerouac, denounced what they saw as the materialism, sexual repression, and spiritual emptiness of middle-class American life. “These have been years of conformity and depression,” wrote Norman Mailer in 1957, voicing a critique that was common among intellectuals who were sympathetic to the beats. “A stench of fear has come out of every pore of American life, and we suffer from a collective failure of nerve.” Though the beats attracted only a small following, they stirred widespread controversy and comment. Newspaper and magazine reporters sneered at the “beatnik” style of dress and speech, hinting darkly about “racial mixing” and sexual immorality at their parties.
Rock-and-roll, which exploded onto the American cultural scene in the mid-1950s, originated in the music that dominated Black working-class communities during and after World War II. Many of the leading artists were newly urbanized migrants from the rural South; among the most important was the blues singer Muddy Waters, who moved from rural Mississippi to Chicago in 1941. Waters and his band brought the pulse and energy of the electric guitar to the traditional country blues form. By the mid-1950s, he and other Black singers, such as Chuck Berry and Ray Charles, had created and refined a new musical genre, rhythm-and-blues. Charles’s gospel-inspired piano playing in “I Got a Woman” launched the rock-and-roll revolution in 1954. Elvis Presley, a white Memphis teen, made rock-and-roll a national craze when he combined the energy and beat of African American rhythm-and-blues with the lyrics and sentiments of southern white “country music.” Flaunting his sexuality and his working-class demeanor, Presley scandalized an older generation of viewers when he appeared on the normally staid The Ed Sullivan Show in 1956. But he immediately became a teenage idol, and his songs became rock-and-roll anthems.
While some parents and conservative social critics denounced rock-and-roll as an evil influence on the young, Hollywood quickly discovered that money could be made from the rebellious youth culture. Films such as The Wild One, Rebel Without a Cause, and The Blackboard Jungle offered sympathetic portraits of teenage “delinquents” trapped in a crass adult society that neither cared about nor understood them. Marlon Brando’s and James Dean’s brilliant acting performances communicated the personal alienation of a generation. Partly as a result of the popularization of such attitudes, millions of Americans would, within a decade, come to see the rejection of bourgeois values not as a semicriminal impulse but as a “counterculture”—an alternative way of looking at, and perhaps changing, an unhappy world.
Conclusion: New Challenges for the Postwar Order
During the great boom that began after World War II, the real income of working-class Americans, blue-collar and white-collar, began the advance that would double the standard of living of the majority of families within a single generation. Cold War arms spending helped to boost the economy, but the New Deal legacy—strong unions and a state-regulated business system—generated the stability and security that enabled postwar capitalism to flourish. Americans bought cars, houses, vacations, and TV sets, but they also began a quest for a more expansive and plastic sense of life. Some urban youths chafed at the ideological and social constraints that a claustrophobic Cold War culture imposed, and a few middle-class women reassessed the virtues of suburban family life. But the largest measure of frustration, as well as hope, arose among those whose skin color had denied them full access to the opportunities that the New Deal and postwar prosperity had opened for so many white Americans. Even before the 1960s began, Black Americans would insist that American citizenship, both legal and social, was their birthright as well.
Supplementary Materials
Timeline
1944
The Bretton Woods Conference makes the dollar the basis for international financial transactions.
1945
Harry S Truman becomes president following the death of Franklin Roosevelt.
1946
George F. Kennan lays out “containment” doctrine.
1947
The Taft-Hartley Act, which undercuts unions, passes over President Truman’s veto.
1948
The Soviets blockade Berlin; the United States responds with an airlift.
1949
The CIO expels nine unions for refusing to purge themselves of Communist leaders and to support government policies.
1950
Communist North Korea invades South Korea, and U.N.-backed American troops enter the conflict; a truce leaving the Communists in control of North Korea is reached in 1953, after a three-year stalemate that costs some 34,000 American battle deaths.
1952
Republican Dwight D. Eisenhower defeats Democrat Adlai Stevenson for the presidency.
1953
The CIA supports a military coup in Iran, restoring the Shah to power.
1954
The Senate votes to censure Joseph McCarthy, who had led the anti-Communist crusade.
1955
The AFL and the CIO merge.
1956
Congress funds the multibillion-dollar Interstate and Defense Highway Program, the largest public works project in U.S. history.
1957
The Soviet Union launches the Sputnik satellite into orbit.
1958
The United States launches its first space satellite.
1959
More than half a million steelworkers strike for 119 days to successfully defend union work rules and safety standards.
Additional Readings
For more on the Cold War in a global context, see:
Norman Friedman, The Fifty-Year War: Conflict and Strategy in the Cold War (2000); John Lewis Gaddis, We Now Know: Rethinking Cold War History (1997); Victoria de Grazia, Irresistible Empire: America's Advance Through Twentieth-Century Europe (2005); Jon Halliday and Bruce Cumings, Korea: The Unknown War (1988); Akira Iriye, Cultural Internationalism and World Order (1997); Michael Kort, The Columbia Guide to the Cold War (1998); Melvyn P. Leffler, A Preponderance of Power: National Security, the Truman Administration, and the Cold War (1992); Thomas J. McCormick, America’s Half-Century: United States Foreign Policy in the Cold War (1989); Ronald McGlothen, Controlling the Waves: Dean Acheson and U.S. Foreign Policy in Asia (1993); Brenda Gayle Plummer, Rising Wind: Black Americans and U.S. Foreign Affairs, 1935–1960 (1996); and Odd Arne Westad, The Global Cold War: Third World Interventions and the Making of Our Times (2007).
For more on the decline of the New Deal, the economic boom, and the shifts in the labor movement, see:
Howard Brick, Transcending Capitalism: Visions of a New Society in Modern American Thought (2006); Ely Chinoy, Automobile Workers and the American Dream (1992); Robert M. Collins, More: The Politics of Economic Growth in Postwar America (2000); John M. Findlay, Magic Lands: Western Cityscapes and American Culture After 1940 (1992); Joshua Freeman, Working-Class New York: Life and Labor Since World War II (2000); Barbara S. Griffith, The Crisis of American Labor: Operation Dixie and the Defeat of the CIO (1988); Martin Halpern, UAW Politics in the Cold War Era (1988); James J. Lorence, Palomino: Clinton Jencks and Mexican-American Unionism in the American Southwest (2013); Marc Levinson, The Box: How the Shipping Container Made the World Smaller and the World Economy Bigger (2006); Nelson Lichtenstein, Walter Reuther: The Most Dangerous Man in Detroit (1997); Richard H. Pells, The Liberal Mind in a Conservative Age: American Intellectuals in the 1940s and 1950s (1994); and David Stebenne, Arthur J. Goldberg: New Deal Liberal (1996).
For more on presidential politics in the late 1940s and 1950s, see:
Stephen E. Ambrose, Eisenhower (1987); Alonzo L. Hamby, Man of the People: A Life of Harry S Truman (1998); Michael J. Hogan, A Cross of Iron: Harry S Truman and the Origins of the National Security State, 1945–1954 (1998); David K. Johnson, The Lavender Scare: The Cold War Persecution of Gays and Lesbians in the Federal Government (2004); Zachary Karabell, The Last Campaign: How Harry Truman Won the 1948 Election (2000); Michael J. Lacey, ed., The Truman Presidency (1989); and Arnold Offner, Another Such Victory: President Truman and the Cold War (2002).
For more on the domestic anti-Communist campaigns, see:
Peter Biskind, Seeing Is Believing: How Hollywood Taught Us to Stop Worrying and Love the Fifties (1983); Paul Buhle and Dave Wagner, Hide in Plain Sight: The Hollywood Blacklistees in Film and Television, 1950–2002 (2003); David K. Johnson, The Lavender Scare: The Cold War Persecution of Gays and Lesbians in the Federal Government (2004); Ronald Radosh and Joyce Milton, The Rosenberg File: A Search for the Truth (1997); Ellen Schrecker, Many Are the Crimes: McCarthyism in America (1998); and Allen Weinstein, Perjury: The Hiss-Chambers Case (1977).
For more on women and gender politics during the 1950s, see:
Stephanie Coontz, The Way We Never Were: American Families and the Nostalgia Trap (1992); K. A. Cuordileone, Manhood and American Political Culture in the Cold War (2005); Susan J. Douglas, Where the Girls Are: Growing Up Female with the Mass Media (1994); Barbara Ehrenreich, The Hearts of Men: American Dreams and the Flight from Commitment (1983); Daniel Horowitz, Betty Friedan and the Making of the Feminine Mystique: The American Left, the Cold War, and Modern Feminism (1998); Eugenia Kaledin, Mothers and More: American Women in the 1950s (1984); Helen Laville, Cold War Women: The International Activities of American Women’s Organizations (2002); Elaine Tyler May, Homeward Bound: American Families in the Cold War Era (1999); Joanne Meyerowitz, ed., Not June Cleaver: Women and Gender in Postwar America, 1945–1960 (1994); and Kate Weigand, Red Feminism: American Communism and the Making of Women’s Liberation (2000).
For more on race, ethnicity, and division within the working class, see:
Michelle Brattain, The Politics of Whiteness: Race, Workers, and Culture in the Modern South (2001); Cindy I-Fen Cheng, Citizens of Asian America: Democracy and Race During the Cold War (2013); Elizabeth A. Fones-Wolf, Selling Free Enterprise: The Business Assault on Labor and Liberalism, 1945–1960 (1994); Jack Metzgar, Striking Steel: Solidarity Remembered (2000); Stephen Grant Meyer, As Long as They Don’t Move Next Door: Segregation and Racial Conflict in American Neighborhoods (2000); Thomas J. Sugrue, The Origins of the Urban Crisis: Race and Inequality in Postwar Detroit (1998); and Heather Ann Thompson, Whose Detroit?: Politics, Labor, and Race in a Modern American City (2001).
For more on the rise of suburban America and popular culture during the 1950s, see:
Glenn C. Altschuler, All Shook Up: How Rock 'n' Roll Changed America (2003); Rosalyn Baxandall and Elizabeth Ewen, Picture Windows: How the Suburbs Happened (2000); Herbert J. Gans, The Levittowners: Ways of Life and Politics in a New Suburban Community (1982); Kenneth T. Jackson, Crabgrass Frontier: The Suburbanization of the United States (1985); Tom Engelhardt, The End of Victory Culture: Cold War America and the Disillusioning of a Generation (1998); James B. Gilbert, A Cycle of Outrage: America’s Reaction to the Juvenile Delinquent in the 1950s (1988); Peter Guralnick, Last Train to Memphis: The Rise of Elvis Presley (1994); David Halberstam, The Fifties (1993); Dolores Hayden, Building Suburbia: Green Fields and Urban Growth, 1820–2000 (2003); Lisa McGirr, Suburban Warriors: The Origins of the New American Right (2001); Rickie Solinger, Wake Up Little Susie: Single Pregnancy and Race Before Roe v. Wade (1992); and Ed Ward, Geoffrey Stokes, and Ken Tucker, Rock of Ages: The Rolling Stone History of Rock & Roll (1986).