
Volume 2, Chapter 14

The American People in an Age of Global Capitalism, 1989-2001

For twenty-eight years, the Berlin Wall symbolized the Cold War division of Europe and the power of political ideology to shape the economic and social lives of millions of people in the alliance systems that enfolded most of the globe. Then, early on the evening of November 9, 1989, a young East German couple—we do not know their names—walked to the Invalidenstrasse gate to find out whether the political upheavals in their homeland had opened the barrier to ordinary Berliners. To their amazement, the once fearsome Volkspolizei, who were now the demoralized agents of a rapidly crumbling system, let them pass to the bright lights of the West. Within hours, men and women from both sides of the Wall were attacking the edifice with hammers, picks, and any other instruments they could find. Communism was in collapse, and the Cold War would soon be history.

The startlingly abrupt end of European Communism revolutionized international politics. For the first time in nearly half a century, the United States faced no superpower rival. Indeed, for the first time since the end of World War I, capitalism was once again truly a world system, unchallenged on any continent, including Asia, where even the hard-line Communist rulers of China and Vietnam now welcomed foreign investment and encouraged a new class of entrepreneurs.

An increasingly unfettered system of global trade and finance undermined national sovereignty and economic autonomy, not only in Europe and North America, where barrier-free markets were being put in place, but also throughout East Asia and Latin America, where the International Monetary Fund and other supranational bodies came to play a highly prominent role. McDonald’s, Nike, Toshiba, and other transnational corporations actively sought to shed their old national attachments. But the power and pervasiveness of capitalist consumer culture hardly eliminated the search for ethnic, racial, and linguistic identity in the years leading up to the turn of the millennium. In the United States, the debate over cultural values and racial identity continued, though never in as tragic and bloody a form as the warfare that erupted over similar issues in the Caucasus region of the former Soviet Union, the Balkan region of Southeastern Europe, and Central Africa.

A New Geopolitical Order

The Cold War cost trillions of dollars. But such enormous military expenditures did not bring about the collapse of the Soviet Union and its satellites. No NATO tank fired a shot. No bomb fell on the Kremlin. Instead, a massive, home-grown insurgency, led by workers, dissident intellectuals, and advocates of national self-determination, and fueled by the brittle nature of the Soviet economic system, cracked the Communist bloc regimes, thereby leaving the United States as the world’s sole nuclear superpower. During the presidency of George Herbert Walker Bush, the United States took advantage of this new configuration of international power to end the “Vietnam syndrome,” send half a million troops to the Persian Gulf, and inaugurate an era in which the United States saw few obstacles to the direct application of its military power abroad.

The End of the Cold War

The downfall of Soviet power began in 1980 when striking Polish workers organized Solidarność (“Solidarity”), an independent trade union with nearly ten million members. Solidarity, which had strong support from the powerful Polish Catholic Church, demonstrated how a working-class movement could offer an entire nation moral and political leadership. The Polish military drove Solidarity underground late in 1981, but when it reemerged later in the decade, it won a smashing electoral victory. Lech Walesa, a shipyard unionist, was soon installed as the first freely elected president of the Polish nation in more than sixty years.

Solidarity’s example had an impact throughout Eastern Europe. Under a relatively youthful party secretary, Mikhail Gorbachev, the Soviet Union in the late 1980s undertook a series of reforms: perestroika, which was designed to restructure the production system, and glasnost, which was meant to open the society to political and artistic debate. Gorbachev knew that in a world of increasing technical complexity and communications, Soviet-style authoritarianism had become economically dysfunctional. No regime could keep track of all the computers, copiers, and communication devices that were necessary to modern production in the information age. Gorbachev therefore wanted to liberalize Communist rule. But strikes and demonstrations soon erupted throughout the Soviet Union. Coal miners, railroad workers, Baltic nationalists, and urban intellectuals all formed independent organizations that called on Gorbachev to quicken the pace of political and economic reform. In Hungary and Poland, like-minded officials lifted most of the old restrictions on travel and emigration beyond the Iron Curtain.

Beginning in September 1989, a wave of huge demonstrations shook Communist regimes across Eastern Europe. A massive tide of East German emigrants surged through Czechoslovakia and Hungary to the West, undermining the authority of the Communist hard-liners who still clung to power in the German Democratic Republic (GDR). Finally, ordinary Germans poured through the Berlin Wall. The GDR quickly disintegrated, and by the end of 1990, all of East Germany had been incorporated into the wealthy, powerful Federal Republic of Germany. The Communist government in Czechoslovakia also tumbled, and reformers strengthened their hand in Hungary and Bulgaria. In Romania, the Communist dictatorship fell only after a week of bloody street battles between ordinary citizens and police, who defended the old order to the bitter end.

Radical change finally reached the Soviet heartland in August 1991, when thousands of Russian citizens poured into the streets to defeat a reactionary coup d’état. The Communist Party quickly collapsed, and the Soviet Union began the painful and uncertain process of reorganizing itself as a loose confederation of independent republics. Boris Yeltsin, who headed the Russian Republic, replaced Gorbachev as president of a much-diminished state.

The collective courage and willpower of ordinary men and women ended the Cold War. Most insurgents had sought civil rights and political democracy, not a capitalist revolution, but they got one nonetheless. New governments in the Soviet Union and Eastern Europe began opening their economies to Japan and the West, selling off state-owned enterprises and even establishing stock markets. In Poland, Hungary, Estonia, Latvia, and the Czech Republic, the process generated relatively successful economic systems that were oriented toward the West; but in Ukraine, Russia, Romania, Slovakia, and Bulgaria, the standard of living plunged amid political instability and plummeting production. Cuba, China, Vietnam, and North Korea remained authoritarian regimes, but almost all the Asian Communist countries abandoned state planning and encouraged capitalist trade and enterprise.

In the United States, partisans of Ronald Reagan claimed much of the credit for ending the Cold War, greeting the demise of European Communism as a political and ideological triumph. Reagan’s forthright denunciation of the Soviet Union as an “evil empire,” along with his administration’s military buildup, were said to have heartened eastern bloc dissidents at the same time that the arms race exhausted the productive capacity of the Soviet Union and other inefficient Communist regimes. But the Cold War doves held that the West’s militarized posture had long helped the Communists to rationalize their authoritarian rule. As the historian and diplomat George Kennan put it, the more U.S. policies had followed a hard line, “the greater was the tendency in Moscow to tighten the controls . . . and the greater the braking effect on all liberalizing tendencies in the regime.”

George Bush’s “New World Order”

George Herbert Walker Bush, who was elected President in 1988, did not embody the New Right fervor of the Reaganites. He was a traditional upper-class conservative whose father, Prescott Bush, had been a New York investment banker and Connecticut senator. Unlike Reagan, Bush had easily weathered the Great Depression, though he faced real danger as a naval flier in World War II. After he graduated from Yale University, his father staked him to a business career in Texas, where Bush adapted himself both to the entrepreneurial unpredictability of panhandle oil operations and to the hard-edged politics of southwestern Republicanism. After serving two terms in the House of Representatives, he held a series of high-profile appointments under presidents Nixon and Ford, including U.S. ambassador to the United Nations and director of the Central Intelligence Agency.

Bush stood for Reaganite continuity during the 1988 presidential campaign, even as he implicitly criticized the Reagan presidency by calling, in his acceptance speech at the Republican National Convention, for a “kinder, gentler” society. But such moderation would have to wait until after the campaign. Bush chose a brashly conservative Indiana senator, J. Danforth Quayle, for his running mate to appease conservative Republicans. During the fall, his campaign slandered his opponent, Massachusetts governor Michael J. Dukakis, calling him unpatriotic and soft on crime. In an infamous commercial, the GOP touched a racist nerve when it charged Governor Dukakis with responsibility for rapes committed by a Black convict named Willie Horton during a work furlough from a Massachusetts prison.

President Bush felt sufficiently emboldened by the collapse of Communism to announce an American-dominated “new world order.” During his administration, several of the most troublesome proxy wars that had been energized by the larger East-West conflict rapidly wound down. The Soviets withdrew from Afghanistan in the late 1980s (see Chapter 13), after which the CIA stopped supplying the Afghan rebels with advanced weapons. The brutal war there, waged among rival Islamic religious factions, did not stop, but it no longer played a role in superpower politics. Likewise, in Central America, the Bush administration ended U.S. support for the antigovernment Contra forces in Nicaragua in return for a pledge that the Soviets and Cubans would stop supplying arms and aid to the ruling Sandinista party. The deal paid off handsomely for Bush when free elections gave the Nicaraguan presidency to a pro-U.S. candidate in 1990. And in neighboring El Salvador, a peace treaty signed in 1992 ended the civil war there, allowing left-wing rebels to participate in electoral politics.

Finally, the end of the Cold War created a far more advantageous atmosphere for the liberation of South Africa. There, the apartheid regime headed by F. W. de Klerk released from prison the African National Congress (ANC) leader Nelson Mandela, who had spent twenty-seven years in confinement. The ANC had long maintained an alliance with the Communists, which kept U.S. governments wary, especially during the Reagan presidency, when administration policy favored a “constructive engagement” with the racist white South African government. But as Mandela and the ANC swept to power in the early 1990s, even conservative U.S. diplomats applauded. The disintegration of South Africa’s all-white regime quickly reduced the level of violence throughout southern Africa.

The end of the Cold War left the United States as the world’s only superpower, so even as international tensions declined, U.S. leaders had a relatively free hand to deploy American military might abroad, unhampered by a countervailing Soviet response. This new state of affairs became clear in December 1989, when President Bush dispatched thousands of troops to Panama to oust a corrupt dictator, Manuel Noriega, in an overt display of U.S. military power that echoed early-twentieth-century interventions in Latin America. A short, bloody war ended with Noriega’s capture and extradition to the United States, where he was convicted in a Miami courtroom of drug-related offenses. Little actually changed in Panama, which remained one of the major transshipment points for illegal drugs in the Western Hemisphere.

About a year later, in an even more massive deployment of its military might, the United States confronted the troops of the Iraqi dictator Saddam Hussein, whose invasion and annexation of the oil-rich emirate of Kuwait threatened to destabilize U.S. allies in the Persian Gulf. The Bush administration assembled an international coalition, including the Soviets, Germany, Japan, and most Arab states, to endorse the dispatch of nearly 500,000 U.S. troops to the region by the end of 1990. Britain, France, and Saudi Arabia committed an additional 200,000 troops. Although many Americans, including most congressional Democrats, believed that economic sanctions imposed by the United Nations might force Iraq to withdraw from Kuwait, Bush administration officials argued that economic pressure alone would prove ineffective.

On January 17, 1991, U.S. forces launched a massive, high-tech bombing campaign against Iraq called Operation Desert Storm that paved the way for the tank-led ground assault a month later. Mobilizing hundreds of thousands of U.S. and British armored troops, the February campaign took but one hundred hours to reclaim Kuwait and enter southern Iraq. Antiwar demonstrations briefly filled the streets of Washington and San Francisco, but light casualties and a swift victory soon generated a wave of patriotic fervor and soaring approval ratings for the president.

But the Gulf War left a mixed legacy. President Bush hoped that the successful, massive use of military power against Iraq would shatter the nation’s “Vietnam syndrome” by confirming a renewed U.S. willingness to intervene abroad as the world’s unchallenged superpower. But U.S. public opinion remained skittish when it came to the deployment of U.S. troops, especially when diplomatic interventions or humanitarian missions turned violent, as they would in Somalia in East Africa and the former Yugoslavia in Eastern Europe. In the Persian Gulf itself, Kuwait was once again an independent nation, but the Bush administration chose to end the war with Saddam Hussein still in power, largely because it saw his regime as a regional counterweight to Iran. However, this meant that the Iraqi dictator would likely seek to rebuild his army and suppress domestic opposition, which he did in brutal fashion immediately following the withdrawal of American troops. The United Nations therefore continued an economically debilitating trade boycott, while the United States maintained an active military presence in the region, periodically bombing Iraqi targets.

A CLOSER LOOK: The First Gulf War and the Media

A New Economic Order

The U.S. military victory in the Persian Gulf was not matched by a sense of economic well-being at home. By the fall of 1990, the U.S. economy had plunged once again into recession. The Iraqi invasion of Kuwait generated a huge spike in the price of oil, which shook consumer confidence and depressed corporate profits and investment. Although oil prices soon moderated, the recession did not lift for a full two years, after which unemployment declined slowly, and real wages remained stagnant. White-collar workers and professionals, like factory workers in earlier decades, found their income and status subject to the ebb and flow of global economic forces. Economic instability also intensified traditional class and racial divisions and tensions in the country, which flared throughout the 1990s. Although the U.S. economy would ultimately boom by decade’s end, the rapid growth of the World Wide Web and other elements of the telecommunications revolution put all labor for sale within an increasingly integrated global marketplace. Millions of new immigrants flocked across U.S. borders seeking better jobs and a secure future. These new arrivals, along with a new generation of labor leaders who had been groomed in the 1960s, became a decisive force in pushing labor back to the left.

The Postwar Economy

Unlike the early 1980s, when blue-collar workers received most of the pink slips, a decade later professional, managerial, and other white-collar workers were just as likely as factory workers to become victims of corporate “downsizing.” Nearly two million people lost their jobs in the three years that followed the Persian Gulf War; 63 percent of American corporations cut their staffs during that time. Post–Cold War layoffs in the high-paying defense industries hit the California economy, which had sailed through the 1980s, particularly hard. Although the national economy produced millions of new jobs, the threat of layoffs generated a pervasive insecurity at all levels of the workforce. As AT&T’s vice president of human resources explained: “People need to recognize that we are all contingent workers in one form or another.” No wonder some called the downturn of the early 1990s the “silent depression.”

But U.S. corporations staged a remarkable turnaround once the recession began to lift. In contrast to the 1970s and 1980s, American businesses again competed successfully at home and abroad. Productivity growth finally rebounded to levels that had not been seen since the late 1960s, profits leaped upward, and the stock market soared more than fourfold between 1991 and 1999 in one of the great Wall Street booms of all time. Key industries—steel, automobiles, telecommunications, microchip manufacturing, computer software, entertainment, aircraft, and finance—were once again creative, innovative, and profitable. By the late 1990s, unemployment, inflation, mortgage rates, and oil prices had fallen to their lowest levels in three decades. Most experts once again counted the United States the most competitive industrial nation in the world. Time magazine captured the ambiguity of this accomplishment in a 1994 cover story headlined “We’re Number 1. And It Hurts.”

What accounted for this simultaneous sense of productive accomplishment and economic insecurity? The two phenomena were closely linked, because corporations benefited enormously from the wage stagnation, deunionization, low taxes, and deregulatory business climate that characterized this era. Conservatives lobbied to cut taxes and government spending on social programs and embraced the return to a free market economy that became known as neoliberalism. The virtual abandonment of antitrust action during the 1980s and 1990s led to a massive wave of corporate mergers and reorganizations, which opened the door to both cost-cutting layoffs and speculative stock market recapitalizations. “There is no job security anymore,” reported Karen Tarlow, a Wall Street bank officer in her late forties. “It’s very insidious how the rich get richer.” In oil, telecommunications, health care, and finance, a new set of powerfully competitive firms emerged almost overnight. Exxon bought Mobil, Bell Atlantic acquired GTE, WorldCom bought MCI, and Daimler-Benz merged with Chrysler. Between 1992 and 1998, the value of all corporate mergers advanced nearly tenfold to more than one trillion dollars. In real dollar terms, this was the biggest merger wave in nearly a century.

The U.S. government also spurred the competitiveness of corporations that were headquartered in North America by devaluing the dollar against the currencies of other industrial nations. Suddenly, U.S. corporations that traded abroad, such as Boeing, or faced import competition, such as General Electric and the automotive giants, found that they could sell their products more cheaply than those of many foreign competitors. Further easing the competitive pressure on U.S. firms, wages had risen far more rapidly in Europe and Asia than in the United States during the 1980s. Additionally, foreign nations faced some unique challenges in this period. The German government had to raise taxes to pay the huge costs of reunification, while Japan struggled with a decade-long recession brought on by the near-collapse of its overheated banking and real estate sectors. Thus, after decades of decline, the U.S. share of world manufacturing exports began to rise in the years after 1986.

U.S. firms also benefited in the 1990s from adopting the most advanced organizational techniques and technological innovation. In the steel industry, the introduction of German-style mini-mills slashed by more than half the number of hours that were required to produce a ton of steel. In autos, Japanese inventory and production scheduling methods and new levels of automation enhanced quality, cut labor costs, and shortened the engineering time for new models, enabling Chrysler and Ford to earn record profits. And the telecommunications revolution, which made fax machines, e-mail, cellular telephones, and overnight package delivery pervasive, enhanced productivity in factories, offices, and hospitals.

Perhaps most important, the computer revolution finally began to pay off. Despite the fanfare that accompanied dramatic advances in digital technology, computerization had been slow to boost white-collar service sector productivity, whose growth lagged far behind that of even run-of-the-mill factories during the 1970s and 1980s. But by the early 1990s, a generation of employees had been trained on the new machines, which sat on the desks of more than six out of ten workers (compared to fewer than two out of ten in Japan). As Business Week put it, “Networking [the linking together of large numbers of desktop computers] finally united all the systems, and voila! Productivity began to take off.”

But a rising economic tide—including the powerful surge flowing out of the computer industry in Silicon Valley in Northern California—could not lift all boats or solve all problems. The reconfiguration of the American economy brought real social and psychological costs, which were borne not only by the unemployed and the Rust Belt factory workers, but also by millions of suburban families and college-educated “knowledge workers” who might otherwise seem to be the beneficiaries of the new economy. Real family income continued to drop throughout the first half of the 1990s, even as record numbers of women, teenagers, and new immigrants entered the workforce. Health care expenses rose inexorably, twice as fast as the overall consumer price index. In response, insurance companies and corporations restricted coverage and demanded copayments. By 1992, more than thirty-seven million Americans did not have medical insurance.

HISTORIANS DISAGREE: Neoliberalism

On Trial: Race, Gender, and National Identity

The decade of the 1990s was not only a period of growing economic inequality. It was also a period of high-profile investigations, hearings, and trials in which the new politics of race, gender, and American identity played out before a media-savvy audience of millions. The courtroom and the hearing room now served as sites of furious contention. These televised spectacles were not, in fact, well suited to resolving the nation’s deep-seated cultural and social divisions. But a fascinated citizenry focused its gaze on these events because no new election and no new statute could fully represent the complicated and contradictory values that Americans brought to their understanding of race, sex, and nationhood.

President Bush’s decision in 1991 to nominate Clarence Thomas to fill the U.S. Supreme Court seat that had been vacated by the retirement of civil rights pioneer Thurgood Marshall would have been controversial in any event. Although Thomas had been born into southern poverty, the Yale-educated Black conservative criticized civil rights leaders and had helped to undermine affirmative action litigation as chairman of the Equal Employment Opportunity Commission (EEOC) during the Reagan administration. Republicans backed Thomas, while leaders of the civil rights community, who wanted to keep an African American on the Supreme Court, were split as to whether such a conservative Black figure could properly fill Marshall’s slot on the bench.

Thomas’s confirmation battle before the Senate Judiciary Committee became red hot in October 1991 after Anita Hill, a Black law professor who had been an aide to Thomas, testified that the nominee had frequently made lurid remarks to her and repeatedly pressured her for dates. Feminists and liberals who were hostile to the Thomas nomination immediately championed Hill’s charges. The nominee charged that his accusers were turning the confirmation proceedings into a “high-tech lynching.” The nation watched in stunned amazement as a panel of white male senators subjected Hill’s motives and veracity to fierce personal attack. The Senate confirmed Thomas by a vote of fifty-two to forty-eight, even as Americans became far more sensitive to issues of sexual harassment in the workplace. In 1992, the EEOC recorded a 50 percent jump in official complaints on the issue.

Television played a key role in another racial spectacle that transfixed the nation. On the evening of March 3, 1991, an African American motorist, Rodney King, became a symbol of white racism and police brutality when a nearby resident captured on videotape the beating that King suffered at the hands of four Los Angeles policemen after a traffic stop. Millions of people saw King take more than fifty blows from club, foot, and flashlight as he lay on the ground. Sympathy for King turned to violent outrage in April 1992 when a white suburban jury acquitted his police assailants. African American and Latinx rioters burned hundreds of houses and stores in the South Central section of Los Angeles, particularly those occupied by Korean immigrant shopkeepers who, despite their recent arrival in Los Angeles, enjoyed more economic success than the rioters. The city called in police and National Guard, but the violence killed 53 people and injured thousands. Property damage amounted to a billion dollars, making the riot the most costly in U.S. history.

A Digital Revolution

Television was not the only powerful cultural and political force that transformed American society in these years. The deployment of millions of easy-to-use computers began to replicate the productivity breakthrough brought on by the birth of the mass-production assembly line early in the twentieth century. Then, skilled tool and die craftsmen built the precision metal-cutting machinery that would be operated by so many untutored farmhands and immigrants. In the 1990s, skilled programmers churned out thousands of different computer programs (“software”) that allowed clerical workers and managers to perform tasks once restricted to well-trained professionals. “Up until the early 1980s, the only people able to use personal computers were a very tiny elite,” reported a Princeton economist. “Now, a lot of software is for numskulls.”

Soon the Internet and the World Wide Web linked together millions of computers all across the globe. The Internet had its origins in Pentagon efforts, begun in the 1960s, to build a communications network that would be capable of surviving a nuclear war and to share expensive computer resources. But imaginative scientists and clever hackers soon spread this digital network well beyond the military laboratories and university research facilities of its inception. By the late 1980s, e-mail was becoming commonplace, and in the mid-1990s, the development of graphical, interactive web pages generated an explosive new stage in the growth of this medium. The number of Web pages doubled every eighteen months thereafter, transforming the locus of commerce, entertainment, and information retrieval.

The imaginative hold of this vast digital network approached that of other great world-transforming technologies: steam power and the railroad in the early nineteenth century, electricity fifty years later, and the internal combustion engine during the first third of the twentieth century. Like these technologies, computerization promised a revolutionary transformation in the structure of production, the organization of society, and the meaning of work. And for the people who were in the right place at the right time, it generated enormous wealth.

Like Rockefeller and Ford before him, Microsoft president Bill Gates combined technical expertise and business savvy to make himself the richest man of his era. Born to a wealthy Seattle family in 1955, Gates dropped out of Harvard to join the wave of youthful West Coast computer “hackers” who refounded the computer industry in the late 1970s. Like Steve Jobs and Steve Wozniak, creators of Apple Computer Corporation and pioneers in personal computing, Gates put his entrepreneurial faith in the proliferation of millions of inexpensive desktop computers. But rather than building the hardware, his firm purchased, rewrote, and copyrighted DOS, the essential operating system for the machines. Within a decade, Microsoft programs had become the de facto software standard for more than 80 percent of all the personal computers that were sold in the world. The growth requirements of the software industry meshed seamlessly with many of the characteristics that were peculiar to late-twentieth-century U.S. capitalism: entrepreneurial flexibility, a close partnership with a large number of commercially oriented universities, a cosmopolitan and multiethnic workforce, and “ownership” of the cyberworld’s lingua franca: American English.

The craft workers of this cyber-revolution were the driven, youthful programmers, whose workaday roles actually resembled those of the skilled machinists, technicians, and draftsmen who had been crucial to the industrial transformation of the nation nearly a century before. Like the proud craft workers of Bridgeport and Cincinnati, college-educated programmers stood at the very nexus of production. “It is their skill in coding, in turning strings of numbers into life-altering software, that is Microsoft’s lifeblood,” observed a computer-savvy journalist. But for most programmers and other skilled workers, 35 percent of whom were immigrants, loyalty lay with the craft, not with any single company. Job-hopping was endemic in the software industry. Indeed, a growing proportion of these skilled technical workers were “temps,” temporary workers who received few benefits other than a paycheck from the companies in whose offices they spent their many hours, months, and years.

The Internet made the world smaller. Employing digital technology, firms could now farm out highly skilled jobs across the globe, not just manufacturing jobs. This phenomenon, what many companies called “outsourcing,” included both technology and service sector jobs. For example, in the 1990s, Dell Computers began setting up call centers in India to handle customer support. This move proved to be so successful that roughly 44 percent of Dell’s workforce toiled outside the United States by 2003. Dell then slashed almost 6,000 U.S. jobs, most of them in central Texas near the firm’s headquarters.

The New Immigration

Globalization did not just send goods, technology, and jobs abroad, it also made possible, even necessary, the massive immigration that transformed U.S. politics and economic life in the decades after 1970. Throughout the late twentieth century, huge numbers of immigrants from Asia and Latin America flocked to U.S. shores to fill millions of new service, retail, clerical, and light manufacturing jobs. This new wave of immigrants rivaled in sheer numbers the great trans-Atlantic flows of a century earlier. In the 1960s, annual immigration had totaled only a quarter-million; by the 1990s, the United States was admitting more than 800,000 legal immigrants a year and perhaps half again as many undocumented immigrants, mainly from Mexico. More than 40 percent of the newcomers were from Asia, especially the Philippines, China, South Korea, and Vietnam; about 35 percent came from Latin America and the Caribbean.

One of every three new immigrants entered the United States through California, making the nation’s most populous state its unofficial Ellis Island as well. By the 1990s, one-third of the population of Los Angeles, the nation’s second-largest city, was foreign-born; on the North American continent, only Mexico City had more Spanish-speaking residents. As hundreds of thousands of Mexicans and Central Americans streamed into poor neighborhoods and communities in East Los Angeles and the San Gabriel Valley, tens of thousands of Koreans settled in an old working-class neighborhood just west of downtown Los Angeles. At the same time, equally large numbers of immigrants from Taiwan, Hong Kong, and Vietnam transformed the old Chinatown neighborhood near City Hall. Asians and Latinx residents would shortly make up more than half the workforce in southern California.

New York’s foreign-born population, like that of Los Angeles, also approached 35 percent of its total populace in the 1990s—a level that the city had last reached in 1910, at the height of southern and eastern European immigration. Long-established immigrant communities, including those composed of Puerto Ricans, Irish, and Poles, grew rapidly during the 1970s and 1980s, even as hundreds of thousands of Haitian, Dominican, Colombian, East Indian, Chinese, and Russian immigrants settled into the city’s poorer neighborhoods. Nearly 200,000 Mexican immigrants arrived after the peso was devalued in 1986. Mostly undocumented, they traveled thousands of miles by truck and car from some of the poorest rural regions of Mexico in search of work. “We came because we are poor farmers and our parents did not have enough to send us to school,” explained a young food deliverer.

Newcomers also transformed Miami; by the 1980s, it had the highest percentage of foreign-born residents of any U.S. city. First came an influx of nearly 600,000 Cubans, many of them well-to-do exiles from the Cuban Revolution of 1959. Then, in the late 1970s and early 1980s, tens of thousands of political and economic refugees arrived from Haiti, Guatemala, El Salvador, and Nicaragua. The huge number of Latinx immigrants changed the face of the city; as Spanish became the language of foreign trade, Miami became the commercial “capital of Latin America,” a politically stable, financially well-regulated marketplace that was hospitable to businesspeople from a dozen countries. Latinx businesses flourished in the North as well. In old industrial cities such as Passaic, Paterson, and Union City, New Jersey, Latinx-owned businesses, including restaurants, nightclubs, cigar shops, fruit stands, and clothing stores, transformed the look, sound, and smell of the main shopping areas. As one Union City resident noted, a few years earlier, many shops “used to [have] signs saying, ‘We speak Spanish.’ Now the signs say, ‘We speak English.’”

In the United States, immigrants with skills, family connections, and an entrepreneurial outlook could do very well. In Los Angeles, New York, and other cities, many Korean families owned and managed fruit and vegetable markets. Vietnamese, Chinese, Thais, Mexicans, and Iranians opened tens of thousands of new restaurants, making the American dining experience far more cosmopolitan. Indian, Pakistani, and Chinese immigrants with English-language, engineering, and computer electronics skills won a solid beachhead in Silicon Valley and other high-tech enclaves. And a small number of wealthy individuals from Hong Kong, Saudi Arabia, and Japan took advantage of the undervalued dollar to make substantial investments in commercial real estate, residential properties, and stateside industries.

Most of the new immigrants were solidly working class, however. They came because even minimum-wage work in the United States paid five or ten times more than they could earn in the cities, barrios, and villages of their homelands. Like the Italians and Irish who arrived in the nineteenth century, many of the new immigrants hoped to return to their native countries to buy a farm, build a house, or open a business. But like their predecessors, most did not succeed. In New York and Los Angeles, Latinx and Asian immigrants labored in hundreds of sweatshops of the sort that progressive reformers had once condemned. Likewise, in Nebraska, Colorado, and South Dakota, a new generation of immigrants labored in slaughterhouses, under conditions similar to those portrayed in Upton Sinclair’s 1906 novel The Jungle. “We came with illusions of earning a little money, investing it, and doing something in Mexico,” noted one undocumented immigrant in New York City. “But those who get 0 for working seven days, what can they do? They return defeated to Mexico.” Even where the work was technology-based and legal (as in many of California’s most successful new computer firms), English-speaking clerical, sales, and research and development employees worked in the front office; in the back shop, scores of Asian and Latinx women built chips, stuffed circuit boards, or moved inventory. Their work remained as routine and insecure as that of sweatshop garment workers a hundred years before.

Labor’s Leftward Movement

By the early 1990s, American trade unions were in decline. Unionized labor represented only 16 percent of all American workers; each year, the proportion dropped. Trade unions no longer set the wage standard in any major industry—not even in automaking or steel, in which nonunion factories and mills represented an increasingly large production sector. Perhaps most threatening, the idea of unions seemed stale and antiquated. “‘Organized Labor.’ Say those words, and your heart sinks,” wrote Thomas Geoghegan, an embattled prounion attorney. “Dumb, stupid organized labor.” Under the leadership of Lane Kirkland, whose politics had been molded by the conservative labor chieftain George Meany, the AFL-CIO devoted few resources to organizing and had little presence on television or in other media. Kirkland and the generation of unionists who came of age during the administrations of Franklin Roosevelt and Harry Truman thought that labor’s revival depended on the election of a prolabor president and Congress, which would pass laws to liberalize U.S. labor law to make organizing easier. But growing Republican control of both the Senate and the House in the 1990s foreclosed any expectation that the unions would find much help from the federal government.

In this crisis, an insurgent group of top union officials, mostly representing workers in public employment, the service trades, and manufacturing, overthrew Kirkland and took control of the AFL-CIO. It was the first successful challenge to a sitting AFL or AFL-CIO president in more than one hundred years. The new AFL-CIO chose John Sweeney as president. As the former president of the Service Employees International Union (SEIU), Sweeney recruited a multiracial organizing corps numbering in the hundreds, some with New Left backgrounds. SEIU had spent a sizable proportion of its dues to successfully organize thousands of janitors, health care workers, and public employees. Its organizers sometimes employed the tactics of the civil rights movement—sit-ins, civil disobedience, and public marches—to organize low-wage African American, Asian, and Latinx workers in the fast-growing service sector of the economy.

Sweeney and his allies moved the labor movement to the left in hopes of revitalizing it. They reached out to feminists, civil rights leaders, ecologists, left-wing academics, and liberal clergy in an effort to build a prounion coalition. The AFL-CIO also recruited thousands of students for a high-profile “union summer” organizing experience. “Labor must organize without the law,” asserted the new AFL-CIO president, “so that we can later organize under the law.” It had taken a quarter-century, but the Sixties generation had finally made its voice heard at the union movement’s leadership level.

For a time, it seemed that this initiative might work. The SEIU, the largest union in the country, organized hundreds of thousands of poorly paid home health care workers in California, Illinois, and other states. Workers in textile mills in Martinsville, Virginia, and Kannapolis, North Carolina, whose unionization efforts had been thwarted for decades, finally won National Labor Relations Board elections. And in Las Vegas and Los Angeles, citywide labor movements led by New Left veterans successfully leveraged the organization of thousands of Latinx immigrants to build the kind of labor political power that shifted politics in both Nevada and California in a more liberal direction. In almost every national election that was held in the late 1990s and into the new century, organized labor proved to be effective in mobilizing its own members and their families, and they became increasingly reliable Democratic voters.

The powerful International Brotherhood of Teamsters underwent a remarkable transformation during these years as well. For decades, the top leadership of the Teamsters seemed to be synonymous with corruption and criminality, but in the 1990s, a vigorous reform movement, the Teamsters for a Democratic Union, helped to elevate Ron Carey and a rank-and-file slate to top union posts. Carey and other reformers battled old-guard unionists to democratize and energize the Teamsters. Their efforts paid off in the spectacular strike victory of 185,000 Teamsters against the United Parcel Service (UPS) in August 1997. The first successful nationwide strike in nearly two decades, the UPS strike won widespread public support for labor’s demand that UPS upgrade thousands of low-wage, part-time jobs to permanent, full-time positions. “The rank and file felt like, yes, we do make a difference,” said Barb Joyce, a Des Moines truck driver. “The day after the strike when we went back to work everybody on the road was waving and blowing kisses to us.”

But Carey’s success was short-lived. In a bitter reelection contest, Carey relied on Washington consultants who, unknown to him, illegally laundered contributions to bolster his campaign. When this scheme became known, a federal judge overturned the election and threw Carey out of the union. Carey’s fall (he was ultimately replaced by James P. Hoffa, son of the legendary Teamster leader) hurt union reform, but for the first time since the 1940s, the unions maintained a firm alliance with other liberal organizations. The Seattle meeting of the World Trade Organization in November 1999 graphically illustrated this alliance when unionists and ecological activists united to demand that the WTO enforce worldwide labor and environmental standards. A protest banner proclaimed: “Teamsters and Turtles: Together at Last!”

Historically low levels of unemployment in the last half of the 1990s raised wages and emboldened workers. By the end of the decade, the official jobless rate stood just above 4 percent, the lowest level in three decades. Low-paid service employees, especially Latinx and African American workers, who had not shared in the 1980s boom, now began to find jobs at wages that were rising smartly for the first time in a generation. Full employment, sustained for several years, had a radically beneficial impact on the lives of the poorest Americans. Faced with a labor shortage, companies began to offer jobs to people who had been described as unemployable just a few years before. In the cities, crime rates began to fall, the drug culture dissipated, and some young African American men began to find steady employment. Campaigns to institute citywide “living wages” (sufficient income to put workers above the poverty line) proliferated across the country.

But organized labor failed to take real advantage of these more favorable circumstances. Employers such as Wal-Mart, Kmart, FedEx, and the Japanese auto factories that had been transplanted to the United States remained uncompromisingly hostile to unionism. Union membership rose in the public sector, but when imports and recession devastated industrial employment in the early 2000s, union ranks again dropped dramatically, such that organized labor represented just one in ten American workers. A sense of crisis once again gripped sections of the AFL-CIO top leadership, precipitating the formation in 2005 of a rival labor federation called Change to Win. Determined to organize the unorganized, the SEIU-led group hoped to recapture the insurgent spirit that had been sparked by the old Congress of Industrial Organizations seventy years before.

The Rise and Fall of Clintonian Liberalism

Labor was hopeful in the 1990s because the political winds shifted, if briefly, in its favor. President Bush had been a caretaker executive who saw his domestic policy goals in largely negative terms—avoiding, as he once put it, “stupid mistakes” in order to “see that the government doesn’t get in the way.” He once told his chief of staff, John Sununu, “We don’t need to remake society.” William Jefferson Clinton won the presidency in 1992 with ambitious plans for social reforms, especially in the area of health care. But he also promised to recast American liberalism so as to make it acceptable to the generation who had voted for Ronald Reagan and George Bush. No matter how much the Democratic Party’s agenda shifted toward issues of middle-class social and economic concern, Clinton partisans could not appease the rightward-tilting Republicans, who gained control of Congress in 1994 and impeached the president in 1998 for lying about a sexual affair with a White House intern. Despite the labor movement’s bold effort at revitalization, such intractable ideological conflicts, along with Clinton’s own personal and political blunders, constrained the rebirth of liberal politics and social policy in the 1990s.

The 1992 Election

In the recession that followed the Gulf War, the Bush administration’s passivity in the face of economic hardship cost the president virtually all the goodwill the U.S. military victory had generated. President Bush did sign legislation that raised the minimum wage, stiffened clean-air regulations, and protected Americans with disabilities against discrimination, but Bush championed one piece of legislation with the greatest consistency: a reduction in the capital gains tax, whose benefits would flow largely to the wealthy. Democratic prospects therefore looked bright in 1992 when Bill Clinton, the forty-six-year-old governor of Arkansas, emerged from the crowded primary field to become the party’s presidential nominee. Clinton called himself a “New Democrat,” and he was anything but a conventional liberal. He supported the death penalty, favored a work requirement for parents receiving welfare support for their children, and proved to be indifferent to organized labor. As the five-term governor of a conservative southern state, he accommodated himself to the interests of the region’s economic elite. Within the Democratic Party, he allied himself with prodefense, free-trade conservatives. His wife, Hillary, was a partner in Arkansas’s most powerful law firm and for a time a member of Wal-Mart’s board of directors.

Although his political career had tilted toward the center, Clinton was in many ways a product of the 1960s. The son of a widowed nurse, Clinton grew up in a small Arkansas town and became a consummately ambitious student-politician. He identified with the civil rights movement in high school, and as a Rhodes Scholar at Oxford University, he took part in anti–Vietnam War demonstrations. Though not as radical as some activists in his generation, Clinton avoided the draft, experimented with marijuana (though he claimed he “never inhaled”), and campaigned for left-liberal Democrats, including George McGovern in 1972. After graduating from Yale Law School, he returned to Arkansas, where his large circle of friends helped him to win the governorship in 1978. He was just thirty-two, the youngest U.S. governor in four decades.

The 1992 campaign was intensely ideological and the first election contest since 1964 in which domestic political issues held center stage. George H. W. Bush stood for the status quo, but he faced a determined challenge from the Republican Party’s right wing. In 1988, when he accepted the Republican nomination for president, Bush had pledged to the convention delegates, “Read my lips: no new taxes.” But two years later, Bush broke that pledge to reach a budget deal with the Democrats that sought a much-needed reduction in the federal budget deficit, which had climbed to more than 3 percent of the entire gross domestic product. His pact with the Democrats infuriated conservative Republicans and sparked a primary challenge from Pat Buchanan, a former high-level aide to presidents Nixon and Reagan. A prominent New Right warrior, Buchanan was hostile to abortion and gay rights. He also exemplified a new strain of post–Cold War conservatism that rejected Wilsonian internationalism and trade liberalization. Buchanan therefore campaigned against the highly controversial North American Free Trade Agreement (NAFTA), which was supported by most multinational corporations and the leadership of both major political parties.

Billionaire computer services entrepreneur Ross Perot proved to be an even more significant political outsider. Although Perot had once been a conservative Republican and a right-wing backer of the Vietnam War, he spent millions of dollars of his own money to run an independent campaign for president as a commonsense “populist.” Perot argued for more education and training and a kind of hands-on economic governance that was at odds with laissez-faire doctrine. “It’s time to take out the trash and clean up the barn,” he told TV audiences in his folksy, down-home manner. Perot therefore struck a nerve among both liberals and conservatives.

Bill Clinton and his running mate, Al Gore, an environmentally conscious senator from Tennessee, promised to break the budget gridlock in Washington and raise living standards by rebuilding the nation’s infrastructure, restoring higher taxes on the rich, launching a federally funded jobs program for welfare recipients, and, perhaps most important, reorganizing the nation’s entire health care system. Clinton saw himself in the tradition of Franklin Roosevelt and the New Deal. As the candidate promised to focus “like a laser” on the work-related anxieties of ordinary Americans, campaign strategist James Carville tacked a soon-to-be famous note above his desk: “It’s the economy, stupid!”

Clinton and Perot crushed Bush in the 1992 election. The Republican took only 37 percent of the popular vote, while the mercurial Texas businessman garnered a remarkable 19 percent—the best third-party showing since Theodore Roosevelt’s Progressive Party campaign of 1912. Clinton and Gore won 43 percent, which translated into 370 electoral college votes, 100 more than they needed to win. As usual, African Americans voted for the Democratic ticket by almost nine to one. And the proportion of women who voted for Clinton and Gore was some eight points higher than that of men. Poll after poll demonstrated that women were particularly likely to favor and benefit from the kind of social programs that Clinton and most Democrats advocated. Fewer women than men enjoyed employer-funded health insurance; conversely, children and their mothers were the prime beneficiaries of food stamps and Medicaid. Because of their child-care needs and employment status, many women endorsed the programs to aid women and children that conservatives derisively called “the Nanny State.” Liberals and feminists therefore labeled 1992 “the Year of the Woman.” Female representation in the House of Representatives nearly doubled—from twenty-eight to forty-seven—and the number of women in the Senate tripled, from two to six. African Americans increased their number in the House to forty-one, an all-time record.

The Clinton Administration

The political trajectory of the Clinton administration can be divided into two phases. Between late 1992 and the fall of 1994, Clinton and his advisers sought to implement an ambitious program of social reform that harked back to issues that had last been debated during the liberal heyday of the mid-1960s. But Clinton’s failures, both personal and political, led to a sweeping Republican victory in the 1994 congressional elections, after which his administration sought little more than survival in office, even at the cost of a programmatic accommodation to congressional conservatives.

Clinton appointed an unusually diverse cabinet. He selected African Americans to run the departments of Agriculture, Commerce, and Energy, heretofore the reserve of conservative white businessmen; and he made Janet Reno, a Florida law enforcement official, the first female attorney general. Latinx politicians from Colorado and Texas took over stewardship of the Department of Transportation and the Department of Housing and Urban Development. Clinton named the liberal economist Laura Tyson as head of the Council of Economic Advisers and Robert Reich, a well-known critic of the nation’s growing inequality of wealth, education, and wages, as Secretary of Labor. Not long after his inauguration, Clinton had the chance to appoint Ruth Bader Ginsburg, a pioneer in the legal fight against gender discrimination, to the U.S. Supreme Court. And in 1994, he nominated Stephen Breyer, an economic conservative whose social views tended toward liberalism, to the Supreme Court.

Early in 1993, Clinton signed social legislation that Bush would have vetoed: the Family and Medical Leave Act, which guaranteed workers their jobs when they returned from childbirth or a family medical emergency; the Brady bill, which regulated the sale of handguns; and the “Motor-Voter” bill, which made voter registration available through many state agencies, including those that issued driver’s licenses. Clinton ended the Reagan era ban on abortion counseling in family-planning clinics and won new funding for more police and prisons, as well as a youth-oriented job corps.

But on the big economic issues, the Clinton administration demonstrated substantial continuity with the policies of its Republican predecessors. In his presidential campaign, Clinton emphasized the need for new social investment: in infrastructure, education, environmental technology, and health care. Once in office, however, he dropped the fight for large-scale infrastructure spending—and the jobs it would have created—when his more conservative advisers warned that the federal government needed to sacrifice social spending and emphasize deficit reduction. Clinton did push through Congress a substantial tax increase on wealthy individuals, which restored some of the tax progressivity that had been lost during the Reagan era. Otherwise, his administration remained fiscally conservative, which mollified Wall Street bond traders and generated the lower interest rates that Clinton’s more orthodox advisers thought necessary for business investment and economic recovery.

Despite this accommodation to the fiscal conservatives within his own administration, Clinton proved to be a polarizing figure whose person and presidency evoked social and cultural controversies that had been smoldering since the Vietnam era. To many Americans, Clinton never seemed to be an entirely legitimate president, especially with regard to military issues. Although they maintained a facade of apolitical neutrality, many military officers were contemptuous of their commander-in-chief, who had avoided the draft during the Vietnam War. Therefore, Clinton’s early effort to support LGBTQ+ rights within the armed services generated a storm of criticism, forcing his administration to promulgate a confusing “Don’t Ask, Don’t Tell” doctrine regarding the orientation of LGBTQ+-identified enlisted people.

When it came to the use of American military forces abroad, the Clinton administration was hesitant and inconsistent. For the first time in more than half a century, a powerful, isolationist current emerged within the ranks of both political parties. Clinton and his key advisers continued to support United Nations peacekeeping missions and NATO military interventions in unstable regions, but they sought to avoid, at almost all costs, the death of American troops in foreign combat. This first became clear in Somalia, where a U.N. humanitarian effort to end political anarchy and provide food and relief supplies had turned into a military conflict with regional warlords. When a firefight cost the lives of 18 U.S. servicemen in Somalia in October 1993, Clinton responded by withdrawing all American troops. In the former Yugoslavia, where Serbian and Croatian nationalists battled each other and the Serbs instigated a series of bloody “ethnic cleansing” campaigns against Bosnian Muslims, the Clinton administration refused to use sufficient military force to actually stop the massive bloodshed. In Bosnia, therefore, the United States and other Western powers stood aside while Serbian nationalists shelled Muslim Sarajevo for more than two years. Finally, in late 1995, after NATO bombed Serbian gun emplacements and tank units, the United States brokered a settlement that sent American and other U.N. troops to Bosnia as peacekeepers for a set of new, ethnically based mini-states. Three years later, the same dynamic played itself out in Kosovo, a Serbian province that was inhabited largely by ethnic Albanians.

Clinton considered his economic diplomacy of even greater importance than these skittish military interventions. His administration backed U.S. membership in NAFTA over the adamant opposition of organized labor and most Democratic liberals. NAFTA made it far easier for Canadian and U.S. corporations to buy low-cost goods from Mexico, sometimes produced in the Mexican plants of American companies that had moved south to take advantage of the low wages there. Although the movement of jobs to Latin America did not amount to the “giant sucking sound” Ross Perot had predicted in the 1992 presidential campaign, NAFTA proved a powerful weapon in the hands of employers, who used the specter of a factory shutdown to forestall employee drives for higher pay and unionization. Some economists estimated that nearly one-quarter of all the recent growth in wage inequality derived from this downward pressure on U.S. wages.

Health Care Reform

In the most important legislative battle of his administration, Clinton fought to establish a system of universal health care. More than 20 percent of all Americans under the age of sixty-five had no insured access to a doctor. Moreover, health care costs were rising at twice the rate of inflation, and the United States was spending more of its total income, 14 percent, on medical care than any other nation. America’s fragmented, employment-based commercial health insurance system created a paperwork nightmare. A Toronto hospital administrator who was familiar with Canada’s “single-payer” system of universal coverage (the government paid doctors and hospitals from tax revenues) found U.S. health care costs bloated by “overwhelming duplication of bureaucracies working in dozens of insurance companies, no two of which have the same forms or even the same coverage.”

The U.S. health insurance system raised contentious political and social issues. During the 1980s, management efforts to trim health insurance costs precipitated more than 80 percent of all strikes that took place in the United States. The United Mine Workers of America (UMW) fought the most spectacular of these struggles in 1989—an eleven-month siege of the Pittston Coal Company—in defense of miners’ health care benefits and pension rights. Police arrested some 3,000 miners and UMW supporters during a campaign that resurrected the sit-down tactics and mass demonstrations that had been characteristic of the industrial union movement during the Great Depression.

President Clinton and his wife, Hillary, whom he put in charge of the health care project, rejected a Canadian-style single-payer system. Although recognizing its economic efficiency and political popularity, they argued that health care reform had to be built on the existing system of employer-paid benefits and private insurance. The Clinton plan would have regulated insurance costs and empowered the government to require employers to provide health insurance for all their employees.

But the Clintons miscalculated. American capitalism had transformed itself dramatically since the last era of health care reform in the 1960s (when Medicaid and Medicare became law), and low-wage, low-benefit companies in the expanding service sector, especially restaurants and retailers, bitterly resisted employer health care mandates. In addition, almost all the smaller insurance companies, which sought the youngest and least risky clients, assailed the plan. These companies bankrolled a widely viewed set of television commercials that pointed to the complexity and regulatory burden that were inherent in the Clinton plan. Trade unions strongly backed the Clinton plan, but after more than two decades of waning strength, organized labor commanded far less political influence than it had when Congress had enacted Medicare three decades earlier.

Indeed, the fate of the Clintons’ plan turned into a referendum on the capacity of the state to resolve social problems. Their proposed reforms would have instituted a new level of social citizenship, but conservatives feared such an entitlement, both because of the expense and because of the legitimacy it conferred on governmental activism. The crisis of confidence in all levels of government greatly strengthened the right-wing critique. In the 1960s, more than 60 percent of all Americans trusted the federal government. Thirty years later, after Vietnam, Watergate, and three severe recessions, fewer than half that many thought as well of their national governing institutions.

By August 1994, when the Clinton health plan died in Congress, even many Democrats had abandoned the ambitious effort to restructure one-seventh of the nation’s economy. Health care inflation did moderate in the mid-1990s, both because of the regulatory scare and because of the rapid rise of health maintenance organizations, which had eclipsed hospitals and individual physicians as primary providers of medical services. But health insurance remained linked primarily to employment, which meant that the twenty-first-century rise in unemployment and health care inflation would strand millions of additional citizens without medical insurance.

Congressional Conservatives Go to Battle

The collapse of the Clinton health care initiative generated a large vacuum in American politics, which a reinvigorated Republican Party promptly filled. With a Democrat as president for the first time in twelve years, the leading conservative media and political spokesmen became more strident and more ideological. In a 1994 election manifesto, Georgia congressman Newt Gingrich codified militant right-wing attacks on “feminazis,” “Washington insiders,” and “political correctness” into the “Contract with America,” which sought to unify scores of Republican congressional campaigns around a common ideological program. Gingrich and other GOP conservatives avoided divisive cultural issues such as abortion rights and school prayer, calling instead for large reductions in federal social spending, congressional term limits, partial privatization of Medicare and public education, the elimination of five cabinet departments, and a new set of tax cuts.

Although many voters were apparently unaware of Gingrich’s Contract with America during the 1994 elections, liberals were nonetheless dispirited during the campaign. Labor did not mobilize its troops, and among women, turnout was the lowest it had been in twenty years. The Republicans captured control of both the House and Senate for the first time in forty years, won several governorships, and gained ground in most state legislatures. In the House of Representatives, the elections sent to Washington a large, unified class of ideologically right-wing GOP freshmen. As the newly chosen speaker, Gingrich embodied a dramatic transformation within the Republican Party, whose legislative leadership now shifted from the old Midwest to the Deep South.

The resurgence of the Republican Party was made possible in large measure by the dramatic growth of conservative, evangelical religious sentiment and organization across America. Most members of the Religious Right simply advocated their beliefs and values while continuing to participate in civic life, but some fundamentalist sects sought to remove themselves entirely from the temptations, corruptions, and regulations of contemporary American society. One such Christian group, the Branch Davidians, led by the charismatic David Koresh, occupied a compound near Waco, Texas, in the early 1990s. When agents of the U.S. Bureau of Alcohol, Tobacco, and Firearms (ATF) tried in early 1993 to arrest the Branch Davidians at their Waco compound on suspicion of possessing illegal firearms, the group offered armed resistance. In the battle that followed, four ATF agents were shot and killed. After a fifty-one-day standoff, Attorney General Janet Reno approved a new assault on the compound. But instead of a firefight, the government assault triggered a conflagration and mass suicide that killed 86 people in the compound, including 25 children.

For many Americans on the extreme right, the fiasco at Waco transformed Koresh into a martyr who had defended both gun ownership and Christian separatism. An anti-Semitic, anti-Black militia movement that was intensely hostile to the authority of the federal government emerged in some economically hard-pressed rural areas. The nation became acutely aware of such sentiment on April 19, 1995, precisely two years after the Waco tragedy, when Timothy McVeigh and Terry Nichols, who had contact with right-wing militias, exploded a fertilizer truck bomb in front of the Murrah Federal Building in Oklahoma City, killing 168 people. At the time, it was the deadliest terrorist attack ever carried out on U.S. soil.

Clinton’s hopes at the outset of his presidency for carefully orchestrated reform were now dashed. He managed a successful rearguard defense of his presidency after 1994, but only by shifting his politics to accommodate Republican conservatives. The president quickly distanced himself from the remaining liberals in Congress, later announcing, in his 1996 State of the Union address, “The era of big government is over.” He signed a drastic revision of U.S. welfare law that ended the federal government’s sixty-year commitment to families with dependent children. Henceforth, caregivers—most of whom were young mothers—would be eligible during their entire lifetime for only five years of federal benefits. Many states soon enacted even more restrictive guidelines. The government provided no new monies for child care or job training yet expected most welfare recipients to get a job in the private sector, a task that the boom in fast-food and other service-sector employment eased in the late 1990s. Conservatives claimed that this reform would break the welfare “cycle of dependency.”

Gingrich Republicanism had its echo in California, where voters enacted a state ballot initiative in 1994 curbing the access of undocumented immigrants to vital social services. Proposition 187 denied unlawful residents of the United States access to prenatal and childbirth services, child welfare, public education, and nonemergency health care. Its passage, which reflected the severity of the economic downturn in California, was a belated effect of a 1978 ballot proposition that had frozen most property taxes, costing local governments billions of dollars in badly needed revenue. (California schools, once among the best funded in the country, had deteriorated sharply, to quality levels that had historically been characteristic of the far poorer South.) In a pattern reaching back to the anti-Asian movements of the late nineteenth and early twentieth centuries, conservatives laid the blame for economic insecurity on nonwhite immigrants. An infamous TV spot broadcast during the 1994 campaign showed Mexican immigrants rushing through a San Diego border checkpoint, with the ominous voiceover: “They keep coming.” Although federal courts overturned the statute, the controversy over immigration would continue to polarize California, and eventually national, politics.

Persistence of Racial Divisions

America’s political divisions were matched by growing racial tensions that flared anew in the second half of the decade. The wall of mutual distrust that had long existed between Black city dwellers and the police, revealed so starkly during the Rodney King beating and the bloody Los Angeles riot that followed, grew throughout the 1990s. These tensions became visible to tens of millions of Americans in 1994, when police arrested the African American former football star and sports commentator O. J. Simpson after the stabbing murder of his ex-wife Nicole and her friend Ronald Goldman, both of whom were white. Simpson’s televised trial, which saturated the airwaves and print media and dominated everyday conversation for nine months, turned into an international media extravaganza. His high-powered defense team exploited America’s racial divide, charging detectives who investigated the murder with racism and portraying their client as the victim of a biased system of justice. Sixty percent of all African Americans thought Simpson innocent, while 75 percent of white people believed him to be guilty. His acquittal in October 1995 by a jury of nine Black people, two white, and one Latinx person revealed once again the divergent racial perceptions that reflected the differing social and legal realities of white and Black Americans.

The success of the 1995 “Million Man March” seemed to confirm the continuing centrality of race and gender—as opposed to class or politics—as a defining identity for huge numbers of African Americans. Answering the call of the Black Muslim leader Louis Farrakhan, hundreds of thousands of African American men assembled peacefully on the Mall in Washington, D.C., to affirm their solidarity and dignity. Leaders of the march, the largest convocation of African Americans since the 1963 March on Washington headed by Dr. Martin Luther King, Jr., made no demands on the government but implored Black men to reclaim the moral and economic leadership of the Black community and Black family.

The following year, however, voters in the nation’s largest state enacted the California Civil Rights Initiative (Proposition 209), which banned affirmative action at the University of California and in state agencies, further intensifying racial divisions and hostilities between white and Black Americans. Most university administrators and corporate executives strongly backed affirmative action guidelines. They saw diversity as essential to the legitimacy and effectiveness of their institutions in a multiracial society. An exhaustive study by two former Ivy League college presidents demonstrated that African American students who were admitted under affirmative action guidelines graduated as readily as any other group of students. They then went on to careers in the professions, government, and business in proportions even greater than those of their white peers. But affirmative action remained highly controversial in the United States in the mid-1990s. Most conservatives argued that it contravened the idea of a color-blind society and of a social order based on merit, and that the civil rights laws of the 1960s had successfully ended most racism, eliminating the necessity for such policies. Indeed, the conflict over affirmative action turned into an argument over the degree to which racism remained a reality within American politics and society.

Polarization and Stalemate

In the years just before and immediately after the turn of the millennium, American politics became highly polarized. A plurality of all Americans supported President Clinton, sought a separation of church and state, favored abortion rights, and looked to the government to regulate business and tax the wealthy. But free trade, affirmative action, and foreign policy issues, especially those involving the rights of Palestinians and Israel’s settlement policy on the West Bank of the Jordan River, divided the Democrats. In contrast, the Republicans formed an increasingly unified political, cultural, and religious identity. By the late 1990s, with a “new economy” boom in full swing, big corporations became increasingly hostile to government taxes and regulations. Their agenda won the support of a politicized and rapidly growing evangelical Protestant community, which now conflated liberalism, secularism, progressive taxes, and federal interest in social welfare with gay rights, out-of-wedlock children, hostility to Christianity, and a subversion of patriotic values. Bitter division over such issues characterized both the impeachment of President Clinton late in 1998 and the presidential campaign and disputed election of 2000.

Impeachment Politics

In the 1996 election, Clinton and Gore ran a relatively muted, apolitical campaign. Running against Robert Dole, a veteran GOP senator from Kansas, they had little trouble winning reelection, albeit with only 49 percent of the popular vote. (Ross Perot took only 8 percent this time.) Turnout in the election fell to its lowest level since 1924. But the election provided a political coming of age for one group of voters. The Latinx electorate, which had increased substantially, swung sharply to the Democrats. Although this rapidly growing population had given Ronald Reagan more than 40 percent of their vote, many Latinx voters were furious with the GOP’s anti-immigrant politics. In both 1996 and 1998, their wholesale shift into the Democratic column locked up California for Clinton and his party.

Clinton’s reelection did nothing to ameliorate conservative hostility to his presidency and his person. The U.S. Constitution provides for the removal of a sitting president if the House of Representatives, by a majority vote, impeaches (charges) and the U.S. Senate, by a two-thirds vote, convicts the president of “high crimes and misdemeanors.” Three presidents have faced such a quasi-judicial drama. In 1868, the politics of Reconstruction stood at the heart of Andrew Johnson’s Senate trial; in 1974, Richard Nixon resigned when the House seemed certain to impeach him for abuse of presidential power during Watergate; and in December 1998, the House impeached Bill Clinton on charges of perjury and obstruction of justice. The constitutional crisis that enveloped Clinton hinged not on his public statecraft, but on his character and the consequences of his personal conduct. To conservatives, President Clinton was an intensely polarizing figure. He had appropriated many Republican issues, such as a balanced budget, welfare reform, and tough-on-crime measures to expand the death penalty and put more police on the street. At the same time, he seemed to embody conservatives’ view of the lax social and cultural values of the 1960s, including an unconstrained sexuality and a freewheeling capacity to spin the truth in his personal and business affairs.

Clinton’s legal and political troubles began in 1994, when Attorney General Reno appointed a special counsel to investigate questionable real estate investments that Bill and Hillary Clinton had made in 1978 in what became known as “Whitewater.” Although several of the president’s former associates were eventually jailed in connection with these Arkansas land dealings, Clinton and his wife never faced criminal charges. However, four years later, another prosecutor, the independent counsel Kenneth Starr, an aggressive conservative and former judge, transformed the Whitewater investigation into a probe of Clinton’s sexual conduct and the truthfulness of his testimony on this subject.

In 1995 and 1996, President Clinton had an affair with Monica Lewinsky, a twenty-two-year-old White House intern. This episodic sexual relationship was not itself illegal, but lying about it under oath, or attempting to convince others to do so, would constitute perjury and obstruction of justice. Clinton may well have done both when he and Lewinsky denied the relationship in sworn statements connected to a sexual harassment suit against the president. Clinton further damaged his credibility when, in January 1998, he publicly and vehemently denied having had sex with Lewinsky and then, six months later, admitted to an “inappropriate relationship” with her before a grand jury.

The furor over the Lewinsky affair revealed deep national divisions and revived the culture wars that had characterized public debate in the late 1980s and early 1990s. To those who sought the president’s impeachment and removal from office, Clinton’s mendacity about the affair spoke for itself. His recklessness shamed the nation, subverted its legal institutions, and proved him unfit for office. Clinton seemed to embody all that conservative moralists found intolerable in contemporary American life. Representative Tom DeLay of Texas, one of the most militant of the conservative Republicans, thought the impeachment fight “a debate about relativism versus absolute truth.” House Judiciary Committee chairman Henry Hyde wondered whether, “after this culture war is over that we are engaged in, an America will survive that’s worth fighting to defend.”

Clinton’s supporters did not defend his sexual conduct or the lies he told about it. But they thought impeachment and removal from office a punishment disproportionate to his transgressions. They drew a sharp line between his personal conduct and his role as an effective political leader. Moreover, to many Clinton partisans, Kenneth Starr’s massive, intrusive investigative effort was, in the words of Hillary Clinton, part of “a vast right-wing conspiracy” that represented an attempt by the Republican Right to persecute Clinton and “criminalize” normal political debate.

Nearly two-thirds of all Americans solidly backed Clinton’s continuation in the White House, even as they repudiated his private conduct. The president’s strong approval ratings were bolstered by steady economic growth, low unemployment, and his popularity as a defender of Social Security, public education, and racial reconciliation. African Americans, white women, and liberal Democrats fervently supported Clinton, in part because they found nothing particularly dreadful in lying about extramarital relationships and in part because of the prominent role played by conservative white southerners in the impeachment investigation. Therefore, Clinton partisans applauded when the Democrats recaptured the California statehouse and gained five congressional seats during the 1998 midterm elections, a remarkable showing for the party that held the White House. Taking responsibility for the GOP setback, Newt Gingrich resigned both his speakership and his seat in the House. (Ironically, it was later revealed that Gingrich was having an extramarital affair with a young assistant while actively criticizing Clinton’s similar behavior in the midst of the impeachment crisis.)

But conservative Republicans still controlled both houses of Congress, and they doggedly persisted in their effort to oust the president. By a narrow partisan majority, the House of Representatives approved two articles of impeachment against Clinton on December 19, 1998. But the House impeachment managers could not convince two-thirds of the Senate to convict, and the Senate acquitted the president on February 12, 1999. For conservative Republicans, the abortive prosecution of the president constituted a sharp political setback, but their success in tarnishing the Clinton persona was sufficient to sideline him during the presidential election season that was about to unfold.

The 2000 Election

Despite Clinton’s difficulties, the Democrats had a lot going for them in 2000. A seven-year economic boom had pushed the stock market to new highs and lowered unemployment to levels that had not been seen since the prosperous 1960s. The nation was at peace, and the federal budget actually ran a surplus for the first time in three decades. Vice President Al Gore, whom the Democrats chose at their Los Angeles convention, represented continuity—perhaps too much so given Clinton’s tarnished persona. Gore therefore selected as his running mate Senator Joseph Lieberman, a centrist Democrat from Connecticut, who had been one of the few Democrats to publicly chastise Clinton during the Lewinsky affair.

Most Republican leaders threw their support behind Texas Governor George W. Bush, the oldest son of the former president. Bush had avoided going to Vietnam by serving in the Texas Air National Guard. He had then followed his father into the Texas oil business but proved to be most successful as the politically well-connected owner of the Texas Rangers baseball team. He was more authentically Texan than his patrician father had been, and in the mid-1980s, he had experienced a religious conversion that put him at ease and at one with evangelical Christians.

Surrounding himself with advisors from his father’s administration, candidate Bush represented a kind of restoration: he would rescind the Clinton tax increases, restore a traditional sense of honor to the White House, and avoid humanitarian interventions abroad. Although he had to fight his way through the early primaries against Vietnam War hero John McCain, who represented Arizona in the U.S. Senate, Bush captured the Republican nomination because of staunch support from two key constituencies. He had the overwhelming support of corporate America, in particular the oil industry, whose allegiance he confirmed by selecting as his running mate Richard Cheney, the former Wyoming congressman and defense secretary under the elder Bush, who had most recently been CEO of Halliburton, an international oil services and construction conglomerate. And Bush had the passionate endorsement of Protestant evangelicals, whose social and cultural infrastructure—a growing set of radio and television networks; huge megachurches that functioned as social and educational centers for busy suburbanites; and a vigorous, well-funded network of think tanks and advocacy groups—made them a potent and organic part of the Republican political universe.

But George Bush could not win a majority of American voters for his program. Late in the 2000 election season, Al Gore’s adoption of a rhetorical populism revitalized his labor and liberal base, while on the left, the consumer advocate Ralph Nader, running as the Green Party candidate, denounced corporate influence in both parties, thereby winning 2.7 percent of the national vote. Bush responded to this Democratic and liberal majority by showcasing African American and Latinx speakers at the Republican convention and then by asserting a Republican “compassionate conservatism” that endorsed higher educational standards in the public schools and favored subsidizing the cost, but not controlling the price, of medicine for the elderly.

On November 7, 2000, Gore won the national popular ballot by more than half a million votes, but the Electoral College count—and the presidency—would be determined by who won Florida, with its crucial twenty-five electoral votes. The television networks mistakenly gave Florida to Bush on election night, but the GOP margin, fewer than one thousand votes out of some six million ballots cast, was far too close to be decisive. As all eyes focused on a required recount, it became clear that the electoral process, in Florida and elsewhere, was deeply flawed. The highly partisan and constitutionally questionable set of politics that had been on display during the impeachment crisis now spilled over into bitter wrangling that engulfed the election outcome. Even before the election, the Florida Republicans who controlled most state offices had purged tens of thousands of likely Democratic voters from the rolls. In Palm Beach County, a poorly designed ballot led several thousand Gore supporters to vote inadvertently for the right-winger Pat Buchanan, who was also running for president. In other Florida counties, especially those with many poor and African American voters, antiquated vote-counting machines threw out thousands of ballots on which the paper tabs, called “chads,” were not fully punched out.

Gore’s lawyers demanded a hand count of these rejected ballots, but an even more determined set of Bush attorneys filed suit to halt the recounts. For thirty-seven days, no one knew who would be the next president. Florida secretary of state Katherine Harris, who had cochaired Bush’s Florida campaign, certified a Bush victory in the state ballot, but on December 8, the Florida Supreme Court ordered a recount of all ballots that the state’s voting machines had thrown out. The issue finally went to the U.S. Supreme Court, which ruled in a partisan five-to-four decision on December 12 that the recounts should stop. George W. Bush was made president by court decision.

The Early Days of the George W. Bush Administration

Although the Republicans had hardly won a mandate either in the general election or in Congress, where the Senate was divided 50-50, the Bush administration quickly signaled that it would govern from the right in order to push forward its ambitious program of tax reductions and regulatory reform at home, combined with nationalist unilateralism abroad. Bush appointed Colin Powell, the widely respected former chairman of the Joint Chiefs of Staff, as the nation’s first African American secretary of state and put a traditional business conservative, Paul O’Neill, at the Treasury. But other appointments tilted much farther to the right. His national security advisor, Condoleezza Rice of Stanford University, also an African American, was a tough nationalist conservative who had served on the National Security Council staff under the elder President Bush. Secretary of Defense Donald Rumsfeld, who had held the same post under President Ford, surrounded himself with foreign policy hawks. Attorney General John Ashcroft and chief political advisor Karl Rove, both highly politicized evangelicals, sought to enhance and expand the linkages between the GOP and conservative Protestants and Catholics.

President Bush used all his powers to push social and economic policy to the right. Early in 2001, he signed a congressional resolution repealing Clinton-era regulations that would have better protected workers from repetitive motion injuries. He set up a White House office to funnel federal funds to religious organizations that provided social services, and he greatly restricted government-funded medical research that used the stem cells produced during an early stage of human embryo development. And he supported a voucher program by which the federal government would pay a part of the tuition of students who wanted to attend private or religious schools.

While these ideologically charged initiatives served to consolidate Bush’s evangelical base, the president placed a massive tax reduction for business and the wealthy at the top of his domestic agenda. In 2001, he asked for a package of tax breaks for the wealthy, totaling $1.6 trillion over ten years. The Senate balked, but Congress gave the president $1.35 trillion nevertheless.

When it came to foreign policy, the Bush administration was unilateralist to a far greater degree than that of any recent president, including his father and Ronald Reagan. The new administration revived work on the “Star Wars” antimissile system that had first been proposed by Reagan; it requested funds for a new generation of tactical nuclear weapons; and it signaled its intention to withdraw from the 1972 Anti-Ballistic Missile Treaty. In the Middle East, Bush administration hawks tacitly endorsed the Israeli settlement program on the West Bank and stepped up pressure against Iran, which was pursuing its own nuclear program. The Bush administration was suspicious of the United Nations and refused to join the new International Criminal Court.

The president rejected the view, endorsed by most nations and almost all scientists, that carbon dioxide emissions from power plants and motor vehicles were responsible for the global “greenhouse effect” that was warming the earth’s atmosphere. The administration therefore rejected domestic conservation measures and worked closely with oil and chemical companies to reduce costly environmental regulations. Vice President Cheney dismissed conservation measures as but a “sign of personal virtue” that had no serious role in “a sound comprehensive energy policy.” Alone among 178 nations, the United States refused to ratify the Kyoto Protocol, which addressed global warming by setting emission standards for industrialized nations. Then the Bush administration boycotted a conference in Bonn that had been called specifically to meet U.S. objections.

Conclusion: America’s Political Rift

The boldness and early success of the Bush agenda reflected the ideological and political coherence of American conservatism at the dawn of the twenty-first century. But it had not won over a majority of Americans; nor was it popular abroad. By the late summer of 2001, the stalemate that had gripped U.S. politics in the late Clinton years seemed pervasive once again. The Republicans lost their control of the Senate when a maverick GOP Vermonter, Senator James Jeffords, declared himself an independent. Bush’s approval rating in the polls stood at a remarkably low 50 percent; indeed, the very legitimacy of his presidency remained in question as the bright early days of September 2001 unfolded. But all that was about to change—and in the most dramatic and tragic fashion.

Timeline

1980

Striking workers in Poland organize Solidarity, which challenges Communist rule.

1981

IBM licenses the MS-DOS operating system, owned by a small Seattle company called Microsoft, for its line of personal computers.

1988

Soviet leader Mikhail Gorbachev introduces perestroika and glasnost reforms. Three years later, the Communist Party collapses.

1989

Germans pour through the Berlin Wall, signaling the end of Communism in Eastern Europe.

1990

President Bush breaks his “no new taxes” pledge.

1991

In Operation Desert Storm, U.S. forces launch a massive air assault against Iraq in response to its invasion of Kuwait.

1992

Democratic Arkansas governor Bill Clinton defeats President Bush and third-party candidate Ross Perot to become president.

1993

Congress enacts the employer-friendly North American Free Trade Agreement legislation.

1994

U.S. troops intervene in Haiti in an effort to restore democracy.

1995

Reformer John Sweeney is elected AFL-CIO president.

1996

President Clinton signs into law a major revision of the welfare law that ends the sixty-year commitment to poor families with dependent children.

1997

One hundred eighty-five thousand Teamsters wage a successful strike against the United Parcel Service, the first successful nationwide strike in nearly two decades.

1998

Economic growth and new tax revenues combine to turn the federal budget deficit into a surplus for the first time in nearly thirty years.

1999

The Senate acquits Clinton in the second impeachment trial of a president in the nation’s history.

2000

Nation faces thirty-seven days of uncertainty following contested presidential election between George W. Bush and Al Gore in which the U.S. Supreme Court, in a 5-4 decision, decides in Bush’s favor.

2001

George W. Bush is inaugurated as the forty-third president.

Additional Readings

For more on the new global economic order, see:

James K. Galbraith, Created Unequal: The Crisis in American Pay (1998); Doug Henwood, Wall Street: How It Works and for Whom (1997); Arlie Russell Hochschild, The Time Bind: When Work Becomes Home and Home Becomes Work (1997); Chalmers Johnson, Blowback: The Costs and Consequences of American Empire (2000); Michael T. Klare, Resource Wars: The New Landscape of Global Conflict (2001); Robert Kuttner, Everything for Sale: The Virtues and Limits of Markets (1999); Richard K. Lester, The Productive Edge: How U.S. Industries Are Pointing the Way to a New Era of Economic Growth (1998); Thomas Piketty, Capital in the Twenty-First Century (2016); Quinn Slobodian, Globalists: The End of Empire and the Birth of Neoliberalism (2018); Tom Vanderbilt, The Sneaker Book: Anatomy of an Industry and an Icon (1998); Peter Schrag, Paradise Lost: California’s Experience, America’s Future (1999); and Joshua Freeman, American Empire, 1945-2000: The Rise of a Global Power, the Democratic Revolution at Home (2013).

For more on the computer revolution, see:

Paul E. Ceruzzi, Computing: A Concise History (2012); Tung-Hui Hu, A Prehistory of the Cloud (2015); Steven Levy, Hackers: Heroes of the Computer Revolution (1994); Steven Manes and Paul Andrews, Gates: How Microsoft’s Mogul Reinvented an Industry—And Made Himself the Richest Man in America (1994); and Dan Schiller, Digital Capitalism: Networking the Global Market System (1999).

For more on the labor movement in the 1990s, see:

Leon Fink, The Maya of Morganton: Work and Community in the Nuevo New South (2007); Bill Fletcher and Fernando Gapasin, Solidarity Divided: The Crisis in Organized Labor and a New Path toward Social Justice (2008); Kim Moody, Workers in a Lean World: Unions in the International Economy (1997); Jo-Ann Mort, ed., Not Your Father’s Union Movement: Inside the AFL-CIO (1998); Bob Ortega, In Sam We Trust: The Untold Story of Sam Walton and How Wal-Mart Is Devouring America (1998); and Ray M. Tillman and Michael S. Cummings, eds., The Transformation of U.S. Unions: Voices, Visions, and Strategies from the Grassroots (1999).

For more on the Clinton Administration, see:

Kenneth Baer, Reinventing Democrats: The Politics of Liberalism from Reagan to Clinton (2000); James MacGregor Burns and Georgia J. Sorenson, Dead Center: Clinton-Gore Leadership and the Perils of Moderation (1999); David Maraniss, First in His Class: A Biography of Bill Clinton (1996); Michael Meeropol, Surrender: How the Clinton Administration Completed the Reagan Revolution (1998); Theda Skocpol, Boomerang: Clinton’s Health Security Effort and the Turn Against Government in U.S. Politics (1996); and Bob Woodward, The Agenda: Inside the Clinton White House (1995).

For more on the new immigration, see:

Elliott Robert Barkan, And Still They Come: Immigrants and American Society, 1920 to the 1990s (1996); Nancy Foner, From Ellis Island to JFK: New York’s Two Great Waves of Immigration (2000); Ruth Milkman, L.A. Story: Immigrant Workers and the Future of the U.S. Labor Movement (2006); David M. Reimers, Still the Golden Door: The Third World Comes to America (1985); and Reed Ueda, Postwar Immigrant America: A Social History (1994).

For more on race in the 1990s, see:

Michelle Alexander, The New Jim Crow: Mass Incarceration in the Age of Colorblindness (2010); Lani Guinier, Lift Every Voice: Turning a Civil Rights Setback into a New Vision of Social Justice (1998);  Cindy Hahamovitch, No Man’s Land: Jamaican Guestworkers in America and the Global History of Deportable Labor (2013); George Lipsitz, The Possessive Investment in Whiteness: How White People Profit from Identity Politics (rev. ed., 2006); and DeWayne Wickham, Bill Clinton and Black America (2002).