
Volume 2, Chapter 13

Economic Adversity Transforms the Nation, 1973-1989

Like his father before him, Steve Szumilyas worked at Wisconsin Steel on Chicago’s Southeast Side. At 4:00 p.m. on Friday, March 28, 1980, he was checking steel slabs before they went into the reheating furnace when his foreman came by with news that would shatter his world. The company would lock the gates at the end of the shift; the mill was closing, and 3,400 steelworkers were out of a job. Szumilyas was on the street. Had it not been for his wife’s new job, his family would have lost their suburban home. Steve Szumilyas would be employed again, but at only half the wages he earned in basic steel production.

Szumilyas’s layoff would prove symptomatic of the era. “Nobody,” wrote Time magazine, “is apt to look back on the 1970s as the good old days.” After a quarter-century of rapid expansion, the growth of wages, productivity, and output dropped sharply in all of the great industrial nations. Recessions became more severe and more frequent, while unemployment rose to double the average level of the immediate postwar years. Many Americans connected this economic turmoil to liberal foreign and civil rights policies. Throughout the 1970s, the radical right gained notoriety and support by attacking government activism and traditional liberalism. These conservative men and women, who decried both the rights-conscious developments of the Sixties and the pillars of the New Deal social and political order, paved the way for Ronald Reagan’s election to the presidency in 1980.

The Shifting World Economy

The economies of all the major industrial powers became more integrated in the 1970s and afterward. Oil, food, manufactured goods, money, and people circulated at a much higher level throughout the global economy. But globalization did not generate stability or prosperity. In the United States, the growth in the efficiency of the economy—what economists call productivity—dropped like a stone. Over the next quarter-century, economic growth would continue, but at an annual rate of about 2 percent—far lower than the rate during the twenty-five years after World War II. Meanwhile, West Germany and Japan, which had been bested on the military battlefield, would forge ahead in peacetime, competing successfully for the American market in steel, autos, machine tools, and electrical products. Inflation combined with low economic growth reduced corporate profits. In response, most businesses sought to drive down their costs by cutting wages, lowering their tax burden, and attempting to end the governmental regulations they found most burdensome. Many U.S. corporations moved production to low wage, nonunion regions of the South or to “offshore” sites in Latin America or Asia.

As corporate concerns moved to the center of American politics, the liberal statecraft that had animated the New Deal and the Great Society became unworkable. The Nixon and Carter administrations’ efforts to apply the fiscal remedies of economist John Maynard Keynes—adjusting taxes and government spending to dampen inflation, boost employment, and encourage business investment—proved increasingly ineffective in the years after 1973. The policy of spending to end an economic slump, which liberal economists of the 1960s had seen as a remedy for high unemployment, proved politically unattractive a decade later when federal budget deficits also stoked the inflationary fires.

The End of the Postwar Boom

With inflation levels soaring—reaching a high of more than 18 percent in one year—and the unemployment rate rarely below 7 percent, the decade after 1973 brought the postwar boom in the United States to an unsettling end. For two full decades, from the early 1970s to the early 1990s, real wages stagnated for most Americans, and for young males, they actually dropped by 25 percent. At the end of these two decades, a freshly minted male college graduate could look forward to earning only slightly more each year than a typical high school graduate of the previous generation. Family income increased during these decades of slow growth, but largely because many women and teenagers took paying jobs and because all Americans worked longer hours. Meanwhile, income inequalities widened dramatically. Top corporate executives earned about 20 times as much as ordinary workers did in the 1960s; thirty years later, the multiplier was an astounding 115 times.

What accounted for this new era of economic hard times and social inequality? To begin with, the Vietnam War’s spiraling costs set off the first of several great waves of inflation, which soon made U.S. products more expensive than those of other countries. In 1971, the amount that was paid to foreign producers for imported goods exceeded that paid for U.S. exports; for the first time in the twentieth century, the United States registered a balance-of-payments deficit. Consequently, the United States could no longer maintain the dollar as the world’s currency standard, so in August 1971, President Nixon allowed the value of a dollar to “float”—to go up or down in relation to the price of gold and to other currencies according to the world’s shifting economic and political currents. In effect, Nixon had devalued the dollar, which made U.S. exports cheaper but imports more expensive. Soon, imported oil became much more costly. In October 1973, in the midst of a war between Israel and its Arab neighbors, Arab oil producers declared an embargo on oil shipments to the United States and Western Europe. The Organization of Petroleum Exporting Countries (OPEC) exploited the resulting shortage to quadruple prices, from roughly $3 to $12 a barrel. Just five years later, after a revolution in Iran toppled the pro-Western monarchy there, oil prices climbed above $30 a barrel.

Although many economists and pundits thought that the oil shortage signaled the onset of a worldwide drop in the production of fossil fuels, the energy crisis of the 1970s was actually a political phenomenon. The marketplace reflected the U.S. defeat in Vietnam and the subsequent shift in power from Western consumer countries to the Latin American and Middle Eastern states that controlled OPEC and other producer cartels. In 1974, the disruption of energy supplies and the dramatic rise in the price of oil led to fuel shortages and odd-even gasoline rationing schemes in many states. Panicky motorists lined up at gas stations to buy a commodity that most had taken for granted only weeks before; in the Northeast, thousands of truckers blocked interstates demanding a price rollback. Conservation measures, the discovery of new oil fields, and a reassertion of Western economic power would end the worldwide energy crisis about a decade later, but while the oil shortage lasted, Americans began to feel increasingly insecure, hostage to economic forces well beyond their control.

As the oil shock reverberated through the U.S. economy, the rising price of gas and oil forced many energy-reliant industries to close. In 1974 alone, factory output fell 10 percent, and unemployment nearly doubled. Inflation rose into the double digits, eroding the value of pensions and paychecks. Economists coined a new term—“stagflation”—to describe this unusual mix of economic problems: low levels of economic growth, high unemployment, and persistent inflation. Overnight, Americans became far more pessimistic. “You always used to think in this country that there would be bad times followed by good times,” commented a Chicago housewife. “Now, maybe it’s bad times followed by hard times followed by harder times.”

Higher oil prices and growing competition from foreign producers would have been less of a problem had not thirty years of Cold War militarization distorted key sectors of the U.S. economy. Encouraged by huge government procurement contracts, American businesses—especially electronics and aviation—had focused their capital resources and technological know-how on producing armaments. Such massive spending provided employment for large numbers of defense workers in the postwar years. But ultimately, the nation’s huge military budget—on a proportionate basis twice as great as Germany’s and seven times that of Japan’s—sapped America’s productive strength, diverting resources from the development of commercially competitive products.

Worse, the expertise that developed sophisticated, high-tech military products was not readily transferable to the increasingly competitive consumer market. For example, during the 1960s, managers at the venerable Singer Sewing Machine Company, whose product stood for Yankee ingenuity in every hamlet from Spain to Surinam, focused corporate research efforts on missile and warplane guidance systems. Singer’s share of the world sewing machine market spiraled downward when the company failed to retool its U.S. factories and lost its reputation for high-quality production. Sewing machines from Sweden and Korea took over the American market, forcing Singer to close its flagship New Jersey factory in 1979. By the 1980s, Singer manufactured sewing machines only in Hong Kong.

Finally, a legacy of the 1960s, the growing rights consciousness in the workplace, created legal and regulatory difficulties for many U.S. companies. Workplace safety and health as well as the equitable treatment of women and African Americans became inflammatory issues during the 1970s. A new feminist consciousness in the workplace helped to generate laws and court decisions covering areas of interpersonal relations and employer-employee contact that had once been considered exclusively private. The Equal Employment Opportunity Commission received more than one hundred thousand complaints per year. Only later, after much debate and protest throughout the 1970s and 1980s, did Congress enact the Americans with Disabilities Act in 1990 and the Family and Medical Leave Act in 1993, both of which prohibited employers from discriminating against their employees because of physical incapacity or absence from work because of childbirth or family emergency. Thus, the hiring, pay, promotion, and layoff of employees became subject to governmental review and private litigation to an extent that was unmatched even during the heyday of the industrial union movement three decades earlier.

The ecological consciousness that emerged after 1970 also demonstrated the extent to which a sense of democratic empowerment had transformed the corporate economy. Until the early 1970s, nuclear power seemed the embodiment of technological progress, national security, and economic efficiency. Powerful, authoritative supporters in Congress, industry, and science argued that the “mighty atom” would end U.S. dependence on foreign oil and dirty coal. But commercial nuclear power proved neither as safe nor as cheap as its advocates claimed, and some “experts,” radicalized by the Vietnam War, defected from the pronuclear consensus. When thousands of antinuclear protestors took to the streets in the late 1970s, numerous engineers and scientists bolstered the credibility of this new social movement. Dramatic confirmation of such fears came in 1979 when a near-meltdown at Pennsylvania’s Three Mile Island utility near Harrisburg forced the evacuation of 100,000 people. Radioactive contamination outside the facility seemed slight, but the actual cleanup of the reactor proved a time-consuming, billion-dollar project. Soon after the incident at Three Mile Island, thirty energy companies canceled plans to build nuclear reactors.

A CLOSER LOOK: Expanding Disability Rights Activism

The New Shape of American Business

With economic hard times in the offing, many conservatives in the business community and in politics argued that the nation would have to shift resources from labor to capital and cut back the environmental programs and regulatory laws that investors and industrialists thought burdensome. Thus, Business Week editorialized in 1974, right after the first oil shock: “it will be a hard pill for many Americans to swallow—the idea of doing with less so that big business can have more.” Corporate profit rates had dropped by about one-third from their level of the previous two decades; in manufacturing, profits stood at only about one-half of their earlier level. Although an excess of managers, the growth in foreign competition, and the rise in energy prices accounted for the lion’s share of the difficulties, the new rights-conscious activism proved highly visible and vexing. In the decade before 1975, Congress passed more than twenty-five major pieces of regulatory legislation that required 40,000 new federal workers to administer. In response, corporations stepped up their efforts to influence government decisions, increasing fivefold their lobbying operations in the nation’s capital.

For generations, American firms had periodically moved their factories from one state to another to take advantage of low wages and cheap land. In the 1970s, this trend accelerated as firms moved out of the Northeast and into the Sunbelt, a broad crescent stretching south from Virginia to Florida and west through Texas and southern California. Federally funded superhighways and more efficient telecommunications linked this vast region to the population centers of the North and enabled firms to build small, highly efficient factories and warehouses. Generous tax incentives encouraged the move, and the introduction of air-conditioning made the region more suitable to office work. Finally, the massive influx of Latin American and Southeast Asian workers into Florida, Texas, and California offered corporations an even larger pool of cheap Sunbelt labor. By the 1980s, formerly rural North Carolina had the highest percentage of manufacturing workers of any state as well as the lowest blue-collar wages and unionization rates in the country. The North lost more than manufacturing jobs. The computerization of clerical work allowed large financial service firms such as Merrill Lynch, American Express, and Citibank to shift many operations to the South and the West.

If jobs could be moved to Texas, they could also be shifted to Mexico, Taiwan, and Indonesia. Until the 1960s, U.S. investment in Latin America and the Pacific basin focused largely on the extraction and processing of raw materials that were mined or grown in those regions. But beginning in the 1970s, a number of American firms produced some of their most sophisticated components in low-wage foreign factories. Between 1971 and 1976, manufacturers of color TVs shifted more than 90 percent of their subassembly production to Asia. Such foreign “outsourcing” of high-value goods and parts sustained the profitability of many “American” product lines. But in the long run, such policies eroded the U.S. manufacturing base and the technical expertise of its workers, managers, and engineers.

Such trends reshaped the business landscape in general and the fates of two great companies in particular. Until the 1970s, General Motors was the largest manufacturing company in the world. But the corporation’s failure to build fuel-efficient or stylish cars cost it more than 15 percent of the entire U.S. automobile market in a single decade. Because of its cumbersome, hierarchical bureaucracy, GM, which had once been the most cost-efficient of all automobile firms, now lagged behind both foreign rivals and U.S. competitors. In response, GM managers closed more than twenty factories and outsourced billions of dollars in parts production to low-wage firms in the American South and Mexico. The corporation’s blue-collar payroll fell by half, devastating once bustling centers of GM production such as Flint and Pontiac in Michigan.

The rise of the retailing giant Wal-Mart foretold which type of companies would survive the postboom economy. At the end of the 1980s, Wal-Mart was the nation’s leading retailer, with a payroll that was second in size only to that of the U.S. Postal Service. When Sam Walton founded Wal-Mart early in the 1960s, Sears, Woolworth’s, and other long-established chains that dominated the big cities and their nearby suburbs dwarfed this pint-sized, Arkansas-based retailer. But with the completion of the interstate highway system and the growth of exurban communities in the Sunbelt, the big shopping centers lost much of their convenience and appeal. Walton therefore located his stores in the once-neglected small-town hinterlands, especially in the South and Midwest, where land was cheap, labor was nonunion, and competition was limited. Although Walton gave his firm a “just folks,” small-town image, Wal-Mart was among the most sophisticated of global corporations, importing huge quantities of clothing and household goods from fifty countries. Managers used the latest computer technology to track sales, minimize inventory expenses, and squeeze suppliers. Wal-Mart’s success made the Walton family the richest in the nation, but the company’s rapid growth bankrupted thousands of independent merchants whose viability sustained Main Street life throughout small-town America.

Stagflation Politics: From Nixon to Carter

Of the three presidents who sought to revive the fortunes of American capitalism during the 1970s, Republican Richard Nixon actually presided over the most “liberal” administration. During his tenure, Congress passed, and the president signed, laws indexing Social Security payments to inflation, extending unemployment benefits, and regulating oil and gas prices during the first energy crunch. Declaring himself a “Keynesian,” Nixon also used government power to directly attack both inflation and the growing trade imbalance. In August 1971, for example, when the Nixon administration devalued the dollar, it also froze wages and prices and raised the tariff on foreign cars. The administration designed this “New Economic Policy” (NEP) to keep a lid on pay increases and rein in organized labor but also to avoid the kind of massive unemployment and high interest rates that a later generation of more conservative policymakers would routinely adopt as the orthodox anti-inflation remedy.

President Nixon’s economic activism gave him enough inflation-free breathing space to win the 1972 election, but his NEP proved unequal to the profound shifts that were transforming the world economy. When U.S. price controls were eliminated in 1973, the cost of living rocketed upward, fueled by sharp increases in the prices of grain, oil, lumber, and other internationally traded commodities. Vice President Gerald Ford had the misfortune to move into the Oval Office in August 1974, just weeks before the nation plunged into the deepest recession since the Great Depression. In an October 8, 1974, televised speech, he announced a voluntary anti-inflation campaign that he called “Whip Inflation Now” (WIN), exhorting Americans to sign pledge forms to save energy and reduce waste, but the program ultimately proved unsuccessful. As production declined by more than 10 percent in 1974 and 1975, nearly one-tenth of the workforce became unemployed. Ford’s unimaginative, passive response reminded many people of the kind of economic conservatism that had prevailed in the White House during the 1920s. Ford vetoed most congressional efforts to increase countercyclical spending on education, jobs, and infrastructure construction.

Gerald Ford’s ineffectual domestic leadership gave James Earl Carter, a little-known former governor of Georgia, the opportunity to win the Democratic presidential nomination in 1976 and then edge his way into the White House. A Christian moralist and a technocrat who had trained in nuclear engineering in the U.S. Navy, Carter argued that his lack of experience in Washington gave him a fresh, honest perspective. “I will never lie to you,” he told campaign audiences in a not-so-subtle reference to the Watergate scandal. Although he was one of the more conservative Democrats who campaigned for the presidency that year, Carter had broken with the tradition of southern racism. Andrew Young, a Georgia-based civil rights leader, served as his confidant, and Carter appointed numerous veterans of the 1960s civil rights movement to posts in his administration.

Jimmy Carter’s single-term presidency failed because he never managed to tame the double-digit inflation that frightened so many Americans. With few ties to organized labor or to traditional liberals, Carter won the White House by appealing to the anti-incumbent mood that dominated political life in the years following Watergate and President Ford’s unpopular 1974 pardon of ex-president Richard Nixon. Carter advocated energy conservation, but he had neither the will nor the political leverage to impose Nixon-style price controls. Carter also rejected most efforts to restart Great Society–like social welfare initiatives, including Democratic proposals for national health insurance and federal programs to promote full employment and fund abortion services for poor women. And he gave only tepid support to the labor movement’s 1978 effort to reform the National Labor Relations Board. An increasingly well-organized and outspoken business community successfully lobbied against the bill, which would have reduced employers’ ability to resist union-organizing drives.

Instead, Carter turned to a radical deregulation of the airline, trucking, railroad, and telephone industries to curb wages and prices. New Deal era liberals thought that enterprises in these industries required close government supervision, either because of the vital services they rendered or because of their inherent instability. But Carter Democrats had come to see such price and market regulation as economically inefficient and hostile to consumer interests. Administration officials believed that prices would fall and service would improve if they took a laissez-faire approach.

The Carter administration’s conservative tilt became apparent in its rescue of the Chrysler Corporation, which in 1979 stood at the brink of bankruptcy. Chrysler maintained factories that were aging and inefficient compared to those of Japan, resulting in equally outmoded automobiles. The financially troubled corporation needed billions of dollars to retool, but the big banks would not extend credit. Chrysler executives argued that only a government loan guarantee—a federal bailout—could save the company and thousands of well-paid jobs. Such loan guarantees were not new; in 1971, the Lockheed Corporation, one of the nation’s major defense contractors, had secured this type of federal help. But the conditions under which Washington guaranteed the Chrysler loan opened the door to a further decline in the standard of living of millions of American workers. Together with the big banks, federal officials demanded that Chrysler’s workers make hundreds of millions of dollars in wage concessions as part of the bailout package. As Chrysler’s president, Lee Iacocca, explained in the midst of the crisis, “It’s freeze time, boys. I’ve got plenty of jobs at seventeen dollars an hour; I don’t have any at twenty.” The leadership of the United Auto Workers convinced autoworkers that such concessions were the only way to save their jobs. Chrysler went on to earn record profits in the mid-1980s, but the Chrysler bailout proved to be the first in a long wave of concession contracts and wage rollbacks that swept through almost every unionized industry.

Carter broke with traditional economic policy again when he appointed Paul Volcker, a conservative Wall Street banker, as the chairman of the powerful Federal Reserve Board in 1979. Volcker’s appointment signaled the death of Keynesianism as a government policy tool. The president abdicated stewardship of the economy in favor of Volcker, who instituted a set of “monetarist” policies that severely restricted the growth of the money supply and thereby pushed interest rates toward 20 percent—their highest level since the Civil War. By the early 1980s, his program had cut the annual inflation rate from 12 percent to 4 percent.

Volcker’s monetarism had a huge impact, especially on the goods-producing sectors of the economy. Big-ticket consumer items became far more expensive. “Those interest rates have killed me and my business,” complained Bruno Pasquinelli, whose Illinois homebuilding firm faced near-bankruptcy. High interest rates also pushed up the value of the dollar against foreign currencies, making American cars, steel, and electronic products even less competitive overseas. Wave after wave of plant closings swept through the Midwest and Middle Atlantic states. In 1982 alone, 2,700 mass layoffs eliminated more than 1.25 million industrial jobs. Cities such as Youngstown, Buffalo, Cleveland, Gary, Milwaukee, and Detroit, once the industrial crown jewels of the nation, now exemplified a declining “Rust Belt.” Almost 11 percent of the U.S. workforce was unemployed, the highest proportion since 1940.

This blue-collar depression struck older male breadwinners with particular force. When plants closed, many of these workers experienced a deep sense of loss and a feeling that their inability to “bring home the bacon” reflected on their masculinity. “I’ve had to change my life-style completely,” reported Pete Jefferson, an African American who lost his job when his steel mill closed its doors. “I come from a Southern family. They always looked up to me because I’d done so well financially. I used to be head of the family; now I’m just a member.” Alcoholism, depression, and divorce grew more frequent in the months and years that followed a factory shutdown. Few blue-collar men over age forty were able to retrain; most would find new work, but rarely at the same high levels of pay or with the same pension and health care benefits.

The Nation Moves to the Right

Many of these blue-collar workers would turn to the Republican Party, helping to push American politics to the right in the late 1970s. The multiple traumas of that decade—Vietnam, Watergate, the oil shocks, and the Iranian hostage crisis—generated a pervasive sense of cynicism and alienation. In the quarter-century after 1973, the percentage of Americans who agreed with the statement “The best government is the government that governs the least” nearly doubled, to 56 percent. Not unexpectedly, electoral participation dropped sharply; many Americans who had traditionally supported an activist, governmental solution to the nation’s problems now stayed home on election day. Among those who did vote, conflicts over race, gender, and sexuality offered many working-class white people the opportunity to vote for conservative politicians, who promised not to bus children outside their neighborhoods in the name of desegregation or raise taxes to benefit “welfare queens.” In what seemed to be an increasingly unstable geopolitical environment, American foreign policy also moved to the right. Beginning even before Ronald Reagan took office, top policymakers abandoned détente and projected a more aggressive posture in the Cold War conflict.

The Rise of the New Right

After 1968, barely half of all potential voters cast ballots in presidential elections—about one-third fewer than had voted in the New Deal era or the early 1960s. This withdrawal from the electoral process was concentrated among working people and the poor, whose disappearance from the voting rolls pushed all American politics to the right. The decline in voter turnout had many sources, but two stand out. First, institutions that traditionally linked individual voters to national politics, such as trade unions and urban political machines, had become far less influential. The professionally crafted thirty-second television spots and computer-generated direct-mail fund-raising letters that replaced these grassroots institutions proved ineffective in mobilizing those at the bottom of the social ladder. Second, political participation declined among poor and working-class voters because the Democratic Party failed to offer alternative policies around which these voters, once its most loyal supporters, might be mobilized. Between the end of the 1960s and the end of the 1970s, the number of Americans who agreed with the polling statement that the “people running the country don’t really care what happens to you” shot up from 26 to 60 percent.

Political demobilization among once-stalwart supporters of the Democratic Party was soon matched by the rise of a “New Right,” which made a powerful bid for the allegiance of many of these same voters. For most of the twentieth century, political conservatism in the United States had been closely linked to the views of affluent white Anglo-Saxon Protestants, who looked with some disdain on Black people, Catholics, Jews, unionists, and immigrants. This brand of “Old Right” conservatism mistrusted activist government, denounced international Communism, and defended laissez-faire economics. Although traditional conservatism did not disappear in the 1970s, its elite spokesmen lost much of their influence to a New Right, dedicated to mobilizing the body politic against secular culture, feminist ideas, and the government social programs that had first been launched during Lyndon Johnson’s presidency.

The New Right grew in response both to the decline in popular confidence in the nation’s institutions and to the transformations that were taking place in American culture. Social and ideological changes during the 1960s challenged what many people saw as their traditional values and proper place in the social hierarchy: the father-centered family, the Christian character of American public life, and an unproblematic patriotism. Thus, the New Right appealed to millions of Americans who had once been stalwart Democrats: white southerners, urban Catholics, and disaffected unionists.

But the New Right was not simply a backlash against “the Sixties”; a militant brand of conservatism had begun to flourish long before the end of that decade. Barry Goldwater’s 1964 presidential campaign was a crystallizing event, transforming the nascent New Right from a circle of collegiate intellectuals into something of a broad political movement. Key conservative leaders of the 1970s and 1980s, including columnist George Will, Chief Justice William Rehnquist, Equal Rights Amendment opponent Phyllis Schlafly, radio and TV pundit Pat Buchanan, and direct-mail entrepreneur Richard Viguerie, had all been passionate Goldwater partisans. Raising millions of dollars for New Right initiatives, Viguerie proved a particularly imaginative organizer, combining the Goldwater campaign donor list with that of the George Wallace partisans. Soon, he gained power in the Republican Party. “Direct mail,” asserted Viguerie, “is like having a water moccasin for a watchdog. Silent but deadly.”

Conflicts abroad also helped conservatives at the polls. For decades, Iran’s monarch Mohammad Reza Shah Pahlavi had been a bulwark of U.S. influence in the Persian Gulf. But the billions of dollars in oil revenue that washed over his nation of thirty-five million had set off explosive tensions, mobilizing both secular radicals, who fought for a constitutional democracy, and Islamic fundamentalists, who sought to impose an anti-Western theocracy. As the Shah’s army disintegrated, Islamic religious leaders (ayatollahs) led by the exiled Ayatollah Ruhollah Khomeini consolidated their power. The 1979 Iranian revolution precipitated a second oil shock, which tripled world oil prices and further strained the already weak American economy. Islamic militants added to U.S. woes when they seized the U.S. embassy in November 1979 and held fifty-two embassy personnel hostage for 444 days. A long-running American Broadcasting Company TV news show, The Iran Crisis: America Held Hostage (which ultimately became Nightline), reflected a widespread sense of U.S. impotence and frustration, which was exacerbated by a failed rescue mission during which eight U.S. commandos died.

Events in Afghanistan compounded the situation. A pro-Soviet faction had gained power there, but Communist rule was tenuous, especially after Islamic radicals seized power in neighboring Iran. To forestall the collapse of their Afghan clients, the Soviets airlifted thousands of troops into the capital, Kabul, in December 1979. President Carter called the invasion the “gravest threat to peace since 1945” and demanded an increase in the military budget, the reinstatement of draft registration for eighteen-year-old men, and the speedy development of a new generation of medium-range missiles for deployment in Europe. In Afghanistan itself, the Central Intelligence Agency shipped sophisticated arms to Muslim guerrillas, who soon stalemated more than 100,000 Soviet troops.

The Iranian hostage situation and the Soviet presence in Afghanistan humiliated Carter. He had come into office promising to deepen détente (see Chapter 12), but such foreign relations disasters convinced many conservatives and some liberals that détente was a poor bargain for the United States. Senator Henry Jackson of Washington led a faction within the Democratic Party that opposed both trade liberalization with the Soviets and a new arms agreement until the Kremlin respected the human rights of Jews and other minorities within the Soviet Union. Among Republicans, a powerful nationalist current also opened a breach within party ranks. Therefore, when President Carter sent to the Senate a treaty relinquishing U.S. sovereignty over the Panama Canal, ratification proved contentious, even though diplomats of the Nixon-Ford State Department had negotiated most of the agreement. The treaty was ultimately ratified, but by the last year of his presidency, Carter’s foreign policy lay in ruins.

Revolt Against Taxes and Busing

The civil rights revolution had a huge impact on the urban white working class in many Northern cities, among them Irish Americans in Boston, Slavic Americans in South Chicago, and Italian Americans in Brooklyn. Many of them now lived in cities that were presided over by Black mayors such as Cleveland’s Carl Stokes, elected in 1967, and Detroit’s Coleman Young, who began a twenty-year tenure in 1973. As long as cities were prosperous and schools were well funded, it seemed possible that a multiracial set of urban institutions might emerge with relatively little social tension. But economic hard times in the 1970s, combined with the decline of the cities and their school systems, made it virtually certain that racial conflict would erupt and that white working-class voters would shift rightward in response.

Tensions exploded over court-ordered busing to achieve racial balance in the schools. By the early 1970s, racial integration of public schools in the rural South had largely ended the state-supported dual system there, but in most large urban areas, segregated schools continued to mirror the racial divide that persisted in residential neighborhoods. To remedy such de facto segregation, courts often ordered local school boards to institute cross-neighborhood busing. Despite the loss of familiar neighborhood schools, African American parents supported these plans, hoping that their children’s attendance at resource-rich, formerly all-white schools would enhance their educational opportunity and performance. But in Pontiac, Michigan; Louisville, Kentucky; and Kansas City, Missouri, busing programs quickly generated white opposition, sometimes leading to school boycotts and violence.

The most spectacular clash over school busing came in Boston, the city that had spawned nineteenth-century movements for free public education and the abolition of slavery. For years, the Irish American–dominated Boston school board had kept the city’s schools racially segregated. In 1974, after a long, NAACP-initiated legal battle, a federal district court issued a sweeping integration order that mandated, among other remedies, the busing of pupils from all-Black Roxbury to South Boston, an economically declining Irish section of the city. Federal courts in Massachusetts, as elsewhere, excluded from integration plans the white, middle-class suburbs, so the burden of the busing plan fell almost entirely on the children of the urban working class, both Black and white.

The stage was set for an ugly confrontation when the first African American students were bused across town. For three years, Boston police struggled to protect Black children from angry white parents screaming, “Nigger go home!” More than 20,000 white students left the Boston public schools to escape desegregation. Boston’s inflamed racial climate finally subsided in the 1980s when a new generation of politicians, both Black and white, launched a set of biracial electoral campaigns to defuse urban tensions. The busing controversy also faded from the news, both because a new cohort of conservative judges backed away from the tactic and because the racial integration of most big-city school systems had become unworkable. White flight and the growth in immigrant and minority populations gave most urban school systems a substantial African American and Latinx majority enrollment.

Because of the enfranchisement of millions of African American voters, direct appeals to racial intolerance largely vanished from public political discourse during the 1970s and 1980s. High public officials who used words such as “nigger,” “Jap,” or “kike” invariably apologized for their gaffes. But American political discourse remained saturated with an array of code words, phrases, and substitute issues that indirectly expressed white racial prejudice. Urban crime, escalating drug use, and the growth of public assistance programs were real issues, but much of the rhetorical denunciation of “welfare queens” (women who grew wealthy by taking advantage of the welfare system), “drug lords,” and “forced busing” gave the public debate a thinly disguised racial edge.

All this helped to discredit many government functions and fuel a series of antigovernment tax revolts. In 1978, passage of California’s Proposition 13, a ballot initiative limiting property taxes and slashing local government revenues, signaled the ability of New Right conservatives to turn the tax issue against liberal governance itself. As one state legislator put it, Proposition 13 was “a bullet from a loaded gun that went off in California on its way to its ultimate target—the high level of Federal spending.” Soon conservative activists were mounting antitax campaigns in Michigan, Idaho, Nevada, Massachusetts, Oregon, and Arizona. They argued that at issue was not only the fairness of the tax system, but also wasteful government expenditures for education, welfare, and other social programs. Most of the tax savings went to business, not to ordinary taxpayers, but the antitax agitation of these years had a profound impact on civic life. Combined with the stale taste left by Watergate, it helped to mobilize public sentiment against the government’s assuming responsibility for pressing social and economic problems.

Gender Politics

School busing and taxes were not the only issues that mobilized conservatives in the 1970s. Explosive moral and cultural questions about the role of women and the status of those in LGBTQ+ communities proved just as powerful. Before the 1970s, most evangelical Protestants avoided politics, which they saw as hopelessly corrupt. But court rulings that legalized abortion, curbed school prayer, and deprived segregated Christian academies of their tax-exempt status unleashed a wave of activism. In the South, the Carter administration’s efforts to eliminate tax breaks for hundreds of new religious academies mobilized thousands of Protestant conservatives to participate in Republican Party politics, the Moral Majority, and other Christian political groups.

For Protestant fundamentalists, gender issues took on the air of a religious war. In these conflicts, denominational lines had less meaning than did the split between theological liberals and conservatives. The former had little quarrel with the nation’s pluralist, secular culture; the latter, regardless of denomination, saw the United States as an increasingly amoral nation in which the difficulties that it faced at home and abroad were but the outward sign of an inner debasement. Perhaps for this reason, evangelical Christianity enjoyed an extraordinary renaissance in the 1970s among both Black and white Americans. Between 1965 and 1985, membership in liberal Protestant churches declined, but the conservative Southern Baptists, America’s largest Protestant denomination, gained three million members. By the start of the 1980s, more than forty-five million Americans considered themselves fundamentalists. Scores of congregations moved to the suburbs and built huge new churches, often with money donated by the Sunbelt’s energy, real estate, and banking entrepreneurs. Evangelical ministers, such as Virginians Jerry Falwell, founder of the politically influential Moral Majority, and Pat Robertson, used the latest in television technology and programming to spread their conservative message well beyond the traditional southern Bible Belt.

Three gender issues served as New Right cultural and religious lightning rods: the U.S. Supreme Court’s 1973 decision in Roe v. Wade, which legalized abortions; the feminist-backed effort to pass the Equal Rights Amendment (ERA) to the U.S. Constitution; and the increasing rights consciousness and public visibility of LGBTQ+ Americans.

Before 1973, abortion was legal in some states but illegal or tightly restricted in most. Feminists and civil libertarians argued, and the Supreme Court came to agree, that governmental prohibitions against a medical abortion were not only unenforceable, but also a violation of a woman’s right to privacy, especially during the first trimester of pregnancy. Most women who sought abortions were not vocal feminists, but conservatives linked the exercise of the new abortion rights to what they perceived as the sexual licentiousness of the 1960s. They denounced legalized abortion as murder of the unborn, a spur to sexual promiscuity, and, as one activist put it, an attack on “the right of a husband to protect the life of the child he has fathered in his wife’s womb.”

Antiabortion forces across the country rallied quickly after the Roe v. Wade decision. In the North, Catholic Church leaders organized the first antiabortion demonstrations. Among many Catholics, as among most evangelical Protestants, the degree of opposition to abortion and the defense of what many saw as the sanctity of “God-given” gender roles within the family determined the depth of one’s religious commitment. In state and local jurisdictions, “pro-life” groups waged a vigorous legal and legislative campaign to restrict abortion rights when they could not eliminate them outright. Activists often picketed abortion clinics and courted arrest in order to stigmatize this medical procedure and the doctors who performed it. But across the picket lines, they faced an equally fervent “pro-choice” movement, whose members argued that the right of a woman to choose an abortion was fundamental to her dignity and citizenship.

The ERA, which both houses of Congress approved in 1972, also proved to be a controversial issue in the 1970s and early 1980s. The proposed amendment simply stated that “equality of rights under the law shall not be denied or abridged by the United States or by any State on account of sex.” It thus ratified, in symbolic and legal terms, the new roles that women were exploring and the new gender egalitarianism that was reshaping so many aspects of American life. In education, for example, coeducation came to scores of all-male colleges and universities, including Yale, Dartmouth, Princeton, the University of Virginia, and the military academies. And when Congress passed Title IX in 1972, requiring schools and colleges that received federal funds to treat male and female students equally, the skill and visibility of women in competitive sports leapt forward.

Although many Republicans had once been staunch supporters of the ERA, a new generation of social conservatives attacked it as little more than a proxy for the entire feminist agenda. As debate on the proposed constitutional amendment rolled through key state legislatures, New Right leaders such as Phyllis Schlafly and Jerry Falwell organized thousands of activists against it. Falwell told his large television audience, “In families and in nations where the Bible is believed, Christian women are honored above men. . . . The Equal Rights Amendment strikes at the foundation of our entire social structure.” Twenty-eight state legislatures approved the ERA by the end of 1973, but public opinion turned against the amendment during the remainder of the decade—so much so that five states eventually rescinded approval. It never became part of the U.S. Constitution because supporters could not win passage in the three-quarters of all state legislatures (thirty-eight) necessary for final ratification.

The key to the New Right’s victory in this battle lay in the very different meaning the idea of women’s equality held for men and women depending on age, class, and economic expectations. Many working-class men feared that passage of the ERA would undercut whatever control they still possessed over their work and family lives. For their part, many working-class women, especially those in the South and Middle West, defended a home-centered “separate sphere” as the key to their sense of dignity and self-worth. Many of them did not identify with the feminist leaders who had injected the ERA into national politics. One otherwise liberal trade unionist explained that she distrusted Ms. magazine publisher Gloria Steinem because “I think maybe she looks above us. I feel she’s fighting for women like herself, professional women. . . . So I don’t consider myself part of her movement.”

The new visibility and rights consciousness projected by LGBTQ+ Americans also polarized American politics and culture. In the surge of “gay pride” that followed the 1969 Stonewall Inn riot in New York City, many in the LGBTQ+ community expressed their sexual orientation with an openness that had been denied to previous generations. They built a new kind of urban counterculture, which included bars, newspapers, and magazines, as well as numerous social and political groups for self-identified LGBTQ+ people. The once-buried history of LGBTQ+ Americans came alive in an outpouring of books and movies. For the first time, local politicians acknowledged a definable LGBTQ+ vote. In the 1970s, the victorious mayoral campaigns of George Moscone in San Francisco and Edward Koch in New York City benefited from the support of a mobilized LGBTQ+ electorate.

Horrified, fundamentalist Christians attacked the public display of same-sex activity as blasphemous. In 1977, Anita Bryant, a popular singer and advertising celebrity, won national attention when she campaigned against the specter of “militant” LGBTQ+ people corrupting young students in Florida’s public schools. Bryant spearheaded the successful movement to repeal a Dade County ordinance that protected LGBTQ+ people from employment discrimination. Physical attacks on members of LGBTQ+ communities increased, including attacks against high-profile leaders. In 1978, Dan White, a conservative former San Francisco politician, assassinated Mayor Moscone and Harvey Milk, the city’s first openly LGBTQ+ member of its Board of Supervisors. When a jury, accepting White’s diminished-capacity defense, convicted him only of manslaughter rather than murder, San Francisco’s LGBTQ+ community erupted in a night of street violence.

The New Right’s hostility toward same-sex sexuality intensified in the early 1980s when AIDS, the deadly acquired immune deficiency syndrome, began to ravage LGBTQ+ communities of San Francisco, New York, Los Angeles, and other big cities. Within a decade, more than 100,000 people had died, while another two million (LGBTQ+ and straight) were infected with HIV (human immunodeficiency virus), the virus that causes AIDS. To many heterosexual Americans, not only those in the New Right, AIDS seemed less a disease than a moral judgment on the LGBTQ+ lifestyle. But the AIDS epidemic soon spread well beyond the LGBTQ+ community, first to intravenous drug users and then to heterosexuals, the latter reminding moralists that LGBTQ+ people had no monopoly on promiscuity. A burst of LGBTQ+ community health care activism late in the 1980s won LGBTQ+ Americans respect within the public health community and greater awareness of LGBTQ+ issues and rights. By the mid-1990s, the promotion of safe sex, along with the deployment of new medicines, had limited the devastating impact of the AIDS epidemic within the United States.

A CLOSER LOOK: ACT UP: Silence=Death

The Reagan Revolution and Economic Disparity

The rise of the New Right, the demise of détente, and the persistence of stagflation doomed the presidency of Jimmy Carter and opened the door to Republican Ronald Reagan’s sweeping victory in the 1980 presidential contest. Born in Illinois, Reagan achieved modest fame as a Hollywood actor in the 1930s and 1940s. He was then a prolabor, New Deal liberal who was active in the Screen Actors Guild. But after World War II, Reagan sided with conservative anti-Communists during a violent set of film industry strikes. In the 1950s, as a corporate spokesperson for General Electric, one of the nation’s most aggressively antiunion firms, Reagan became an active Republican. Elected governor of California in 1966, he served two terms, during which he fought to slow the growth in state spending for health, education, and welfare. He won national attention as a bitter opponent of the student movement on California campuses and as a strong supporter of the Vietnam War. By the time he left the governor’s mansion in 1975, he was the de facto leader of the resurgent Republican right. The movie star turned political superstar won the presidency with a promise of a new economic program to end the stagflation of the 1970s. Under Reagan, tax cuts and defense spending boosted some regional economies. But tax breaks for corporations and the wealthiest Americans increased the overall disparity between the rich and poor, as unions continued to struggle against foreign competition and domestic hostility to the labor movement.

Reagan’s Presidency

Reagan was by far the most conservative figure in the 1980 presidential race. He easily defeated the mainstream Republican candidate, ex-CIA director George H. W. Bush, for the Republican nomination; and in the general election, he marginalized support for the GOP moderate John Anderson, who ran as an independent. Reagan attacked détente, emphasizing the need to increase the military budget and project American power abroad. He took skillful advantage of the nation’s economic difficulties to denounce government efforts to manage the economy and regulate business. Reagan opposed corporate taxation outright and advocated sharply lower personal tax rates for the rich. In rhetoric, if not always in practice, he supported the New Right’s conservative social agenda. “Government is not the solution to our problem,” he asserted in his first inaugural address, “government is the problem.”

The Republicans won the election by retaking the white South from the Democrats and increasing their victory margin among middle-class, suburban voters. Most strikingly, Reagan captured the votes of half of all blue-collar workers and more than 40 percent of union households. These “Reagan Democrats,” who had once been core supporters of the New Deal and the welfare state, now helped to tilt American politics against new taxes and social spending initiatives. Only African Americans voted solidly Democratic. On Reagan’s coattails, the Republicans gained twelve Senate seats, giving them a majority in that body for the first time since the early 1950s. The House maintained a slim but dispirited Democratic majority.

Reagan and the Republicans promised to transform American politics in a fashion just as sweeping as that inaugurated by Roosevelt’s New Deal nearly fifty years earlier. Declaring the Soviet Union an “evil empire,” Reagan won a 40 percent increase in arms spending, including expensive new weapons systems such as the “Star Wars” antimissile shield. Reaganite intellectuals, such as U.N. ambassador Jeane Kirkpatrick and State Department official Elliott Abrams, defined most insurgencies in developing countries as Soviet-inspired, Soviet-supported terrorism. The Reagan administration therefore adopted a “rollback” strategy that targeted revolutionary movements in Africa and Latin America. The administration also sought to overcome what some pundits called the “Vietnam syndrome,” which was defined as a lingering reluctance to commit U.S. military forces abroad. To this end, the administration increased military aid to the Afghan rebels, organized and armed a group of counterrevolutionary Nicaraguans, sent the Marines into the Lebanese civil war, and in 1983 launched a military invasion of the tiny Caribbean island of Grenada, where pro-Cuban radicals had taken power.

By “getting the government off our backs,” Reagan hoped to unleash a tide of entrepreneurial energy that would restore economic growth and pay for the military buildup by cutting taxes, government regulations, and social spending. In 1981, with the cooperation of many conservative Democrats in Congress, the new administration cut business and income taxes by 25 percent and pared tens of billions of dollars from domestic social programs. Meanwhile, Reagan’s new secretary of the interior, James Watt, worked to open federally controlled land, coastal waters, and wetlands to exploitation by mining, lumber, oil, and gas companies. Both the Environmental Protection Agency and the Occupational Safety and Health Administration became much more solicitous of the business point of view. Commentators called Reagan’s program of tax cuts and regulatory reforms “supply-side economics” or just “Reaganomics.” Its partisans argued that business profits, sales, investment, employment, and even tax revenues would soar in an economic environment that was so much more favorable to entrepreneurship.

Such was the theory. Reaganomics did cut taxes sharply for corporations and the wealthy, reducing the top individual tax rate from 70 percent in the 1970s to 28 percent in 1986 and handing affluent families thousands of dollars a year in extra income. But the taxes that most working-class Americans paid actually rose during the 1980s because state and local taxes increased to make up for reductions in federal aid and because Social Security deductions increased almost every year.

The Reagan tax policy had an additional consequence, which many of its architects were unwilling to advertise. Between 1981 and 1986, federal income tax receipts fell hundreds of billions of dollars short of what they would otherwise have been. This loss of income, combined with huge increases in military spending, generated staggering federal budget deficits of roughly $200 billion a year. Thus, Reaganomics ensured that regardless of which party controlled Congress or the White House—and however great the social need—the federal government would find it virtually impossible to initiate new programs. Moreover, because the government had to borrow so much to cover the tax shortfall, interest rates remained at double-digit levels. High interest rates kept the dollar strong, which drove down the cost of imported Japanese cars and German machine tools but also ensured that domestic manufacturing would continue to struggle and blue-collar unemployment would remain at near-depression levels.

Reaganomics also forced deep cuts in welfare spending. The social programs that were inaugurated or expanded in the late 1960s had helped to reduce poverty in the United States, but in the 1970s, inflation and recession began to undermine this progress. During the 1980s, conservative ideologues such as Charles Murray and George Gilder argued that liberal social policy itself produced poverty and generated a self-perpetuating “underclass” that was dependent on government handouts. Declaring such social programs a failure, the Reagan administration’s policymakers set out to destroy them. Large cuts in food stamp, child nutrition, and job-training programs followed; Aid to Families with Dependent Children, public service employment programs, and low-income housing projects also suffered.

Not all welfare programs were cut so drastically—only those that assisted the poor. The administration spared the social programs and tax policies that benefited people who had a solid attachment to paid work—Social Security, Medicare, and the tax deduction for interest on home mortgages. These middle-income entitlement programs bore no stigma; most Americans considered them a right rather than a handout. Thus, in the 1980s, even conservative Republicans considered Social Security, by far the nation’s most expensive income-transfer program, the “third rail” of American politics: touch it and you die.

The Reagan Boom

The same economic policies that devastated the country’s industrial heartland generated regional booms in the Sun Belt and in high-tech New England. Defense spending, foreign investment, and the maturation of the baby boom generation sent real estate values soaring on the coasts. In financial centers such as New York, Dallas, Los Angeles, and Miami, Reagan administration policies that deregulated the banking industry and the stock market set off a wave of speculation, much of it visible in the suburban and exurban centers whose overnight growth often eclipsed that of the downtown office districts. Tysons Corner, Virginia; the Route 1 corridor to Princeton, New Jersey; Clayton, Missouri; Newport Beach, California; Rockville Pike, Maryland; and the Route 580 corridor just east of the Oakland hills in northern California—these were not the bedroom suburbs of the 1950s, but entirely new towns complete with gleaming office parks, huge shopping malls, and high-priced homes and condominiums. In them lived a workforce that was segregated by race, class, and personal expectations from the people who were still struggling in the nation’s older, urban manufacturing and service sectors. In the 1980s and afterward, these “edge cities” represented a physical manifestation of the great social divisions that Reaganite capitalism generated.

Income distribution in the United States, though far less egalitarian than that in most other industrial democracies, had remained fairly stable for a quarter-century after the end of World War II. But in the early 1980s, the United States, in the words of economists Barry Bluestone and Bennett Harrison, took a “Great U-Turn” that widened the distance between the well paid and the poorly paid to an extent greater “than at any point in the life-times of all but our most senior citizens, the veterans of the Great Depression.” Between 1977 and 1990, the income of the richest fifth of the population rose by one-third and that of the top 1 percent almost doubled, even as the total income of the bottom 60 percent actually fell, with the income of those living below the poverty line dropping most sharply.

There were critics aplenty, but much of American popular culture celebrated the new rich. Greed and extravagance became the stuff of television dramas such as Dallas and Dynasty, whose ruthless but stylish characters owned the hot stocks, luxury homes, and designer clothes that most viewers coveted. Their real-life counterparts paraded across the fashion and society pages of glossy magazines, daily newspapers, and the aptly named television series Lifestyles of the Rich and Famous. Young lawyers, bankers, stockbrokers, and businessmen—the young urban professionals sometimes mocked as “yuppies”—flocked to the thriving financial districts, where they sought to win their share of the vast fortunes created by corporate consolidations and leveraged buyouts. The films Wall Street and Working Girl captured the lure of this new era of deal-making wealth, while Tom Wolfe’s novel The Bonfire of the Vanities revealed its darker, amoral side.

Millions of white-collar professionals, managers, and small businesspeople stood well below these high rollers on the American income pyramid. Constituting perhaps 25 percent of the working population, this slice of the middle class also seemed to bask in the glow of the Reagan revolution; their salary increases kept pace with inflation, their income taxes were lower, and they voted Republican in overwhelming numbers. But this seemingly prosperous middle stratum was not immune to the economic and social difficulties of the 1980s. Since the end of World War II, white-collar workers had enjoyed job stability in return for their loyalty to corporate employers. Corporations in the United States employed three or four times as many managers and supervisors as those in Japan and Europe. These people were a costly burden—too costly in an era of corporate mergers and rising international competition. Therefore, when top executives began reorganizing their enterprises to make them “lean and mean,” many long-serving middle managers found themselves unemployed for the first time in their lives. “I was hurt,” remembered a middle manager who had been nudged into retirement by a large pharmaceutical company. “After thirty-four years with the company, I was surprised that it came down to an economic relationship. . . . I thought I was in—a family kind of thing.”

Downsizing, a polite term for mass dismissals, intensified during the recession that began in 1989, when a new wave of reorganizations and layoffs swept through the banking, stock brokerage, and real estate sectors. These sweeping white-collar layoffs became a routine management practice and continued well into the boom of the 1990s. Of course, total professional and managerial employment increased during the 1980s and 1990s, even in some newly restructured companies, but this churning of the white-collar labor force generated widespread middle-class insecurity, even among people who kept their jobs.

Despite the erosion in the job security of many male breadwinners, family incomes rose modestly in this era because more members of the family went to work. The most important additions to the workforce were women, primarily wives, whose labor-force participation increased from 40 to 60 percent in the quarter-century following 1970. By the 1990s, paid work was virtually universal among middle-class and working-class women under age forty. Indeed, their labor represented the difference between comfort and hardship. Virtually all of the income gain among white two-parent families in the years after 1967 can be accounted for by the wages of wives and daughters.

The American middle class sustained its relative affluence by what can only be called family speedup. To the surprise of an earlier generation of optimistic social forecasters, the growth of office automation and the deployment of a wide array of technological gadgets—from personal computers to faxes and mobile phones—did not reduce the working hours of professionals and office workers. Global commerce lengthened the workday, often right at home. For Motorola executive Sheila Griffin, a cell phone and voice mail started the workday during her early morning commute. “I get to the office and check the faxes. I get Europe out of the way and then work on things in our own time zone.” By 6:30 p.m., thirteen hours after leaving home in the predawn darkness, Griffin was back with her family. “Then at about 9:30 p.m. the phone rings, and it’s Japan.”

In the two decades after 1969, the average employed American worked an extra month a year—about two and a half weeks more for men, seven and a half weeks more for women. Vacation time declined, overtime rose, and moonlighting soared. Women bore the brunt of this family speedup. Columnist Ann Landers pronounced herself “awestruck at the number of women who work at their jobs and go home to another full time job.” One study found that employed mothers averaged more than eighty hours a week of employment, housework, and child care. “These women talked about sleep the way a hungry person talks about food,” reported a California sociologist. Teenage employment also increased, even among middle-class families. Teenagers helped to pay for college, cars, and clothing, and they proved essential labor in the nation’s burgeoning service economy. Wal-Mart, McDonald’s, and Foot Locker could hardly have remained open without a vast adolescent workforce.

The Ranks of the Poor

At the very bottom of the American social hierarchy stood the one in eight Americans whose incomes fell below the U.S. government’s poverty line during the 1980s. The proportion of all Americans who were considered poor reached its postwar low in 1973, but stagflation drove this number upward until it peaked at 15 percent in the early 1980s. Falling unemployment, especially in the 1990s, reduced the number of people living in poverty, but even during the most prosperous times, about one-tenth of all white people lived in poverty, as did one-third of the nation’s African Americans and one-quarter of all Latinx residents.

Low pay, structural changes in the economy, and institutional racism caused most poverty in late-twentieth-century America. During the 1980s and into the early 1990s, American business generated some thirty million new jobs, but most of them were in the service sector, which paid on average about 20 percent less than did jobs in manufacturing or transportation. The most rapidly growing occupations—home health care attendant, sales clerk, food server, janitor, and office clerk—were low-paid and part time, offering few pension or health care benefits and affording little opportunity for promotion. McDonald’s, the largest employer of Black youths in the nation, hired almost all its workers on a part-time, minimum-wage basis, which ensured a turnover rate of more than 100 percent per year. “You make minimum wage,” complained one Baltimore resident, “and there are so many people applying that the jobs are snapped right up.”

During the Reagan era, the government cut back or abandoned welfare programs and wage standards that had been designed to compensate for the economy’s inability to generate enough high-paying jobs. By 1989, state and local welfare payments had dropped by an average of 40 percent from their 1973 level. Job-training programs received drastic cuts, and the minimum wage, which the Reagan administration froze at its 1981 level, lost some 44 percent of its value through inflation. A full-time minimum wage worker could not keep a family of four out of poverty.

A significant number of Americans fell out of the world of work. The decline in the value of unemployment compensation and welfare, coupled with sharp hikes in urban rents, generated a new, or at least a vastly more visible, phenomenon: unhoused Americans, between one million and three million of them during the 1980s. When people without houses first appeared in large numbers in the late 1970s, many Americans labeled them as “bag ladies, winos, and junkies” or assumed that they had just been released from mental hospitals. But within a few years, it became clear that for millions of working Americans, housing insecurity was only a layoff or a family crisis away. One of every five unhoused people held a job but still could not afford housing. On cold winter nights, whole families, not just single men, searched for food and warmth at the crowded, squalid shelters that opened in almost every American city.

A fearful wave of criminality and drug addiction washed over inner-city neighborhoods during the 1980s. The importation of hard drugs, especially inexpensive “crack” cocaine, generated a violent world of well-armed drug lords and street-corner salesmen. Homicide became the leading cause of death among urban Black males aged fifteen to twenty-four, who were murdered at a rate six times that of other Americans. In many large metropolitan areas such as New York City, one of every four African American men in their twenties and one of every eight Latinx men of the same age were in prison, on probation, or on parole.

Prisons, in fact, became one of the great growth industries of the 1980s and 1990s. For most of the twentieth century, the United States maintained an incarceration rate that was comparable to that in other industrialized nations—about one per thousand. But as the idea of rehabilitation faded during the 1970s, government at all levels began to jail criminals simply to keep them off the street. Fueled by a set of stiff drug laws that hit African Americans the hardest, the prison population more than tripled, to two million, between 1980 and 2000, giving the United States both the most prisoners and the highest incarceration rate of any nation in the world. The United States was also one of the few Western nations to retain the death penalty. After the Supreme Court affirmed its legality in 1976, southern and western states carried out executions with increasing frequency. And like the drug laws, states applied the death penalty disproportionately to African Americans, who made up half of all inmates on death row.

Struggling Against the Conservative Tide

The organized working class was on the ideological and economic defensive throughout the 1980s. In almost every strike and negotiation, unions sought to defend the status quo: to save jobs and maintain their existing wage levels and health benefits in the face of the concessions, or givebacks, that employers demanded. A bitter and divisive “culture war” reinforced this corporate offensive against the unions by putting on the defensive the secular values and cultural pluralism that had long undergirded the political and ideological hegemony of the New Deal and its reform successors. Yet the right’s effort to remake American culture met resistance of its own: throughout these years, Americans could, with equal ease, watch a televangelist preach the gospel or a pop music star challenge sexual mores. After 1986, Reagan’s political star began to dim as foreign policy scandals and the waning of the Cold War sapped his influence both at home and abroad.

The Labor Movement Under Fire

Reaganomics proved disastrous for American trade unions. In the 1970s, unions had represented almost one in four working Americans, but during the 1980s this proportion dropped sharply; in the private sector, organized labor eventually represented only 8 percent of all workers—a huge decline from the early postwar years, when trade unions were pervasive in manufacturing, utilities, transport, mining, and the telephone industry. Unions grew weaker and their leaders more fearful; from the 1980s onward, strikes were rare, even when management cut wages, pensions, and health care benefits.

What accounted for this debacle? The plant closings and layoffs that swept through many heavily unionized industries provided one answer. U.S. Steel, which had once employed a unionized workforce of 200,000, transformed itself into USX Corporation, shut down a dozen steel mills, and acquired Marathon Oil, from which it soon derived the bulk of its sales and profits. The United Steelworkers of America, which proved to be incapable of organizing nonunion “mini-mills” in the South, lost almost half its members in little more than a decade. The United Auto Workers lost even more members—half a million after 1978—when Japanese auto sales captured one-quarter of the U.S. market. The Big Three automakers shut down dozens of plants, and auto parts makers fled the unionized North for low-wage Tennessee, Alabama, and Mexico.

Beginning in the 1970s, many employers became much more aggressive in their efforts to avoid or break trade unions, even in the once labor-friendly North. Their tactics skillfully combined both a paternalistic carrot and an antiunion stick. Corporations in such growing fields as finance, information technology, and health care offered workers an attractive menu of new benefits, often including on-site health clubs and child care. But most of these same companies remained bitterly antiunion. Management consultants advised executives of a group of New Jersey hospitals to figure out “who is going to be most vulnerable if the union knocks,” then “weed ’em out. Get rid of anyone who’s not going to be a team player.” By 1984, companies fired prounion workers at a rate four times higher than in 1960.

Hardball antiunion tactics went hand in hand with another old-fashioned management strategy: cutting wages. In unionized industries, this was called “concession bargaining.” Instead of negotiating a wage boost in each new contract, workers now faced corporate demands for a wage cutback—and not only in firms facing stiff foreign competition. As a business spokesperson summed up the situation, “An abundant supply of labor makes it more possible than ever before to operate during a strike. This possibility constrains union demands.” During the first half of the 1980s, American workers lost billions of dollars in wage givebacks and other concessions.

In the midst of this wave of concession bargaining, President Reagan’s destruction of a trade union of government employees, the Professional Air Traffic Controllers Organization (PATCO), immeasurably strengthened capital’s hand against organized labor. These well-educated, well-paid controllers, many of them politically conservative Air Force veterans, complained of the intense mental stress and physical strain inherent in their work. The union wanted Reagan to increase staffing and reform the management of the Federal Aviation Administration, which employed most PATCO members. When the air traffic controllers struck in August 1981, Reagan waited only three days before firing more than 10,000 of these federal employees and filling their jobs with supervisors and hastily trained replacements. Not since Massachusetts governor Calvin Coolidge broke the Boston police strike in 1919 had the government so thoroughly smashed a union of its own employees.

Reagan’s destruction of PATCO transformed every strike into a union-busting opportunity. In both private companies and public agencies, managers had little trouble finding unemployed or underpaid workers who were willing to replace union strikers. Most “scabs,” or strikebreakers, were motivated by sheer economic necessity, given the wage stagnation and high unemployment of the late twentieth century. But worker solidarity could hardly be expected to flourish in an era that celebrated entrepreneurial freedom, wage inequality, and the virtue of corporate downsizing. In almost every long strike during the 1980s, management found it relatively easy to recruit replacement workers who were eager to keep their enterprise afloat. A strike of professional football players in 1987 graphically revealed this shift in sentiment and public perception. Thousands of fans flocked to the stadiums, jeering picket lines manned by players they considered too well paid, to cheer on football teams composed entirely of third-string replacements.

Corporations that had been deregulated during the late 1970s mounted especially vicious attacks on unions. Nonunion upstart companies grabbed market share from industry stalwarts, but in the new, competitive environment, some firms went bankrupt, while others cut wages, broke their unions, and even skimped on safety precautions. In 1983, after selling hundreds of millions of dollars’ worth of high-interest “junk bonds,” Frank Lorenzo bought Continental Airlines, declared it bankrupt, and then slashed the pay of thousands of pilots, machinists, and flight attendants. When the workers struck, Lorenzo broke their unions by hiring replacement workers from the tens of thousands of unemployed pilots and machinists who had been laid off during the recession of the early 1980s. He then moved on to Eastern Airlines, which he added to his empire in 1986. Using similar tactics, Lorenzo again demanded massive wage concessions that were certain to precipitate a strike. But this time, well-paid airline pilots joined machinists and flight attendants in a highly effective work stoppage that won sympathy from passengers, solidarity from the labor movement, and even grudging admiration from Wall Street. Their lengthy strike threw the airline into bankruptcy, forced Lorenzo out of the airline business, and put the brakes on deregulation of the airline industry.

The battle to maintain wages and benefits often took place in a single factory, company, or community where embattled unionists tried to mobilize families, neighbors, and community activists in defense of an entire way of life. In the coal fields of Appalachia, in the Arizona copper mines, on California factory farms, and in midwestern manufacturing towns, unionists demonstrated that they could still organize for long, bitter social struggles that were reminiscent of those that had been waged during the last quarter of the nineteenth century.

In Austin, Minnesota, for example, Local P-9 organized retirees, high school students, family members, and sympathetic unionists in a battle against wage concessions at the venerable Hormel meatpacking company. Throughout the bone-chilling winter of 1985–1986, the Austin working-class community, “P-9 Proud,” mobilized tens of thousands of supporters throughout the upper Midwest. But the strikers could gain little support from the Democrats or even from many liberals. A Democratic governor sent in National Guard troops to prevent mass picketing, while the leadership of the United Food and Commercial Workers union urged the Hormel workers to accept another round of wage concessions. The Austin strike ended late in 1986 with the community divided, P-9 defeated, and Hormel victorious.

Reaganism Reaches an Impasse

As fiscal austerity and the liberal-labor retreat shifted American politics to the right, the long recession of the early 1980s began to lift. Thus, Ronald Reagan and his party entered the 1984 campaign season at the height of their popularity. The Democrats were deeply divided. Jesse Jackson, an African American minister whose Rainbow Coalition aspired to represent a multiracial alliance of the poor and working class, proved a dynamic spokesman for the party’s social democratic wing. In contrast, Senator Gary Hart represented a growing neoliberal current, progressive on social issues but increasingly cool toward the labor movement and the welfare state. Former vice president Walter Mondale bested both of these candidates and injected some excitement into the campaign by selecting New York City congresswoman Geraldine Ferraro as his running mate. Ronald Reagan won the election in a landslide, taking 59 percent of the popular vote and every state except Minnesota, Mondale’s home state. Although African Americans and Jews voted solidly Democratic once again, Republicans kept the allegiance of blue-collar “Reagan Democrats,” swept the white South, and won a majority of votes even among white women and union members.

But Reaganism reached an impasse after the start of the new term, both as an economic doctrine and as a foreign policy prescription. Although unemployment began to fall and the stock market began to boom, supply-side economics had little to do with it. Rather, the surge in military spending, combined with lower interest rates and a large federal budget deficit, pulled the U.S. economy out of the recession. Indeed, Congress repudiated much of Reagan’s supply-side tax program early in his second term; by closing numerous tax loopholes and subjecting capital gains to the same tax rate as ordinary income, it raised the effective tax rate on corporations and the rich.

Reagan’s aggressively anti-Communist foreign policy also ran into trouble. Outraged by the administration’s policy of “constructive engagement” with white South Africa, U.S. college students spearheaded a broad national movement demanding that universities and companies divest themselves of investments in the strictly segregated apartheid regime. Early in 1985, thousands of protesters were arrested at South African embassies and consulates. This activism spurred Congress to pass, over President Reagan’s veto, a law banning new investments in South Africa or loans to the South African regime, an important step in the peaceful revolution that brought multiracial democracy to that nation in the early 1990s.

Opposition to U.S. intervention in the civil wars in Nicaragua and El Salvador and increased military spending also grew in the 1980s. Although Reagan hailed the Nicaraguan counterrevolutionaries (known as Contras) as “freedom fighters,” Congress remained skeptical; in 1984, it passed the Boland Amendment banning U.S. military aid to the Contras. Finally, Reagan’s support for a new round of Cold War military spending encountered stiff resistance, especially after reformer Mikhail Gorbachev assumed power in the Soviet Union.

Crisis engulfed the Reagan presidency in 1986 when the public learned that officials from the Central Intelligence Agency (CIA), the National Security Council (NSC), and the State Department had conspired to organize a covert and illegal government operation to aid the Contras in violation of the Boland Amendment. At the instigation of CIA director William Casey and national security adviser Robert McFarlane, the United States secretly sold millions of dollars’ worth of military equipment to Iran, a regime that the United States publicly denounced for supporting terrorism. The profits from this illegal arms trade, along with other money that was raised secretly from foreign governments, were then used to fund the Contras in their war against Nicaragua’s radical Sandinista government. Several NSC officials faced criminal prosecution, and much evidence suggested that Reagan had condoned the illegal acts. Although Democratic lawmakers shied away from any effort to impeach the still-popular president, the Iran-Contra affair nonetheless deprived Reagan of his ability to set the national political agenda for the remainder of his term. In the 1986 congressional elections, the Democrats recaptured control of the U.S. Senate, and the next year, a liberal coalition generated sufficient pressure to persuade the Senate to reject Robert Bork, Reagan’s highly conservative U.S. Supreme Court nominee. In his place, the Senate easily approved Anthony M. Kennedy, who, with Sandra Day O’Connor, a 1981 Reagan appointee and the first woman to serve on the high court, became the core of a centrist bloc on the Supreme Court.

Culture Wars

Reaganism proved to be far more of a political than an economic success. If at the end of the 1980s, the Reagan administration stood at a policy impasse, its eight years of governance had nevertheless shifted the nation’s politics well to the right. The Republican Party no longer had room for a liberal, pro-welfare-state wing. The Democrats still controlled Congress, though they had neither the votes nor the will to propose legislation of the sort that Lyndon Johnson and Hubert Humphrey had once championed. Reagan Republicans set the nation’s political agenda, even if they could not always carry the day on any particular issue. If the culture of the university and the old-line philanthropies remained largely liberal, conservative intellectuals’ presence grew on television talk shows, on newspaper opinion pages, and in such influential, well-funded think tanks as the Heritage Foundation and the Cato, Hudson, and American Enterprise institutes.

But Reaganite political power was not matched by a conservative capacity to transform the nation’s social mores or restrain the increasingly adventuresome character of U.S. culture, entertainment, and social thought. This caused enormous frustration among many intellectuals and politicians on the right, who saw Reaganism not merely as a political or economic doctrine, but also as a movement to reverse some of the dramatic cultural changes that had transformed American society since the 1960s. William Bennett, Reagan’s secretary of education, denounced “relativism” and “multiculturalism” in university curricula, arguing instead for a return to the study and celebration of what he called the “Judeo-Christian tradition.” Allan Bloom, a neoconservative political theorist, briefly soared to prominence with the publication of The Closing of the American Mind, a 1987 best-seller that assaulted student activism, cultural relativism, academic Marxism, and rock-and-roll.

But American culture remained pluralistic. Evangelical Protestantism continued to grow in numbers, and in 1986, allies of the conservative televangelist Pat Robertson won control of the Southern Baptist Convention, the nation’s largest Protestant denomination. Yet the influence of the Religious Right remained limited. Several states did impose some restrictions on the right of women to secure an abortion, especially for teenagers or for women in the last trimester of pregnancy, but this medical practice remained legal and widely available in the United States. Women’s participation in the workforce continued to increase, and at the end of the 1980s, affirmative action programs still benefited members of racial minorities who sought employment, job promotions, or admission to college. Meanwhile, several well-known fundamentalist ministers, including Jimmy Swaggart and Jim Bakker, became embroiled in embarrassing sex and financial scandals.

Popular music defined a technological and aesthetic frontier. Cassettes replaced vinyl records as the Sony Walkman and the portable boom box made listening both more all-pervasive and more private. Country music still held the greatest radio audience, but rock-and-roll, which dominated record and cassette sales, continued to showcase the nation’s cultural avant garde. Two of the biggest pop stars of the 1980s, Michael Jackson and Madonna, were not only fabulously successful entertainers, but also racial and gender experimentalists. Jackson, whose 1982 album Thriller sold more copies than any other record in history, transformed himself into a racially and sexually ambiguous icon. Madonna, an indefatigable exhibitionist, redefined a modern sexuality that deployed an earthy postfeminist sensibility, exemplified in her “Material Girl” and other best-selling songs.

Music television, a genuinely new art form, burst onto the scene in 1981 with the appearance of the MTV cable television network, about the same time that rap, or hip-hop, emerged out of the African American ghettos. Combining rhythmic verse with a driving beat derived from scratching the surface of a record and sampling the music of other rhythm-and-blues and rock-and-roll performers, rap spoke to the daily experiences of Black inner-city youth facing gang violence, the crack epidemic, police brutality, and economic strain. By the end of the decade, many Latinx residents, who introduced bilingual lyrics, and white suburban teenagers had enthusiastically embraced the music, the accompanying hip-hop style in fashion, and rap’s outlaw image. Groups such as Public Enemy, whose “Fight the Power” was an openly Afrocentric anthem, moved the music onto political terrain.

Conclusion: The Reagan Legacy

In the 1970s and 1980s, the United States entered an era of post–New Deal politics and political economy. The globalization of trade, finance, and manufacturing put enormous pressure on American firms and rendered far less effective the kind of liberal interventionist economic policies that had worked so well when U.S. labor and product markets were largely confined to the North American continent. But politics still trumped market forces. Thus, the price of oil, which seemed to rise inexorably during the 1970s, plunged after 1984, largely because of political disarray among the oil-producing nations of the Middle East, government-mandated conservation measures, and slower economic growth.

Likewise, the stagnation in the American standard of living was not a product of what some politicians of the 1970s liked to call “an era of limits”; it reflected instead an increasingly successful effort to make American workers pay for the return of U.S. business to a more profitable and competitive status. Reagan’s massive 1981 program of tax and budget cuts slashed social spending and offered the wealthy billions in tax relief. And his destruction of the air traffic controllers’ union later that same year inaugurated a generation-long assault on the organized working class that was as debilitating as the defeat of the Homestead strikers in 1892 and the immigrant steelworkers in 1919.

Reaganism began a realignment in American politics and political economy that would take a full generation to complete. The new conservatism won recruits in the white South and among northern blue-collar Democrats. It eroded labor’s strength, legitimized business’s power and prestige, and checked the headway the civil rights, women’s, and gay rights movements had made. But anti-Communism still constituted much of the glue that held together the Reaganite majority, so the end of the Cold War would open the door to a new configuration of politics and ideology on the home front as well as abroad.

Timeline

1972

Congress approves the Equal Rights Amendment, but it fails to win ratification by states after the New Right mobilizes against it.

1973

The Organization of Petroleum Exporting Countries sharply raises oil prices. The resultant energy crisis leads to federally mandated gasoline rationing in 1974 and a nationwide speed limit of 55 miles per hour.

1974

In September, President Ford pardons Richard Nixon, who resigned in August to avoid impeachment.

1975

New York City narrowly averts bankruptcy; President Ford refuses to provide federal assistance.

1976

Georgia Democrat Jimmy Carter defeats Gerald Ford to become president.

1977

A new climate of homophobia grows when singer Anita Bryant spearheads a successful movement to repeal a Florida antidiscrimination ordinance.

1978

California voters pass Proposition 13, capping property taxes.

1979

A near-meltdown occurs at Pennsylvania’s Three Mile Island nuclear power plant.

1980

The United States boycotts the Moscow Olympics in response to the Soviet invasion of Afghanistan, signaling the breakdown of détente.

1981

The Reagan administration and Congress cut taxes and domestic social programs and raise military spending.

1982

The Centers for Disease Control adopts the term AIDS (acquired immune deficiency syndrome) for the newly recognized disease spreading in the United States.

1984

Ronald Reagan captures 59 percent of the popular vote, defeating Democrat Walter Mondale and the first female vice presidential candidate, Geraldine Ferraro.

1985

The local packinghouse union in Austin, Minnesota, wages an unsuccessful strike against Hormel.

1987

Enthusiasm for televangelism wanes after Jimmy Swaggart and Jim Bakker are involved in sex and money scandals.

Additional Readings

For more on the end of the postwar boom, see:

Donald L. Barlett and James B. Steele, America: Who Stole the Dream? (1996); David Bensman and Roberta Lynch, Rusted Dreams: Hard Times in a Steel Community (1988); William Greider, Secrets of the Temple: How the Federal Reserve Runs the Country (1989); Max Holland, When the Machine Stopped: A Cautionary Tale from Industrial America (1990); Robert B. Reich, The Work of Nations: Preparing Ourselves for 21st Century Capitalism (1992); and Daniel Rodgers, Age of Fracture (2011).

For more on the changes to U.S. business in the 1970s and the rise of globalization, see:

William Greider, One World, Ready or Not: The Manic Logic of Global Capitalism (1997); Bennett Harrison and Barry Bluestone, The Great U-Turn: Corporate Restructuring and the Polarizing of America (1998); Bethany Moreton, To Serve God and Wal-Mart: The Making of Christian Free Enterprise (2009); Robert B. Reich and John D. Donahue, New Deals: The Chrysler Revival and the American System (1986); and Susan J. Tolchin and Martin Tolchin, Dismantling America: The Rush to Deregulate (1985).

For more on presidential policies under Gerald Ford and Jimmy Carter, see:

James M. Cannon, Time and Chance: Gerald Ford’s Appointment with History (1998); Peter N. Carroll, It Seemed Like Nothing Happened: America in the 1970s (1990); Burton I. Kaufman, The Presidency of James Earl Carter, Jr. (1993); Kenneth E. Morris, Jimmy Carter: American Moralist (1996); and Robert A. Strong, Working in the World: Jimmy Carter and the Making of American Foreign Policy (2000).

For more on the rise of the new right, see:

William C. Berman, America’s Right Turn: From Nixon to Clinton (The American Movement) (1998); Joseph Crespino, In Search of Another Country: Mississippi and the Conservative Counterrevolution (2009); Thomas Ferguson and Joel Rogers, Right Turn: The Decline of the Democrats and the Future of American Politics (1987); Godfrey Hodgson, The World Turned Right Side Up: A History of the Conservative Ascendancy in America (1996); John B. Judis, William F. Buckley, Jr.: Patron Saint of the Conservatives (1990); Rebecca E. Klatch, Women of the New Right (1988); Matthew Lassiter, Silent Majority: Suburban Politics in the Sunbelt South (2005); William Martin, With God on Our Side: The Rise of the Religious Right in America (1996); Bruce J. Schulman, The Seventies: The Great Shift in American Culture, Society, and Politics (2001); and Judith Stein, Running Steel, Running America: Race, Economic Policy and the Decline of Liberalism (1998).

For more on the revolt against taxes and busing in the 1970s and 1980s, see:

Robert Kuttner, Revolt of the Haves: Taxpayer Revolts and the Politics of Austerity (1980); and J. Anthony Lukas, Common Ground: A Turbulent Decade in the Lives of Three American Families (1986).

For more on gender politics during the 1970s and 1980s, see:

Tracey Deutsch, Building a Housewife's Paradise: Gender, Politics, and American Grocery Stores In the Twentieth Century (2011); Susan M. Hartmann, From Margin to Mainstream: American Women and Politics since 1960 (1989); Susan M. Hartmann, The Other Feminists: Activists in the Liberal Establishment (1998); Alice Kessler-Harris, In Pursuit of Equity: Women, Men, and the Quest for Economic Citizenship in 20th-Century America (2001); Jane J. Mansbridge, Why We Lost the ERA (1986); Rosalind Pollack Petchesky, Abortion and Woman’s Choice: The State, Sexuality, and Reproductive Freedom (1990); Judith Stacey, Brave New Families: Stories of Domestic Upheaval in Late Twentieth Century America (1998); and Winifred D. Wandersee, On the Move: American Women in the 1970s (1988).

For more on the growing economic disparity and issues among the poor, see:

Eric Avila, Popular Culture in the Age of White Flight: Fear and Fantasy in Suburban Los Angeles (2004); Dawn Biehler, Pests in the City: Flies, Bedbugs, Cockroaches, and Rats (2013); Paul Blumberg, Inequality in an Age of Decline (1980); Thomas Byrne Edsall, The New Politics of Inequality (1985); Arlie Hochschild, The Second Shift: Working Parents and the Revolution at Home (1989); Katherine S. Newman, Falling from Grace: Downward Mobility in the Age of Affluence (1999); Juliet B. Schor, The Overworked American: The Unexpected Decline of Leisure (1993); Ruth Sidel, Women and Children Last: The Plight of Poor Women in Affluent America (1992); and William Julius Wilson, The Truly Disadvantaged: The Inner City, the Underclass, and Public Policy (1990).

For more on labor struggles in the 1970s and 1980s, see:

Eileen Boris and Jennifer Klein, Caring for America: Home Health Workers in the Shadow of the Welfare State (2012); Jefferson R. Cowie, Capital Moves: RCA’s Seventy-Year Quest for Cheap Labor (1999); Jefferson R. Cowie, Stayin’ Alive: The 1970s and the Last Days of the Working Class (2012); Thomas Geoghegan, Which Side Are You On?: Trying to Be for Labor When It’s Flat on Its Back (1992); Barbara Kingsolver, Holding the Line: Women in the Great Arizona Mine Strike of 1983 (1997); Kim Moody, An Injury to All: The Decline of American Unionism (1997); and Annelise Orleck, Storming Caesars Palace: How Black Mothers Fought Their Own War on Poverty (2005).

For more on President Reagan and his policies, see:

W. Elliot Brownlee and Hugh Davis Graham, eds., The Reagan Presidency: Pragmatic Conservatism and Its Legacies (2003); Sidney Blumenthal and Thomas Byrne Edsall, eds., The Reagan Legacy (1988); Matthew Dallek, The Right Moment: Ronald Reagan’s First Victory and the Decisive Turning Point in American Politics (2000); E. J. Dionne, Jr., Why Americans Hate Politics (1992); Frances FitzGerald, Way Out There in the Blue: Reagan, Star Wars, and the End of the Cold War (2000); Fred Halliday, From Kabul to Managua: Soviet-American Relations in the 1980s (1989); Joel Krieger, Reagan, Thatcher, and the Politics of Decline (1986); Joseph A. McCartin, Collision Course: Ronald Reagan, the Air Traffic Controllers, and the Strike that Changed America (2011); and David Thelen, Becoming Citizens in the Age of Television: How Americans Challenged the Media and Seized Political Initiative During the Iran-Contra Debate (1996).