Volume 2, Chapter 7

A New Era, 1920–1929

On January 1, 1929, the editors of the Washington Post prepared a special business supplement to welcome the New Year. “Good Times Are Predicted for 1929,” read the banner headline. The Post hardly seemed to be going out on a limb with this forecast. After all, two other headlines trumpeted “Record Year Just Ended” and “Gains in Stock Exchange.” Further fueling the business community’s optimism was the feeling that the federal government took their views to heart. “Hoover’s Policies Looked Forward to as Great Aid to Business,” still another headline promised.

The Post could have made this cheerful prediction almost any time during the 1920s. At least for the wealthiest Americans, these were indeed good times. “The whole upper tenth of a nation,” novelist F. Scott Fitzgerald wrote, lived “with the insouciance of a grand duc and the casualness of chorus girls.” Fitzgerald, the most celebrated writer of the decade, gave the new era a label that stuck: “The Jazz Age,” an invocation of a carefree time of high living, bootleg liquor, and illegal drinking in speakeasies. Fitzgerald himself spent much of the decade in France, drinking and dancing till dawn with his wife Zelda.

More than just the “upper tenth” enjoyed some of the good times. For the first time, substantial numbers of workers lived above bare subsistence levels. Even automobiles and houses became attainable for many. As working hours decreased and incomes rose, leisure time and ways to enjoy it mushroomed, feeding the entertainment industry. Movies and radio promoted consumerism, undermining the traditional ideals and values of the family, community, and ethnic and regional subcultures.

Even so, few Americans lived like grand dukes or like chorus girls. Prosperity was uneven. While the rich got much richer, the average worker saw only a slow rise in income. Workers in some industries suffered falling wages and massive unemployment. On the whole, skilled and white-collar workers did much better than the unskilled did. Farmers suffered for most of the decade. And as usual, African Americans fared worse than white citizens. A resurgent Ku Klux Klan spewed venom at African Americans as well as at Catholics and Jews. Hostility toward immigrants impelled the nation to shut its doors to foreigners.

But perhaps the most basic problem of the 1920s was the one no one seemed to notice: the unequal distribution of the nation’s wealth made its economy fundamentally unsound. Before the year was out, the Post would be running very different headlines announcing the crash of the stock market and the early signs of hard times. When a discouraged Fitzgerald, now battling alcoholism, returned to America in 1931, he found a country racked by depression.

Business Conservatism at Home and Abroad

In the 1920s, conservative Republicans with probusiness policies dominated politics. Calvin Coolidge described business as “one of the greatest contributing forces to the moral and spiritual advancement of the race.” Such views also shaped American foreign policy, which sought an “open door” for American corporations to operate with a free hand abroad. Ironically, at the same time, the United States decisively closed the “open door” that had long marked its immigration policy and imposed sharp limits on newcomers, especially from Southern and Eastern Europe and Asia.

Conservatism and Corruption in Political Life

The 1920 presidential election set the political tone for the decade. Both major parties nominated middle-of-the-road politicians from Ohio: Republican Senator Warren G. Harding and Democratic Governor James Cox. Since the last election, many Americans had tired of Wilson and his moral righteousness, and they wanted relief from the rampant inflation and social turmoil that had come with the end of World War I—the strikes, race riots, and Red scares of 1919. Harding, capitalizing on this weariness, declared that the country needed “not heroism but healing, not nostrums but normalcy, not revolution but restoration.” The election results—a landslide for the Republicans—showed that he had captured the public mood.

Although Harding’s vision of “normalcy” embraced generous acts, such as pardoning the imprisoned socialist Eugene Debs, it translated more fundamentally into extraordinary corporate influence on national policy. During the 1920s, the political power of big business climbed to new heights. Secretary of the Treasury Andrew Mellon, one of the wealthiest men in the nation, and Secretary of Commerce Herbert Hoover dominated Harding’s cabinet.

The discovery of considerable corruption after Harding’s sudden death in 1923 destroyed his modest reputation. Subsequent polls of historians have consistently ranked him one of the worst presidents in U.S. history. “Harding was not a bad man,” Alice Roosevelt Longworth (Theodore Roosevelt’s daughter) later observed. “He was just a slob.” She recalled with distaste Harding’s drinking and carousing with political buddies in the “Ohio Gang.” When she visited the White House study, she found the “air heavy with tobacco smoke. . . every imaginable brand of whisky . . . cards and poker chips ready at hand.” Fortunately, she did not open the closet doors, or she might have discovered the president making love to his mistress, Nan Britton.

Whatever his personal failings, Harding himself was never accused of public wrongdoing, but he generally looked the other way when his friends put their hands in the government till—taking kickbacks on government contracts, for example. The most notorious scandal involved the secret leasing, at discount prices, of government-owned oil reserves in California and Wyoming, one of which was called Teapot Dome. The grateful oil companies rewarded Secretary of the Interior Albert Fall, who arranged the deal, with more than $400,000 in bribes.

These gradually unfolding scandals probably contributed to Harding’s depression and high blood pressure and finally his death, probably from a stroke. His successor, Vice President Calvin Coolidge, was a very different sort of man—upright, cautious, and introverted. Under Coolidge, Alice Longworth observed, the White House was “as different as a New England front parlor is from a backroom in a speakeasy.” But while Coolidge differed in style, he continued the same conservative policies.

The Business of America Is Business

More than most of their predecessors, Harding, Coolidge, and the men who served in their administrations identified the fortunes of America with those of business. “The chief business of the American people is business,” Coolidge declared. Probusiness conservatives such as Secretary of the Treasury Mellon believed that the government could best aid business by cutting taxes and minimizing federal intervention in the economy. In 1926, Mellon succeeded in persuading Congress to halve the income tax rate for the top bracket of taxpayers and drastically lower the taxes on inherited wealth. Mellon also wanted to return quickly to private hands the businesses that the government had operated during the war. Although the American Federation of Labor (AFL) and the railroad brotherhoods, which had prospered with government control of the trains, wanted the nation to buy the railroads, Congress instead restored them to corporate control in 1920.

Some cabinet officials, however, believed that the government should actively promote and coordinate private economic initiatives. To keep farm prices up, Secretary of Agriculture Henry C. Wallace created cooperative marketing arrangements and voluntary crop reduction programs. Secretary of Commerce Herbert Hoover, the engineer and business executive who had served as head of the Food Administration during the war, sponsored studies of industrial waste and inefficiency, pushed for further standardization in industry, and worked with the State Department to increase exports. Hoover also encouraged state and local governments to use public works to stimulate the economy. But Hoover’s modest government activism stayed safely within the bounds of the era’s dominant probusiness ideology.

The business-government bond became an issue in the 1924 presidential election, when discontented farmers and unionists joined together to back a third-party presidential bid by Wisconsin Senator Robert La Follette. The senator’s platform, which harked back to prewar Populism and Progressivism, called for nationalization of railroads and water power, a ban on antilabor injunctions, increased aid for farmers, and a tax system restructured to benefit working people.

The Democrats might have been expected to join La Follette in attacking the business-dominated Republican administration. But deep cultural issues divided the Democrats and deadlocked their 1924 convention. Southern Democrats supported Prohibition and refused to denounce explicitly the racist and anti-Catholic Ku Klux Klan. Meanwhile, the immigrant-led urban political machines opposed Prohibition and wanted to nominate a Catholic, Governor Alfred E. Smith of New York, for president. After a record 103 ballots, the two factions compromised on an obscure and relatively conservative Wall Street lawyer, John W. Davis, who could neither unite his divided party nor offer voters a clear alternative to Coolidge. Although both Davis and Coolidge sharply attacked La Follette as a dangerous radical, he received one-sixth of the popular vote, an extraordinary showing for a third-party candidate. Coolidge, however, won the election easily, and the business-government alliance remained intact.

Capital and Statecraft

The business-government partnership that guided domestic policy also shaped U.S. foreign policy in the 1920s. To open the “door” for American business, Charles Evans Hughes, secretary of state under Harding and Coolidge, used federal power to assist American businesses that wished to expand overseas. Critics charged that Wall Street was dictating American foreign policy, but a State Department official retorted that “in these days of competition . . . capital . . . and statecraft go hand in hand.” Working with Herbert Hoover, who as secretary of commerce exercised great influence over American foreign policy, Hughes sought to create a new world order that would bring stability and allow for the expansion of American capitalism—a “Pax Americana,” as Hughes put it.

American businesses would have to exercise this global power while the United States stood aloof from the organization that Woodrow Wilson had designed precisely to promote world stability: the League of Nations (see Chapter 6). Without American involvement, the League remained permanently weak. Nevertheless, the U.S. government, American businesses, and American citizens became increasingly active in world affairs during the 1920s. At the Washington Conference of 1921, Hughes, a brilliant diplomat, engineered a halt to the naval arms race, which threatened peace and the ability of American businesses to expand abroad.

American bankers were central to another key diplomatic agreement of the 1920s. By 1924, the Germans could no longer pay the $33 billion in reparations that had been imposed on them at the end of World War I. Hughes sponsored a Washington conference in which American bankers worked out a deal, called the Dawes Plan after Chicago banker Charles Dawes, that sharply reduced German reparations payments and provided massive private loans to restart the German economy. But what looked like a brilliant success ultimately proved a dismal failure. Increasingly, the loans went into speculative ventures or simply propped up the crippled German economy rather than rebuilding it.

Bankers and businessmen were not the only Americans to look overseas in the 1920s. Private voluntary organizations such as the YMCA and Rotary International founded branches in other countries. Like the Coolidge administration, these organizations favored the spread of American business and opposed radical alternatives such as the Socialist and Communist movements that had large numbers of adherents in Europe. Another group of Americans formed a loosely defined peace movement. They won their greatest victory with the Kellogg-Briand Pact of 1928; signed by most of the world’s nations, the treaty outlawed war but failed to set up an effective enforcement mechanism.

Business, however, provided the real engine of American internationalism. American companies scrambled to build factories in other countries to take advantage of cheap labor, low tariffs, and easy access to raw materials and markets. By 1929, eight American automakers were running assembly plants abroad. Other companies invested in foreign enterprises; General Electric, for example, held shares in every major electric company in the world. Soon, mass retailers such as Montgomery Ward, Woolworth’s, and A&P began to expand into foreign countries. Finally, U.S. companies took an increasingly active hand in exploiting the raw materials of Latin American countries: Venezuelan oil, Chilean copper, Cuban sugar, Argentine beef, and Central American fruit. Direct U.S. investment abroad doubled in the 1920s; by 1930, the United States also led the world in exports.

In selling and investing, American companies received generous assistance from the federal government. Worried about a potential oil shortage, U.S. diplomats helped to identify foreign oil sources that companies might use to satisfy the growing needs of American motorists. Cars also needed rubber tires, so Secretary of Commerce Hoover encouraged Firestone and other tire companies to expand their rubber plantations in West Africa and Southeast Asia to break British dominance of the world rubber market.

The growth of U.S. investment abroad was also facilitated by the government’s willingness to “send in the marines.” Wilsonian principles of self-determination, Secretary of State Hughes argued, did not apply to poor countries. Over and over, the U.S. military intervened when rebels demanding self-rule threatened American industry. The United States occupied Nicaragua almost continuously from 1912 to 1933 and Haiti from 1915 to 1934. In both countries, the U.S. Marines met resistance in the form of guerrilla warfare, led by Augusto Sandino in Nicaragua and Charlemagne Péralte in Haiti. U.S. Marine Corps General Smedley Butler later gave this blunt summary of his role in U.S. military interventions in Central America: “I helped in the raping of half a dozen Central American republics for the benefit of Wall Street. I spent most of my time being a high-class muscle man for Big Business, for Wall Street and the bankers. In short, I was a racketeer for capitalism.”

Immigration Restrictions

While the United States stood firmly for the free flow of goods and capital around the globe, it backed away from its earlier commitments to the free flow of people. During World War I, anti-immigrant sentiment had led to a literacy test for immigrants, and wartime conditions had sharply curtailed the influx of foreigners. With the end of the war, immigration returned to previous levels—800,000 people arrived in 1921—and anti-immigration sentiment rose with it. Congress passed the Quota Act of 1921 and then, in 1924, the even more restrictive Immigration (Johnson-Reed) Act, which shaped U.S. immigration policy for the next four decades. Initially, the 1924 law limited immigration from all countries to a total of 165,000 a year, less than 20 percent of the pre–World War I average. A 1927 law set even lower limits. Both the 1924 and 1927 laws set quotas by countries, based on the percentage of a particular nationality in the United States as recorded in the 1890 census. This maneuver was a blatant effort to limit immigration from southern and eastern Europe, which occurred primarily after 1890. In the first decade of the twentieth century, an average of 200,000 Italians had entered the United States each year; in 1924, the annual quota for Italians was set at less than 4,000.

Asians fared even worse under the new law, which barred all immigrants who were ineligible for citizenship. That included Asian Indians, Japanese, and Chinese, all of whom were judged ineligible on the basis of past judicial rulings. In 1922, the Supreme Court had ruled Japanese immigrants ineligible for citizenship; the following year another court ruling barred Asian Indians, who had begun immigrating to work in the northwestern lumber mills and on California farms in the early twentieth century. These new immigration restrictions reflected a resurgent nativism and racism. During the Senate debate on the 1924 Johnson-Reed Act, Ellison DuRant Smith of South Carolina urged lawmakers to “shut the door” to preserve “the pure, unadulterated Anglo-Saxon stock” that had made America “the foremost nation in her progress.”

The new restrictions partially excepted immigration from within the Western Hemisphere. In part because the United States wanted to maintain good relations with its neighbors and in part because farmers (and especially agribusiness) in the Southwest insisted on a ready supply of cheap labor, Mexicans and Canadians could immigrate more freely than Europeans and Asians. Nevertheless, Mexicans faced increased barriers to entry, including a head tax and a visa fee, which were enforced by the new Border Patrol. Still, almost half a million Mexicans entered the United States in the 1920s—double the number in the previous decade.

Mexican immigrants favored the Southwest, but significant numbers headed for jobs in industrial cities of the East and Midwest, such as Bethlehem, Pennsylvania; Detroit and Flint, Michigan; and Chicago, Illinois. The heavily male Mexican population of Chicago forged colonias (Mexican residential enclaves) in industrial areas of the city that the newcomers shared with Eastern and Southern European immigrants. They also began to develop community institutions such as the Chicago Mexican church, Our Lady of Guadalupe, which was formed in 1924.

A CLOSER LOOK: Chinese Restaurants During the Era of Exclusion

The New Economy

The U.S. economy experienced a fundamental shift during the 1920s as consumer-oriented industries such as automobile and appliance manufacturing replaced coal and railroads as the leading sectors of the economy. Overall, the 1920s were an era of rising standards of living and economic growth, but the proceeds of that growth were distributed very unevenly. While some industries (automobiles) boomed, others (coal and textiles) declined. While corporate profits skyrocketed, working-class wages grew slowly. While the rich and the middle class enjoyed a dazzling array of new consumer goods, farmers suffered from overexpansion and debt, and unions lost members and influence.

Shifts in Manufacturing

The old economy, based on typical nineteenth-century industries—steel, coal, textiles, railroads, lumber, meatpacking, and shipping—was being replaced by one that was more directly responsive to consumer demand for automobiles, entertainment, processed food, ready-made clothing, petroleum chemicals, home appliances, and other mass-merchandised goods. Demographic changes partly drove the shift. Americans were having fewer children, and they were moving to cities. In 1920, the urban population had surpassed the rural population for the first time, and it would continue to grow throughout the decade. Households grew more numerous but smaller. In 1890, about one-third of American households had fewer than four people; by 1930, over half did. The increase in the number of households fed the demand for household goods. So did the growth of cities, which meant less time and space for home production.

Consumer goods accounted for more economic activity than ever before and for most of the increase in manufacturing. Yet while manufacturing output nearly doubled between 1921 and 1929, the manufacturing workforce barely grew; almost the same number of workers now produced twice as many goods. During and after the war, manufacturers invested heavily in new machines, which reduced the number of workers needed, especially skilled workers. For example, a machine that painted decorative strips on cars eliminated ten jobs. Many factories also converted from steam to electric power, a more efficient source of energy. Productivity gains (and cuts in employment) also came from speed-ups, the practice of forcing employees to work harder and faster. This was especially prevalent in the southern textile industry. One southern mill hand recalled her nightmares: “I just sweated it out in my dreams just like I did when I was there on the job; wanting to quit but I couldn’t afford to.”

Service industries, finance, construction, and public utilities all grew more than manufacturing during the 1920s. Construction offered many new jobs (both home and road building boomed), as did commerce, services, and finance. Clerical employment alone increased by nearly a million jobs during the 1920s, while the proportion of workers engaged in manufacturing, mining, agriculture, and railroading declined.

Economic Growth and Social Inequality

Throughout the 1920s, industrialists and bankers hailed what they called “the new capitalism” that had conquered the business cycle and brought prosperity to all. Banker Charles E. Mitchell claimed that this new era was bringing “all classes of the population to a more equal participation in the fruits of industry.” But rather than shrinking the gap between the rich and the poor, economic growth increased it. Corporate profits nearly doubled in the 1920s, but factory workers’ wages rose only modestly—less than 15 percent, by one estimate. Workers in some industries did considerably better than those in others. Between 1923 and 1929, workers in the automobile, electrical, and printing industries received wage increases of 10 to 28 percent, but those in the shoe, coal, and textile industries watched their wages drop. In general, skilled workers (in construction, for example) and white-collar workers (such as postal clerks) did much better than the unskilled (domestic servants and others).

The ups and downs of the business cycle, which continued despite assertions to the contrary, exacerbated joblessness and insecurity. The initial postwar boom gave way to hard times in late 1920. Prices fell sharply, and 100,000 firms went bankrupt. The economy recovered in 1923, but two more recessions soon followed.

In spite of these fluctuations, the nation as a whole experienced rising standards of living in the 1920s. In the aggregate, the real earnings of all employed wage earners (not just working-class or blue-collar workers) rose about 25 percent. Americans were also getting more schooling; 27 percent of all seventeen-year-olds graduated from high school in 1929, up from 16 percent in 1920. And they were living longer: life expectancy went from age forty-seven in 1900 to fifty-nine in 1930.

Despite the overall prosperity, some people saw signs of deeper economic trouble. Part of the problem lay in the painful and disruptive shift from the old industries that were ailing to the new industries that were not yet mature. But those troubles were compounded by financiers who speculated in new enterprises and engaged in a variety of unsound financial practices, including buying stock on borrowed money and creating dummy corporations that existed only to hold the stocks and bonds of other companies. Stock prices rose steadily throughout the decade, accelerating to speculative heights in 1928 and 1929.

A third sign of trouble came from what analysts of the Great Depression would eventually label “overproduction.” By 1928, commercial real estate and consumer-oriented industries such as housing, automobiles, and electrical products were slowing down. Sales lagged, inventories rose, factories cut their output, and unemployment rose. Even before the stock market crash of 1929, many executives thought that their markets were saturated—that is, everybody who was going to buy their products was already doing so. For example, in 1923, new cars outsold used cars three to one; by 1927, that ratio had reversed.

But what the businessman saw as overproduction was underconsumption to the many workers who could not afford to buy the new consumer items the nation’s factories were churning out. Workers and farmers were buying all those used cars in the late 1920s because they could not afford new ones. In one Indiana city, almost three-quarters of all families earned less than the Bureau of Labor’s minimum cost of living standard. By one estimate, two-fifths of all Americans lived in poverty in the 1920s.

Meanwhile, the rich were getting richer. In 1929, the 36,000 wealthiest families received as much income as did the twelve million poorest. Wealth—in stocks, bonds, and real estate—was concentrated among these fortunate few to an unprecedented degree. This growing social inequality was paralleled by the concentration of productive capacity among a handful of enormous corporations, such as General Motors, General Electric, and AT&T. By 1929, the two hundred largest U.S. corporations controlled half of all corporate assets.

Agriculture in Crisis

While the consumer economy boomed, overproduction, falling prices, and declining income plagued agriculture. Farmers enjoyed modest prosperity until 1920; agricultural prices had risen faster than industrial prices before World War I, and the extraordinarily high wartime demand for farm products further improved farmers’ purchasing power. But the economic downturn of 1920 hit agriculture particularly hard. In 1910, a suit of clothes cost the equivalent of twenty-one bushels of wheat; a decade later, that suit was worth thirty-one bushels. A bushel of corn that had sold for $1.22 in 1919 brought just forty-one cents a year later. “’Leven-cent cotton, forty-cent meat, How in the world can a poor man eat?” asked a popular song of the late 1920s.

Overexpansion and debt were at the heart of the farm crisis. During wartime, large European orders for American farm products had prompted farmers to expand, often on borrowed money. When conditions in Europe returned to normal, demand dropped, but output was still increasing, thanks to farmers’ investments in new fertilizers, seed strains, and machinery. At the same time, farms in Canada, Australia, Argentina, and Brazil joined in flooding the world market with excess produce. There was no market for all that farmers could grow. Nearly half a million Americans lost their farms, unable to meet their mortgage payments and equipment loans. Many independent farmers, rather than mortgaging their farms or becoming tenants, simply gave up. For the first time in the nation’s history, the total number of farms dropped. Thirteen million acres of cultivated land were abandoned between 1920 and 1930.

In these years, a new pattern was emerging in American agriculture. Large, well-financed farms now produced more and more of the country’s agricultural output. By 1930, half the nation’s farms yielded nearly 90 percent of the cash crop. Large midwestern grain growers prospered by using mechanized equipment. Immense vegetable farms and fruit orchards thrived in California and Florida by shipping fresh produce to colder regions during the winter. Huge farms exploited both impoverished workers and new technologies, and settled farmers without much land or capital could not compete with them.

California’s fruit and vegetable industry, for example, depended on migrant labor and huge government-financed irrigation projects. Planting and harvesting an acre of lettuce took ten times as much labor as an acre of wheat. As many as 200,000 farm laborers worked in California during the 1920s, three-quarters of them Mexican. Because demand for agricultural labor peaked at harvest time, these workers moved from region to region as the crops ripened. Children and parents worked side by side in the fields, with little opportunity for social life or schooling. Living conditions were atrocious.

Thus, whether within the emerging “agribusiness” of California or on the family farms of the Midwest, economic hard times were a fact of life for rural laborers well before the rest of the country experienced them in 1929. Southern cotton growers particularly faced hard times in years when bumper crops drove prices down. Even worse off were the millions of sharecroppers and tenants. As Alabama sharecropper Ned Cobb put it, “Every time cotton dropped, it hurt the farmer. Had to pay as much rent, had to pay as much for guano [fertilizer], but didn’t get as much for his crop.” Millions of farmers never recovered from the drastic fall in tobacco, cotton, and wheat prices that followed the giddy expansion of World War I.

The crisis in agriculture revived rural protest and political activism. The Non-Partisan League, founded in North Dakota in 1915 and concentrated in the upper Midwest, campaigned for state ownership of grain elevators, packinghouses, and flour mills; state hail insurance; easy rural credit; and tax exemptions for farm improvements. In North Dakota, the League had elected a state governor and other officials and had instituted much of its program by 1920. But in 1921, League opponents forced a recall election that ousted the governor and attorney general and weakened the League’s influence. The American Farm Bureau Federation also promoted farmers’ interests but through more conservative channels, stressing cooperative marketing to increase farmers’ bargaining power. County agricultural extension agents (appointed in every rural county under the federal Agricultural Educational Extension Act of 1914) encouraged farmers to join the federation rather than more radical farm organizations.

Many farmers pinned their hopes on the McNary-Haugen bill, which was designed to restore the prewar relationship between farm prices and industrial prices—a balance between farm product prices and expenses that came to be known as parity. The bill would have set up a government corporation to purchase the agricultural surplus and sell it abroad; tariffs were to protect domestic prices. Congress passed the bill in 1927 and 1928, but President Coolidge vetoed it. The plan was adopted during the New Deal, however, and became a cornerstone of U.S. farm policy.

Organized Labor in Decline

Like farmers, unions suffered in the consumer economy of the 1920s. The once militant labor movement that had led the 1919 strikes virtually disappeared. Conservative labor leaders provided a small elite of craft workers with union representation, but most other workers had to fend for themselves.

Employers used the economic downturn of 1920 and 1921 as an opportunity to reverse labor’s wartime gains; they slashed wages and increased hours, often in violation of union contracts. Building on wartime patriotism and the Red scare, they portrayed union shop contracts, which required all employees to join unions, as an infringement on American liberties. They launched a drive for the “open shop,” a workplace in which union membership either was not required or was forbidden. By late 1923, union membership had fallen to 3.6 million, from a high of over five million in 1920. For the rest of the decade, membership continued to decline despite organizing efforts by radical unions such as the National Textile Workers Union campaign in Southern textile mills. Weakened by membership losses, disheartened by defeat, and confronted by hostile employers and unsympathetic government officials, unions made virtually no progress in organizing the rapidly growing automobile, electrical equipment, and petrochemical industries.

A slightly more encouraging picture emerged initially among the 15,000 sleeping-car porters who worked for the Pullman Company, the manufacturer and operator of railroad sleeping cars and the country’s largest Black employer. In 1925, the most important Black union in U.S. history, the Brotherhood of Sleeping Car Porters (BSCP), was founded under the leadership of socialist A. Philip Randolph. Thousands of porters soon joined. But the Pullman Company refused to recognize the union, fired several of its leaders, and began hiring Filipino porters to warn African Americans that they could be replaced. Then the AFL, reflecting in part the labor movement’s racism in the 1920s, refused to charter the union. By the end of the decade, the BSCP was down to a few hundred members, mirroring the general decline of organized labor in the decade’s conservative climate.

In industries that were dominated by a few large companies, employers promoted stable labor relations with relatively high wages, benefits, or employee welfare programs. Ford’s Five-Dollar Day was only the most generous of a wide range of paternalistic plans that large employers adopted. Some offered stock-purchase plans, pensions, subsidized housing and mortgages, insurance, and sports programs. In southern textile towns, companies built churches and paid ministers’ salaries. Some of these programs were aimed specifically at reducing labor turnover, particularly among skilled workers. “Company unions”—created to keep out worker-controlled unions, build employee loyalty, and settle grievances—ran many welfare programs.

Industries with many small, competing producers generally paid lower wages and offered worse conditions than industries that were controlled by a few large firms. In the bituminous, or soft-coal, industry, high wartime prices had prompted the opening of many new mines, particularly in Kentucky and West Virginia. But when demand fell after the war amid competition from oil, natural gas, and hydroelectric power, there were too many mines and too many miners. Discouraged, many young men left the mining towns for growing industrial cities such as Akron, Ohio, and Flint, Michigan.

In 1922, 600,000 members of the United Mine Workers of America (UMWA), the country’s largest and most powerful union, struck for over four months in response to pay cuts. In unionized mines in the North, most operators gave in, but the union failed to win new contracts in nonunion mines. By 1926, two-thirds of the nation’s soft coal came from nonunion pits. Hard-pressed owners, determined to meet the new low-priced competition, broke the unions and slashed workers’ pay. Safety standards deteriorated, working hours increased, and wages fell. By 1928, union membership among soft-coal miners had fallen to 80,000 from about half a million in 1920.

Faced with stiff business opposition, a conservative political climate, and declining membership, AFL leaders grew increasingly cautious, especially after 1924, the year of Samuel Gompers’s death and the La Follette campaign. Unions confronted new legal barriers after a series of U.S. Supreme Court decisions that upheld “yellow-dog” contracts, in which workers promised when hired that they would never join a union. No charismatic new leader rose to the occasion. Gompers’s replacement, UMWA secretary-treasurer William Green, was, a contemporary remarked, “as plain, as plodding, and as undramatic as his name.” (The head of the miners’ union, John L. Lewis, unkindly reflected: “Explore the mind of Bill Green. . . . I have done a lot of exploring of Bill’s mind and I give you my word there is nothing there.”) Green reinforced the conservative and conventional direction in which the labor movement was already heading when he took charge. Radicals viewed AFL leaders such as Green as overpaid functionaries who were more interested in feathering their own nests than in organizing the unorganized.

The Expansion of American Consumer Culture

The underlying problems of the economy were not immediately apparent to most contemporary observers. They preferred to focus on the more obvious transformations that the new consumer economy and mass culture were bringing to everyday life: the electric appliances appearing in the nation’s homes, the automobiles crowding the streets, the advertisements filling the glossy mass circulation magazines, the radio programs permeating the airwaves, and the ornate movie palaces sprouting up downtown.

The “New Era” brought both continuity and change for women and African Americans. Women, for example, continued to work for considerably lower wages than men even as they ventured into polling places and jury boxes, challenged Victorian restrictions on their behavior, and partook of the new consumer culture. Most Black Americans remained mired in poverty in the Jim Crow South, but even some of them were able to acquire the most important symbol of the consumer economy: an automobile. In addition, tens of thousands of African Americans headed north to seek a better life. Although they faced discrimination in the North as well, it did provide greater freedoms, which, in turn, fostered both political militancy and a rich urban culture.

Transformations in Daily Life

Across the country, builders scrambled to ease a serious housing shortage. Six million new homes went up between 1922 and 1929, twice as many as in any previous seven-year period. Well over half were single-family dwellings. Although fewer than half of all American families owned their homes and brand-new houses were beyond the means of most working-class families, the growth of mortgage financing allowed more of them than ever before to become homeowners.

Residential patterns remained economically and racially segregated. As cities grew, better-off workers moved outward, while poorer families took over the neighborhoods that were being abandoned. Suburban areas grew twice as rapidly as the center cities did. To keep out African Americans, Asians, Mexicans, and Jews, some suburban towns adopted zoning regulations or “restrictive covenants,” special clauses that were written into deeds to regulate the sale of land and houses.

But wherever Americans lived, dramatic changes were taking place inside their homes as well. Oil furnaces, radios, toasters, irons, vacuum cleaners, and wringer washing machines appeared even in working-class homes during the 1920s. Prepared and packaged foods could be found on many working people’s tables, and cleaning products were now purchased rather than being made at home, as in the past.

Throughout the decade, as appliances multiplied, the use of electricity increased. More than two-thirds of U.S. households were wired by 1930—double the proportion of a decade before. Some electric companies offered wiring services on the installment plan. As new power plants came on line, the price of power dropped by one-third between 1912 and 1930. At first, most homes used only a few electric lights, but as electricity got cheaper, people installed more lights, bought more appliances, and used more current. By 1926, more than half the houses in Zanesville, Ohio, had electric irons and vacuum cleaners, and one-fifth had toasters. Still, the average residential customer in 1930 used only 547 kilowatt hours annually. Customers in the early 2000s would use almost twenty times that much.

Farm homes lagged far behind city ones in household technologies and utilities. Nearly three of every four urban families had bathrooms, but only one in three rural families did. Similarly, only about 10 percent of farms had electricity in 1930, compared with 85 percent of urban and small-town dwellings. Many prosperous farmers bought appliances that ran on gas, but almost three-quarters of the farm homes in the Midwest had no modern household equipment at all, although many had automobiles and tractors.

Still, tens of millions of Americans lived outside the boundaries of the consumer economy, unable to afford new clothes, automobiles, or the many other products that were part of Americans’ rising expectations during the 1920s. A third of all families lived in houses that one expert categorized as below “any decent standard.” Times of unemployment, seasonal layoffs, or sickness threw blue-collar workers into hard times. “When my husband’s working steady,” a roofer’s wife explained, “I can just manage, but when he’s out [of work], things go back.” First she stopped sending her wash to the commercial laundry and did it by hand in the kitchen sink. Then she cut back on food. Then, she explained, “the rent goes behind.”

Autos for the Masses

The single most important product in the new culture of consumption was the automobile, and the number of cars that were manufactured more than tripled during the 1920s. By 1929, almost half the families in the United States owned a car—a level that was not reached in England until 1970. “Why on earth do you need to study what’s changing this country?” a Muncie, Indiana, resident asked sociologists Robert and Helen Lynd. “I can tell you what’s happening in just four letters: A-U-T-O!”

The automobile transformed not only the way Americans spent their leisure time, but also the economic and physical landscape. The automobile industry in 1929 accounted for nearly 13 percent of the value of all manufactured goods; it employed 375,000 workers directly and millions more indirectly. Fifteen percent of the nation’s steel went into auto bodies and parts, and 80 percent of its rubber went into tires. Furthermore, the auto industry generated and supported secondary industries and businesses, supplying Americans with tires, new highways, and parking lots. New demands for insurance and consumer credit stimulated financial markets. Whole cities grew up around automobile production. Detroit grew from a population of 285,000 in 1900 to 1.5 million in 1930.

At the beginning of the decade, Henry Ford and the Ford Motor Company dominated the automobile market. Ford was selling transportation, not style, and he emphasized production: the most efficient factories, the lowest price for cars, the largest market. To those ends, during World War I, Ford had developed the huge River Rouge complex in Dearborn, Michigan. This vertically integrated manufacturing operation had its own port, steel mill, power plant, and railroad, and it operated the world’s largest foundry. At its height, the River Rouge complex was the largest factory in the United States, employing more than 75,000 workers. “Fordist” methods—reducing labor costs and increasing output through the use of machinery, keeping prices low and wages high—paid off (see Chapter 4). In 1921, the Ford Motor Company controlled more than half of the U.S. automobile market. The price of a Model T—$950 in 1909—dropped to $290 by 1924.

Ford’s main rival, General Motors (GM), took a different approach. GM president Alfred P. Sloan, Jr., believed that Americans with rising incomes would choose cars not only for their price, but also for their comfort and style. Accordingly, in the late 1920s, GM began to offer a wide range of styles and prices, changing them every year. Sloan called this approach “the ‘laws’ of Paris dressmakers”: “keep the consumer dissatisfied” and eager to buy a new and fancier model. Thus, the company maintained five separate car divisions, ranging from Cadillac, which made the fewest, most expensive automobiles, to Chevrolet, which mass-produced low-priced vehicles.

Increasingly, working people found it easier to afford cars, although perhaps not the models they longed for. For many decades, the installment plan had enabled seamstresses and farmers to buy sewing machines and harvesters. Now three major companies, including GM, financed the postwar auto boom by offering loans to buyers.

Farmers and people in small towns were the first to purchase cars in massive numbers. About three of every ten farm families in the Midwest bought their first cars during the 1920s; by 1930, almost nine of every ten farm families in Iowa, Kansas, Minnesota, Nebraska, and the Dakotas owned cars. For these people, living far from stores and neighbors, automobiles filled a real need and transformed rural social life. One farm woman thought it obvious why her family had bought a Model T before equipping their home with indoor plumbing. “Why, you can’t go to town in a bathtub,” she told an inquiring government official.

Even some poor farmers scrimped and saved to buy secondhand vehicles. About half of white Georgia sharecroppers owned cars by the mid-1930s. Autos weakened the tyrannical grip of southern plantation owners and local merchants because farmers could patronize more distant stores, banks, and cotton gins. Black sharecropper Ned Cobb viewed cars as a work incentive to his sons: “my boys, anyway, they done got big enough to go and correspond girls,” he later recalled, and he decided to “buy me a new Ford to please them.”

The automobile changed life in the city, too. Cars enabled people to live farther from work. No longer dependent on public transportation, Americans moved away from older working-class neighborhoods and into the suburbs. The new mobility was accompanied by a sense of freedom and control. For young people, the automobile offered the means to socialize away from their parents. For workers in increasingly routinized jobs, it fueled a growing tendency to make recreation a part of everyday life, not just an occasional event. Like other consumer industries, the automobile makers began manufacturing not only products, but also desires. Ultimately, GM’s emphasis on marketing proved more effective than Ford’s stress on manufacturing. Ford was forced to abandon the Model T, to introduce and advertise new models and colors, and to move toward more flexible manufacturing methods. Henry Ford’s vision of extreme standardization and continuous price reduction was on its way out. Replacing it was a more sophisticated form of mass production, based on creating and then fulfilling ever-changing consumer demands.

The Creation of Customers

If manufacturers such as Ford were the captains of industry, advertising men could be seen as the captains of consumerism, charged with manufacturing dreams and creating new wants. Some advertisements associated products with a desirable lifestyle: “Men at the top are apt to be pipe-smokers,” read an ad for Edgeworth Pipe Tobacco. Others tried to undermine people’s reliance on traditional sources of advice and authority—parents, friends, and neighbors—so that they would trust manufacturers’ claims instead. Some pointed out the dire consequences of not purchasing a product: “A single contact with inferior toilet paper,” warned a Scott Paper ad, “may start the way for serious infection—and a long, painful illness.”

Despite advertisers’ suggestion that everybody had to have the latest of everything, most families set their own priorities and purchased the things they wanted most. “It is not uncommon,” the Lynds wrote about Muncie, “to observe 1890 and 1924 habits jostling along side by side.” A family might have “primitive back-yard water or sewage habits” but own an automobile, an electric iron, or a vacuum cleaner.

Consumption varied from town to town and from region to region but was particularly conspicuous in places such as Flint, Michigan, which was experiencing an economic boom. Flint’s auto industry drew large numbers of young workers to the city. Unrestricted by community traditions and family obligations, they eagerly adopted the ethic of consumption, buying cars and clothing on the installment plan. Young single women, too, went into debt to buy stylish clothes. Dressed for a night out, workers flocked to Flint’s movie theaters, dance halls, and bowling alleys.

Some businessmen argued that the Flint situation was ideal, that the high wages that were common in the auto industry would not only maintain labor peace, but also enlarge the pool of purchasers for consumer products. Boston reformer and department store owner Edward A. Filene believed that the very idea of mass production was “based upon a clear understanding that increased production demands increased buying, and that the greatest total profits can be obtained only if the masses can and do enjoy a higher and ever higher standard of living. . . . Mass production is . . . production for the masses.”

Mass Culture: Radio, Music, and the Movies

The new consumer culture was accompanied by the rise of a truly national popular culture. Popular entertainments such as radio, recorded music, and motion pictures pulled previously isolated social groups into the mainstream. At the same time, however, they divided families by appealing differently to members of different generations. As they reached their wide audiences, these entertainment forms created new desires and aspirations, reinforcing the development of a consumer culture.

By 1926, more than four million radios had made their way into American homes. Families and neighbors gathered in homes and shops to listen to drama, comedy, and crop and weather reports. For the first time, millions of Americans could hear the president’s voice, the roaring crowds at the World Series, and the very best professional musicians.

Businesspeople quickly realized that radio offered a wonderful new medium for peddling their wares. Within a few years, companies were sponsoring programs that incorporated commercials featuring “branded performers” such as the Ipana (toothpaste) Troubadours, the A&P Gypsies, and “Paul Oliver” and “Olive Palmer,” who performed for the Palmolive Company. By 1928, national networks had been established on an explicitly commercial basis to sell expensive radio time. Filling that time were new forms of sponsored programming, such as Pepsodent toothpaste’s Amos ’n’ Andy, an enormously popular comedy show about African Americans (played by white actors), which premiered in 1928.

Even rural Americans listened to radios powered by batteries or windmills. The new medium gave them vital information about commodity prices and weather. Soon, the Department of Agriculture was providing radio stations with scripts for lessons on dairy production, livestock feeding, and cooking. The National Farm and Home Hour, which debuted on NBC stations in 1928, provided forty-five minutes of music, weather and crop forecasts, and information on soil improvement and home economics. Just as important, the radio kept farm women company as they churned butter and made beds. As one Missouri farmer wrote to a radio station in 1923: “We hillbillies out in the sticks look upon radio as a blessing direct from God.”

Local and ethnic radio broadcasts flourished alongside the emerging national shows. In every city, scores of low-powered stations carried foreign-language programs. Stars of foreign-language radio shows became important figures in the ethnic enclaves.

Ethnic audiences could also buy phonograph records made in foreign languages. By the mid-1920s, phonographs were affordable luxuries for working people. Like radios, they were sold at widely ranging prices, depending on the quality of their cabinets as well as their working parts. “Race records,” marketed to Black audiences, brought Black music to far-flung corners of the United States. In just six months in 1923, blues singer Bessie Smith’s first recording, “Downhearted Blues,” sold 750,000 copies.

Theater owners also adapted the movies to the needs of ethnic communities by combining films with live entertainment, often directed at a local audience. In the packinghouse district of Chicago, a Polish play accompanied the film; in Little Sicily, Italian music could be heard at the movie house. In “The Stroll,” Chicago’s African American entertainment district, blues artists played on the same bill as the movie.

Before World War I, moviegoers had been mainly urban, working-class immigrants, but during and after the war, movie theaters sprang up even in remote towns. By the mid-1920s, there were more than twenty thousand movie theaters in the United States. In Carrboro, North Carolina, a small mill town without electricity, the only entertainments had once been baseball, hunting and fishing, music, and conversation. Now, with a movie house equipped with a gasoline-powered generator, mill families could see the latest newsreels and movies. Carrboro was becoming less isolated.

At the same time that movies were arriving in small towns, film distributors began building large, ornate movie palaces in the cities. One Baltimore theater featured a 110-person orchestra, a mammoth organ, and fourteen pianos. Workers took the streetcar, and middle-class people drove downtown, to see new kinds of films at these theaters—at first, silent features running an hour or more and eventually, in the late 1920s, “talking pictures.” By 1930, a total of 100 million movie tickets were being sold every week.

The urban movie palaces had their counterpart in what one film historian calls “new palaces of production”—vast studio lots with shooting stages, film labs, and costume shops—centered in Southern California, especially “Hollywood,” a word that came to describe the movie industry itself, not just its most famous center of production. Starting in 1906 and accelerating thereafter, movie companies headed west to California, seeking sunny weather, low taxes, and weak unions. By 1919, more than four-fifths of the world’s movies were produced in the Los Angeles area.

Businessmen embraced the new popular culture of Hollywood because it stimulated consumption. People wanted to own the cars and clothes they saw in movies and magazines. Young people and adults began modeling their clothing, speech, and behavior after stars of movies, vaudeville, radio, and professional sports.

Many of the new celebrities came from working-class, immigrant backgrounds. Rudolph Valentino, Hollywood’s top male romantic lead, had been born in Castellaneta, Italy. The great magician Harry Houdini was the son of a Jewish tailor. And baseball slugger Babe Ruth, the descendant of German immigrants, lived in a poor neighborhood on the Baltimore waterfront. Fans adored these celebrities in part because they spurned the Protestant middle-class values of self-restraint, hard work, and “character.” Valentino’s erotic portrayal of exotic and passionate characters in movies such as The Sheik challenged the Victorian ideal of restrained and decorous masculinity. Followers celebrated Ruth not only for his extraordinary athletic accomplishments, but also for his oversized appetites for food, clothes, alcohol, and sex.

These icons of mass culture competed with the traditional values of families and local communities in providing the primary channels for children’s access to the outside world. Generational conflict often resulted, especially in immigrant homes. Grace Gello, who grew up in an Italian family on New York’s Lower East Side, remembered that she and her fiancé would occasionally take the afternoon off from work to go to the movies. “We didn’t do this too much because we were afraid of my father. He would say, ‘If I catch you, I’ll break your neck.’”

For immigrants and their children and for farmers, miners, millworkers, and laborers, movies provided a window on the middle- and upper-class world, with which they had no direct contact. Kate Simon, a writer who grew up among immigrants in the Bronx, New York, recalled that from movies, “we learned how tennis was played and golf, what a swimming pool was and what to wear if you ever got to drive a car . . . and of course we learned about Love, a very foreign country like maybe China or Connecticut.”

In a sense, all of America was Americanizing. Mass culture was not only Americanizing immigrants but also redefining the nation’s values. Such changes proved threatening, particularly to traditional arbiters of public values—ministers, political leaders, police officials, social workers, and academics—who generally opposed change. Although the movie moguls profited from their role as the nation’s new cultural brokers, they also worried that if they went too far, they would provoke censorship and attack, a serious concern, since so many of them were themselves immigrants. After a young actress was found dead in the aftermath of a drunken party hosted by “Fatty” Arbuckle, one of the nation’s favorite film comedians, the producers realized that they needed a frontman. For $100,000 a year, they hired Will Hays, President Harding’s postmaster general and an elder of the Presbyterian Church, to set up a system of industry self-policing. Baseball owners had followed a similar strategy a few years earlier, after the scandal in which gamblers fixed the 1919 World Series.

Women as Workers and Consumers

“In an age where the emphasis is on consumption,” social scientist Lorine Pruette commented in 1929, “women need wages . . . to keep themselves afloat on the tide. . . . The manufacturers need the women as consumers, need the two-wage family and its demands to keep the factories going.” Indeed, to help their families survive or simply to live more comfortably, increasing numbers of married women took jobs outside the home during the 1920s. Their numbers were still modest; in 1930, fewer than 12 percent of all married women worked for wages, and women who could afford to stay home generally did so. But those who did take jobs were part of a long-term trend; since 1900, the percentage of married working women had doubled. Because poverty remained the most important determinant of whether a woman worked, married Black women were five times as likely as married white women to be in the paid labor force.

Although more women entered the workforce in the 1920s, they did not do so on an equal basis with men. In 1929, the average working woman earned only fifty-seven cents to a man’s dollar. A key reason for the disparity was that jobs were generally designated as “male” or “female.” For example, both men and women, most of them white, could be found in all types of clerical jobs, but bosses preferred women for routine tasks and men for sales and general clerical posts. Employers rarely promoted women to managerial positions.

To lower costs, companies introduced a variety of calculating machines and Dictaphones (recording machines that eliminated the need for shorthand skills). Following scientific management practices, companies divided the work process into narrow, highly routinized jobs, and the pace of clerical production increased. In large “typing pools,” women typed documents for bosses they never saw. These pools resembled light manufacturing plants more than the small company offices of an earlier era.

More satisfying career opportunities opened up in professions that were filled primarily by college-educated women: social work, nursing, teaching, and librarianship. A small but growing number of women found jobs as lawyers, bankers, religious leaders, and editors. Although a handful of these women became well known, most were marginalized, excluded from power even in professions they dominated. Some professions almost entirely excluded women. Most medical schools had quotas of just 5 percent for women students; only a handful of hospitals would hire women interns. The law schools of Columbia and Harvard would not consider admitting women. All told, only 14 percent of wage-earning women occupied professional positions in 1930, while 19 percent held clerical positions and 30 percent worked in domestic service or as waitresses and beauticians.

As women began to gain some economic independence, their legal and social status began to change. Ratification of the Nineteenth Amendment led to the extension of other legal rights; by the early 1920s, women could serve on juries in twenty states. But women did not use the vote to win greater social and economic equality, as suffragists had long hoped. Property, marriage, and divorce laws remained unfavorable to women. Fewer women than men voted throughout the decade, and women voters made choices on bases similar to those of men, shaped less by gender than by class, region, age, race, and religion. As a result, the major political parties ignored women voters and women’s issues. An Equal Rights Amendment (ERA) to the Constitution repeatedly failed to pass Congress, although it was the focus for many middle- and upper-class women activists. Supporters of hard-won legislation to protect women workers opposed the ERA, fearing that it would invalidate those protections. The only important legislative triumph for the women’s movement in the 1920s was the Sheppard-Towner Maternity and Infancy Protection Act of 1921, which for the first time provided federal funds for health care.

Cultural habits may have changed more than politics. The popular 1920s stereotype of the “flapper”—a “new woman” who wore short, loose dresses and used cosmetics, smoked and drank in public, and embraced the sexual revolution—both mirrored and exaggerated the popular rejection of the genteel, corseted Victorian feminine ideal of the late nineteenth century. These changes did not come about overnight or affect all women equally; working-class women had pioneered some of the new attitudes before World War I. And while the 1920s brought much franker and more open discussions of sexuality, sexual behavior had probably begun to change earlier than that. Whatever the date of the change, Americans in the 1920s were waking up to the realization of a sexual revolution. One study found that women who were born after 1900 were twice as likely as those born earlier to have premarital sex. Birth control, more widely available and more reliable in the 1920s, was one force behind the change in behavior. “Rubber has revolutionized morals,” declared family court judge Ben Lindsey in 1929.

African American Life in the 1920s and the Harlem Renaissance

The new consumer culture of the 1920s affected Black Americans less than most other groups. Most remained poor and in the rural South. Still, the Great Migration to the North continued and brought profound changes as 824,000 African Americans left the South between 1920 and 1930. The Black populations of New York, Chicago, and Cleveland more than doubled; that of Detroit tripled. Some growth came from West Indian immigration; by 1930, 50,000 foreign-born Black people lived in New York alone. Many Black people who remained in the South moved to cities; by the end of the decade, one in five African Americans lived in a southern city, and more African American men held blue-collar jobs than farmed. These growing Black communities provided the basis for a relatively prosperous Black urban culture, extraordinary for its intellectual and artistic accomplishments and significant for its new political militancy.

The movement into urban and blue-collar jobs did not end discrimination. African Americans could generally get only the least attractive jobs. For example, high-paying jobs in the auto industry went almost exclusively to white workers. Henry Ford, however, pursued a different strategy and had roughly 10,000 African American employees by 1926. Ford’s hiring practices were not altruistic. The African American workers his company recruited, generally through local ministers, were unusually loyal—an important consideration, given Ford’s fear of unions.

As the northern Black working class expanded, opportunities developed for Black professionals and businesspeople, and the African American class structure became more complex. Most northern cities had a small Black elite that included college-educated lawyers, doctors, and ministers who served Black clienteles, along with successful musicians, saloonkeepers, and dressmakers, many of whom catered to white patrons. A new Black middle class emerged to provide services—newspapers, drugstores, insurance, funerals—to their community. For example, Caribbean-born Vollington Bristol used his savings from working as head bellhop at Detroit’s Fairfax Hotel to open a successful funeral parlor that catered to the city’s booming Black population. Black women’s clubs, which drew on the new middle class, campaigned for community issues and became a political force in many communities.

These developments fostered a new political militancy and heightened racial pride throughout Black America. African Americans who had fought in or supported World War I demanded greater democracy at home. The National Association for the Advancement of Colored People (NAACP) benefited from this new spirit; by 1919, it had 91,000 members. The organization worked, largely through lobbying and the courts, for civil rights and an end to lynching. It won some victories, such as a 1927 Supreme Court decision (Nixon v. Herndon) that declared unconstitutional a Texas law barring Black people from voting in Democratic Party primaries. But the NAACP failed to win equal voting rights or the allegiance of most poor African Americans, who sometimes viewed it as a club for liberal whites and well-to-do Black people such as Dr. L. A. Nixon, the Black El Paso doctor who brought the case against white-only primaries.

In contrast, Marcus Garvey’s Universal Negro Improvement Association (UNIA) garnered massive support. Garvey founded the UNIA in his native Jamaica in 1914 and brought it to the United States two years later. The UNIA sought Black nationhood and the redemption of Africa from colonialism and promoted self-help and self-respect. Garvey preached pan-Africanism, which viewed Black men and women throughout the world as one people and linked the struggle for Black rights outside of Africa with the fight to free Africa from colonial rule. Garvey’s appeal rested on Black pride. He publicized Black achievements, opposed interracial marriage, and, in a reversal of convention, looked down on light-skinned African Americans. Arguing that Black people should develop their own separate institutions and commercial enterprises, he criticized the NAACP’s goal of racial integration.

Under Garvey’s charismatic leadership, the UNIA attracted followers from virtually every segment of Black America, especially West Indians, recent migrants to the North from the South, and members of the new Black middle class. Within a few years, it had a million members in the United States as well as branches in Africa and the Caribbean. But the UNIA’s success made it a target of government harassment, and mishandled finances and political infighting devastated the organization. Garvey went to prison for mail fraud in 1925 and was deported to Jamaica two years later. His removal shattered the UNIA, although his ideas remained influential.

Other political groups, including the major political parties, competed for Black support. As the northern African American population grew, Black votes became significant to the urban machines. Most Black people voted for Republicans, the party of Lincoln. But by maneuvering between the two parties, Black political leaders won influence and patronage. In New York City, police, fire, and other municipal departments began to hire Black workers. Soon, Black candidates were running for office. Oscar De Priest, the Alabama-born son of formerly enslaved parents who was Chicago’s first African American alderman, became, in 1928, the first northern Black representative elected to Congress.

Postwar African American political activism was paralleled by a flowering of Black culture known as the Harlem Renaissance. Reflecting the racial self-confidence nourished by the growing northern Black communities, the writers and artists of the Harlem Renaissance expressed a new pride in Black racial identity and heritage. “Negro life,” wrote Alain Locke, who coined the phrase “the New Negro” and became the movement’s leading philosopher, “is seizing its first chances for group expression and self-determination.” In music, poetry, novels, plays, dance, painting, sculpture, and photography, Black artists and intellectuals celebrated African American spiritual and cultural traditions, rejecting white values and stereotypes.

In an era when pan-Africanism was emerging as an intellectual and political movement, some Black artists looked to Africa for inspiration. Locke’s writings and speeches about Black culture echoed Woodrow Wilson’s references during World War I to a people’s need for self-determination. That theme recurred in Black efforts to win liberation for colonized Africans and to identify with them. In “The Negro Speaks of Rivers,” Langston Hughes, perhaps the greatest poet of the Harlem Renaissance, tied together the Euphrates, the Nile, and the Mississippi as “ancient, dusky rivers [that] I’ve known.” Other writers, however, such as Zora Neale Hurston and Jean Toomer, concentrated instead on the folk culture of the South.

More popular African American cultural forms flourished too. Singers such as Florence Mills and Ethel Waters and dancers such as Bill Robinson traveled a growing circuit of Black nightclubs, theaters, and vaudeville houses. Jazz, an immensely popular and sophisticated musical form, thrived in the developing Black communities of the North. By the early 1920s, Louis Armstrong and other important New Orleans jazzmen had moved to Chicago. Edward “Duke” Ellington moved from Washington, D.C., in 1927 to New York City and began playing at Harlem’s Cotton Club.

The Culture Wars of the 1920s

With cultural change came cultural conflicts. Some pitted city dwellers, who tended to tolerate a wider variety of behaviors (including drinking and a visible gay culture), against country folk. Protestant fundamentalists viewed modern, urban life with particular suspicion and fought against the teaching of modern ideas, including evolution, in the schools. Although associated with the rural South, fundamentalists also won strong support in the growing cities. So did the most vicious opponents of cultural change: the Ku Klux Klan, which spread hatred and spurred violence against African Americans, Jews, immigrants, and Catholics. Probably the bitterest cultural battle was played out over drinking. Antiliquor forces had won their greatest victory with the passage of the Eighteenth Amendment in 1919. But that victory proved their undoing, as widespread evasion of Prohibition increasingly led Americans to question whether “morality” could be legislated or policed.

The Urban-Rural Divide

Ethnic enclaves in large cities continued to preserve old traditions and respect for family obligations. Tens of millions of Americans grew up in homes where only the children spoke English, and many went to religious schools that fostered Old World traditions. As late as 1940, New York had 237 foreign-language periodicals. But although their parents bought radios to listen to the foreign-language programs, the children changed the stations and sneaked off to the movies.

Highbrow culture, too, was centered in the large cities. Many of the best-known writers of the 1920s, including Sinclair Lewis, Sherwood Anderson, and H. L. Mencken, portrayed people who lived in small towns as narrow, hypocritical, and spiritually impoverished. Babbitt, the title of Lewis’s satirical novel about a small-town businessman, entered the language as a synonym for a narrow and self-satisfied conformist.

Cities permitted behaviors and institutions that were unacceptable in small towns. Major cities such as New York became the center of an increasingly visible queer subculture that could be found in certain bars, tearooms, rooming houses, bathhouses, restaurants, and cafeterias, as well as in particular neighborhoods such as Greenwich Village and Harlem. Prohibition pushed the lives of LGBTQ+ people further into the open. By criminalizing drinking behavior that even many middle-class people sanctioned, Prohibition undercut conventional moral authority and fostered a set of institutions (“speakeasies”) and an amusement district (Times Square) where gay men and lesbians could flout social convention. By the late 1920s, New York was in the throes of a “pansy craze,” with drag balls that attracted thousands of spectators and Broadway plays that featured queer themes. Gladys Bentley, an African American blues singer known for performing in men’s clothing, married her white girlfriend in a well-publicized ceremony in Atlantic City.

Not surprisingly, such transgressions against sexual and gender conventions brought a backlash. In 1927, New York police raided plays such as The Captive and Sex and arrested their casts, including the flamboyant Mae West. The New York state legislature quickly followed with a ban on plays “depicting or dealing with the subject of sex degeneracy, or sex perversion.” Four years earlier, the legislature had lashed out against queer bars and “cruising” by defining same-sex solicitation as a form of disorderly conduct—a statute that was often interpreted to mean that all LGBTQ+ gathering places were “disorderly.” By the 1930s, continued legal harassment and police raids had erased queer life from public view.

Christian Fundamentalism and the Scopes Trial

Culturally conservative Americans saw the growing visibility of urban LGBTQ+ culture as one of many signs that cities were the source of sin, depravity, and irreligion. Many of these Americans supported a Protestant fundamentalist movement that had been gaining strength since the late nineteenth century. The movement reacted against modern urban life, modern science, and liberal Protestants who tolerated both challenges to traditional religion. The term fundamentalist came into use in 1909 after publication of a series of pamphlets called The Fundamentals, which denounced as corrupt modern scientific theories such as evolution and modern pastimes such as dancing. Intellectuals and urban Americans in the 1920s (as now) saw fundamentalism as a sign of rural backwardness and opposition to change. H. L. Mencken relentlessly mocked “the forlorn pastors who belabor half-wits in the galvanized iron tabernacles behind the railroad yards.”

Yet fundamentalist and evangelical Christians had a strong presence in the cities and readily adopted modern means of communication in their proselytizing. The evangelist Aimee Semple McPherson, for example, may have started out preaching at revival meetings in tents, but by the mid-1920s, she was presiding over the spectacular Angelus Temple in Los Angeles, where tens of thousands of people heard her sermons, which were also broadcast over the radio. McPherson’s success flowed not just from her message and effective use of the new technology (and her legendary beauty), but also from the incredible growth of the city of Los Angeles, which added 1.3 million new residents in the 1920s.

But if adherents of fundamentalism could be found in cities all over the country, the decade’s most famous confrontation over the truth of the Bible erupted in the small southern town of Dayton, Tennessee, in 1925. There, fundamentalists, hostile to any idea that ran counter to a literal reading of the Bible, rallied against the teaching of Charles Darwin’s theory that human beings shared an evolutionary link with other primates. The Tennessee legislature had recently passed a law prohibiting teaching that “man has descended from a lower order of animals.” When the American Civil Liberties Union chose Dayton high school teacher John T. Scopes to defy the law intentionally as a test of its constitutionality, fundamentalists were outraged. They enlisted former secretary of state and three-time Democratic presidential candidate William Jennings Bryan to aid the prosecution. Clarence Darrow, a prominent liberal lawyer who had defended many political and criminal celebrities, headed Scopes’s defense team. The trial was a carnival of journalists and onlookers; on the street outside, vendors sold Bibles and toy monkeys.

The most famous moment in the Scopes trial came when the defense—prohibited by the judge from calling scientists to defend evolution—put Bryan on the stand as an expert on the Bible. Darrow ridiculed him before the court and the nation, forcing Bryan to admit that some biblical passages could not be interpreted literally. But Bryan’s testimony had no real bearing on the case, and it exaggerated the differences between Darrow and Bryan, who in fact shared a commitment to social justice. Bryan’s fundamentalism was linked to his populism: he had long opposed social Darwinism, the application of Darwin’s principle of “survival of the fittest” to human society and, in particular, to struggling farmers, laborers, and small businessmen.

Both fundamentalists and scientists emerged from the trial as losers. In the face of the scorn heaped on them by intellectuals, fundamentalists retreated from political life and did not fully reenter politics until the 1980s. Scopes was convicted (although his sentence was later thrown out on a technicality), and Tennessee’s antievolution law remained on the books until the 1960s. A few other states passed antievolution laws, and publishers meekly complied by removing discussions of evolution from biology textbooks sold across the nation.

An Upsurge of Racism and Nativism

Like fundamentalism, the Ku Klux Klan is often associated with southern rural life. Yet in the 1920s, the Klan, too, had a major following in the cities. In its twentieth-century heyday, roughly half of the Klan’s three million members lived in metropolitan areas. And although it had considerable support in the South, the Klan was strongest in the Midwest and the Southwest. Founded in 1915 and inspired by the Reconstruction-era organization of the same name, the Klan shared with its nineteenth-century namesake a deep racism, a fascination with mystical regalia, and a willingness to use violence to silence its foes. Unlike its predecessor, it professed anti-Catholicism and anti-Semitism as strongly as it affirmed racism.

The intolerance and vigilantism that were prevalent during World War I had paved the way for the Klan’s rise. Farmers going through hard times, underpaid workers facing competition from immigrants and African Americans, and small businessmen who were losing out to national manufacturers and chain stores all lashed out through the Klan against those they believed were threatening their economic well-being. Country dwellers resented the diminishing importance of rural virtues; city dwellers associated foreigners with gangs and crime. Old-stock urban Protestants felt displaced by Catholics and Jews, and those who remembered the Red scare were left with the suspicion that immigrants were inherently subversive.

Riding on fears of immigrants, Communists, labor unions, African Americans moving north, and Jews and Catholics rising in the economic and social order, the Klan staged parades and cross-burning rallies across the country. Klan leaders gained strong influence over state governments in Texas, Oklahoma, Oregon, Louisiana, Kansas, and especially Indiana. Within a few years, however, a series of sexual, financial, and political scandals had tainted the Klan, and political leaders in several states moved against it.

Although the Klan retreated, the cultural antagonisms that supported it remained strong and surfaced in conflicts over Prohibition. In 1919, the Eighteenth Amendment to the Constitution was ratified, making it illegal to manufacture, sell, transport, import, or export drinking alcohol. Ratification, however, did not reflect a national consensus on drinking. Although the law was not openly flouted at first, liquor flowed into the country across U.S. borders. Bootlegging and the production of alcohol for medical and religious purposes added to the supply. Alcohol consumption did decline, perhaps by as much as half, but tens of millions of normally law-abiding Americans either broke the law or abetted those who did. Even President Harding had a favorite bootlegger. It became apparent that enforcing Prohibition would require huge police forces.

Opponents of Prohibition argued that because authorities could not enforce the law, it bred crime, corruption, and a disregard for the rule of law in general. Indeed, the vast profits to be made from illegal liquor fed gangsters who were also involved in prostitution and high-interest loans. With profits rolling in, organized crime offered poor Italians, Jews, Poles, and Irish a means of upward mobility. Gangster organizations grew in size, sophistication, and power, fighting to establish regional fiefdoms by using the latest technology, from fast automobiles to submachine guns. They also bought off politicians and police wholesale. In some cities, gangs became an integral part of local politics. Al Capone and other flamboyant gangsters became celebrities.

“The very fact that the law is difficult to enforce,” an official of the Anti-Saloon League commented in 1926, “is the clearest proof of the need of its existence.” But by then, the failure of Prohibition was obvious, especially in urban areas. Organized opposition, once confined to the unions and the liquor interests, mounted. Of nine state referenda that were held in an attempt to modify the law, the “wets” (opponents of Prohibition) won seven. Public opinion polls showed that, especially in the large industrial states, wets predominated.

Native Americans defended another front in the culture wars of the 1920s. Backed by Christian missionaries, Hubert Work, Secretary of the Interior in the mid-1920s, attacked Native peoples’ cultures and religions, especially Peyotism (today known as the Native American Church), in which worshippers ingested a hallucinogen during a holy rite. Work charged that “gross immorality . . . accompanies native dances.” Others lashed out at “Indian paganism” and what they described as “horrible, sadistic, and obscene” heathen practices. Defenders of Native cultures, including both Native Americans and white supporters, argued for reform of federal Indian policy based on the Wilsonian principle of self-determination. Conservative critics labeled these defenders “Red Progressives” and “anti-American, and subversive . . . agents of Moscow.” Over the course of the 1920s, however, Native peoples won some modest concessions. In 1924, for example, Congress finally passed a law conferring citizenship on all Native people born in the United States. But many states continued to prevent Native Americans from voting. More far-reaching reform of federal Indian policy would not come until the next decade.

Conclusion: Hoover and the Crash

The urban-rural tug of war made its way into the voting booths in the 1928 presidential election. By 1928, the balance of power within the Democratic Party had shifted decisively toward the cities. New York Governor Al Smith easily won the party’s presidential nomination. The contrast—at least in image—between Smith and Republican Herbert Hoover could not have been greater. Smith was an anti-Prohibition “wet,” a Catholic, a product of urban ethnic working-class life. Radio coverage of the campaign broadcast his heavy New York accent throughout the nation. Hoover stressed his boyhood in rural Iowa, professing his love for fishing and the simple, small-town life. In fact, he was a sophisticated businessman, the first president to rise to power from the ranks of the managerial elite rather than through party politics.

Prohibition was a powerful issue in the election. Smith was attacked as the candidate of foreigners and drinkers. Because many immigrants had no taboos about alcohol, the “drys” had long identified their crusade with the “100 percent Americanism” ideas of the war era and with the preservation of the American way of life. But it was probably religion, more than anything else, that shaped voting patterns in 1928. Smith faced a vicious campaign of anti-Catholic attacks, including rumors that he planned to extend the recently built Holland Tunnel across the Atlantic so that it would connect the White House with the Vatican, and that he would annul all Protestant marriages and declare all the children of these marriages to be bastards. Hoover was not only a Protestant with a rural image; he also had another basic advantage: The Republicans got credit for the nation’s prosperity. With both a strong economy and religious prejudice behind him, Hoover won by a landslide, receiving 444 electoral votes to Smith’s 87. Thanks to increased participation by immigrant voters, however, Smith won in the nation’s twelve largest cities, marking the Democrats as the party of urban America.

Hoover’s victory capped his long, successful career in industry, relief work, and government. But he had little time to enjoy it. Within a year, the country was plunged into a devastating depression. Hoover could not remedy the economy’s fatal weakness: the tendency of industrial production to far outstrip the American people’s ability to consume. The stock market crashed in October 1929. The “motor city” of Detroit—the exemplar of 1920s prosperity—soon had the highest jobless rate in the nation. By August 1931, Ford Motor Company, which employed 128,000 workers in 1929, had only 37,000 workers. By the early 1930s, Hoover’s political reputation had been destroyed. The man who, along with Henry Ford, perhaps best symbolized America in the 1920s became one of the most hated men in the country. In a popular joke of the day, Hoover asked Secretary of the Treasury Mellon for a nickel—the price of a pay-phone call—to “call a friend.” Mellon replied: “Take a dime; call all your friends.”

The shattering of the idols of the vaunted new era—F. Scott Fitzgerald, Henry Ford, Herbert Hoover—suggested to many Americans that they would need to look in very different directions to cope with the hard times ahead.

Timeline

1920

The Eighteenth Amendment, ratified in 1919, takes effect, prohibiting the manufacture and sale of liquor.

1921

Congress enacts the Emergency Quota Act to control the flow of immigrants.

1922

Some 600,000 coal miners strike and win some gains, but unions are in sharp decline.

1923

The Ku Klux Klan, which was refounded in Atlanta in 1915, reaches its peak membership.

1924

Banker Charles Dawes brokers a plan to reduce German reparations and save the German economy; the plan fails.

1925

Alain Locke’s The New Negro is published.

1926

American marines suppress the Nicaraguan nationalist rebellion led by Augusto Sandino and impose a dictatorship under General Anastasio Somoza.

1927

The United States deports Marcus Garvey to Jamaica; his departure shatters the Universal Negro Improvement Association, which had won mass support after World War I.

1928

Oscar De Priest of Illinois becomes the first Black representative elected to Congress from the North.

1929

The Agricultural Marketing Act is passed to provide price support for farm products.

Additional Readings

For more on business conservatism at home and abroad, see:

George Black, The Good Neighbor: How the United States Wrote the History of Central America and the Caribbean (1988); Ellis W. Hawley, The Great War and the Search for a Modern Order (1979); Lawrence Levine, Defender of the Faith: William Jennings Bryan: The Last Decade, 1915–1925 (1965); and Emily Rosenberg, Spreading the American Dream: American Economic and Cultural Expansion, 1890–1945 (1982).

For more on immigration restrictions, see:

Roger Daniels, Guarding the Golden Door: American Immigration Policy and Immigrants Since 1882 (2004); John Higham, Strangers in the Land: Patterns of American Nativism, 1865–1925 (1955); Mae Ngai, Impossible Subjects: Illegal Aliens and the Making of Modern America (2004); David M. Reimers, Unwelcome Strangers: American Identity and the Turn Against Immigration (1998); and Daniel J. Tichenor, Dividing Lines: The Politics of Immigration Control in America (2002).

For more on the new economy and shifts in manufacturing, see:

Gregg Andrews, City of Dust: A Cement Company Town in the Land of Tom Sawyer (1996); Alfred D. Chandler, Jr., Scale and Scope: The Dynamics of Industrial Capitalism (1990); Jeffrey Marcos Garcilazo, Traqueros: Mexican Railroad Workers in the United States, 1870–1930 (2012); Jacquelyn Dowd Hall, et al., Like a Family: The Making of a Southern Cotton Mill World (1987); Rick Halpern, Down on the Killing Floor: Black and White Workers in Chicago’s Packinghouses, 1904–1954 (1997); Akira Iriye, The Globalizing of America, 1914–1945 (1993); Ronald W. Schatz, The Electrical Workers: A History of Labor at General Electric and Westinghouse, 1923–1960 (1983); and Zaragosa Vargas, Proletarians of the North: A History of Mexican Industrial Workers in Detroit and the Midwest, 1917–1942 (1993).

For more on the expansion of American consumer culture and transformations in daily life, see:

Elspeth H. Brown, The Corporate Eye: Photography and the Rationalization of American Commercial Culture, 1884–1929 (2005); Ronald William Edsforth, Class Conflict and Cultural Consensus: The Making of a Mass Consumer Society in Flint, Michigan (1987); Stuart Ewen, Captains of Consciousness: Advertising and the Social Roots of Consumer Culture (1976); Carolyn M. Goldstein, Creating Consumers: Home Economists in Twentieth-Century America (2012); T. J. Jackson Lears, Fables of Abundance: A Cultural History of Advertising in America (1994); Roland Marchand, Advertising the American Dream: Making Way for Modernity, 1920–1940 (1985); and Nathan Miller, New World Coming: The 1920s and the Making of Modern America (2003).

For more on mass culture, radio, music, and the movies, see:

W. Fitzhugh Brundage, ed., Beyond Blackface: African Americans and the Creation of American Popular Culture, 1890–1930 (2011); Lizabeth Cohen, Making a New Deal: Industrial Workers in Chicago, 1919–1939 (1990); Tona J. Hangen, Redeeming the Dial: Radio, Religion, and Popular Culture in America (2002); David Nasaw, Going Out: The Rise and Fall of Public Amusements (1993); Elena Razlogova, The Listener’s Voice: Early Radio and the American Public (2011); Steven J. Ross, Working-Class Hollywood: Silent Films and the Shaping of Class in America (2000); and Susan Smulyan, Selling Radio: The Commercialization of American Broadcasting, 1920–1934 (1994).

For more on women as workers and consumers, see:

Susan Porter Benson, Counter Cultures: Saleswomen, Managers, and Customers in American Department Stores, 1890–1940 (1986); Irving Bernstein, The Lean Years: A History of the American Worker, 1920–1933 (1960); Julia Kirk Blackwelder, Now Hiring: The Feminization of Work in the United States, 1900–1995 (1997); Catherine Ceniza Choy, Empire of Care: Nursing and Migration in Filipino American History (2003); Nancy F. Cott, The Grounding of Modern Feminism (1987); Dana Frank, Purchasing Power: Consumer Organizing, Gender, and the Seattle Labor Movement, 1919–1929 (1994); David M. Katzman, Seven Days a Week: Women and Domestic Service in Industrializing America (1981); Alice Kessler-Harris, Out to Work: A History of Wage-Earning Women in the United States (1982); Angel Kwolek-Folland, Incorporating Women: A History of Women and Business in the United States (1998); and Rebecca Jo Plant, Mom: The Transformation of Motherhood in Modern America (2010).

For more on African American life in the 1920s and the Harlem Renaissance, see:

Davarian L. Baldwin, Chicago’s New Negroes: Modernity, the Great Migration, and Black Urban Life (2007); Beth Tompkins Bates, Pullman Porters and the Rise of Protest Politics in Black America, 1925–1945 (2001); Kevin Boyle, Arc of Justice: A Saga of Race, Civil Rights, and Murder in the Jazz Age (2004); Genevieve Fabre and Michel Feith, eds., Temples for Tomorrow: Looking Back at the Harlem Renaissance (2001); Nathan Huggins, Harlem Renaissance (1971); David Krasner, A Beautiful Pageant: African American Theatre, Drama, and Performance in the Harlem Renaissance, 1910–1927 (2002); David Levering Lewis, W.E.B. Du Bois: A Biography (2009); David Levering Lewis, When Harlem Was in Vogue (1989); Neil McMillen, Dark Journey: Black Mississippians in the Age of Jim Crow (1989); Judith Stein, The World of Marcus Garvey: Race and Class in Modern Society (1985); and Joe William Trotter, Black Milwaukee: The Making of an Industrial Proletariat, 1915–1945 (2006).

For more on the culture wars of the 1920s, see:

Douglas Carl Abrams, Selling the Old-Time Religion: American Fundamentalists and Mass Culture, 1920–1940 (2001); Stanley Coben, Rebellion Against Victorianism: The Impetus for Cultural Change in 1920s America (1991); Lynn Dumenil, The Modern Temper: American Culture and Society in the 1920s (1995); Edward J. Larson, Summer for the Gods: The Scopes Trial and America’s Continuing Debate over Science and Religion (1997); Allan J. Lichtman, Prejudice and the Old Politics: The Presidential Election of 1928 (1979); and George M. Marsden, Fundamentalism and American Culture (1980).