
How $15 an hour became the de facto minimum wage

INSIGHT ARTICLE

It is becoming abundantly clear that the minimum wage has gone the way of the buggy whip and that a new de facto entry-level wage of $15 per hour is becoming the national standard in the private sector.

The shock to the economy unleashed by the pandemic and the response by workers have radically transformed the wage-earning landscape. And this is taking place in the private sector, not in federal, state or local governments.

Middle market firms that comprise the real economy are adapting to this transformation by paying higher wages, improving working conditions, offering more flexibility to workers—or all of the above.

At the same time, many firms are looking to substitute technology for labor to offset the rising costs. But this dynamic will remain fluid as the natural tension between labor and capital evolves.

In the end, the era of surplus labor and the setting of entry-level wages well below the cost of living appear to be in the rearview mirror.

A short history of monopsony

The pandemic and the economic shutdown allowed employees the opportunity to rethink their working conditions. In many cases, workers realized that their jobs were just not worth their time.

And because of the rise of online shopping and, most important, Amazon's $15 starting wage, the private sector has found itself competing for a dwindling supply of low-wage employees after four decades of what economists call monopsony power.

At first, employers were shocked that they could not attract quality workers at wage scales they were used to paying. But then they realized that the landscape had shifted, and that the de facto minimum wage had become $15 an hour. Walmart Inc., McDonald’s and Chipotle Mexican Grill have already announced their intention to match Amazon’s starting wages, and small businesses will be hard-pressed not to follow.


Evolution of the minimum wage

Monopsony describes a labor market dominated by a single buyer of labor, one in which workers have few competing choices of employer. The earliest examples were the coal mining towns of West Virginia. But in the modern era, Walmart Inc. in the South, Rite Aid in New England, and McDonald’s everywhere have become examples.

As each of those establishments moved into areas that once had scores of small businesses in downtown commercial districts, employment choices declined from many to just a few as those small businesses closed. And because the minimum wage didn’t keep up with the rising cost of living, the working poor increasingly had fewer choices for work. In many cases, the choice was to join the multinational company at wages that ever so steadily lost buying power, or perhaps go on public assistance.

That was not the original intent of the minimum wage. Introduced in 1938, the federal minimum wage was set at 25 cents per hour, roughly 50% of the average hourly rate of 50 cents. It was an efficient way to bring working families out of the poverty of the Great Depression.

By 1964, at the height of U.S. industrial power, the minimum wage of $1.25 remained at 50% of the average hourly rate of $2.50.

Over the next five decades, however, increases in the minimum wage did not keep up with average hourly earnings. By 2019, before the pandemic, the minimum wage had fallen to 30% of average hourly earnings, while the purchasing power of the dollar had declined by nearly 90% since 1964.
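The erosion described above is simple arithmetic. Here is a minimal sketch of the minimum-wage-to-average-earnings ratio over time; the 1938 and 1964 pairs come from the figures cited in this article, while the 2019 average hourly earnings figure of roughly $23.50 (for production and nonsupervisory workers) is an illustrative assumption, not a number from the text:

```python
# Minimum wage as a share of average hourly earnings.
# 1938 and 1964 figures are from the article; the 2019 average
# hourly earnings value (~$23.50) is an assumed, illustrative figure.
wage_data = {
    1938: (0.25, 0.50),    # (federal minimum wage, average hourly earnings)
    1964: (1.25, 2.50),
    2019: (7.25, 23.50),
}

for year, (minimum, average) in sorted(wage_data.items()):
    ratio = minimum / average
    print(f"{year}: ${minimum:.2f} / ${average:.2f} = {ratio:.0%} of average earnings")
```

Run as-is, the sketch shows the ratio holding at 50% in 1938 and 1964 and falling to roughly 30% by 2019, matching the trajectory described above.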

In effect, a stagnant minimum wage acted as an artificial ceiling on wages. As manufacturing jobs began disappearing from the American economy in the 1980s, employers faced less competition for workers. And because low-income families don’t have the means to move to higher-paying areas, there were plenty of willing workers as long as their new jobs were paying anything above minimum wage.

Federal minimum wage and average hourly earnings chart

Consider that by 2019, only 3.5% of the labor force was unemployed. Yet more than 10% of the population (including the working poor) was still living below the poverty line.
Adjusted for inflation, the current minimum wage is, in fact, lower than the real minimum wage levels of most years since 1950, except for two brief periods in 1990 and 2007, both recession years.

This stands in stark contrast to broader measures of U.S. income, such as real gross domestic product per capita, which has more than quadrupled since 1950.
So why didn’t wages for all income levels keep up with modern-era price increases despite increasing demand for labor?

  • The changing value of labor: First, the current minimum wage at $7.25 per hour can be viewed as an enshrined distortion of the value of labor. It took the shock of the pandemic and a dramatic shift in how we live, work and shop for that realization to occur.
  • The effects of consolidation: Second, there has been a consolidation of businesses over past decades, distorting the role that competition plays in a market-based economy in terms of both economic growth and employment advancement. From 1988 to 2018, the average company size in the United States increased 21.5%, from 17.7 to 21.5 employees per company. The share of U.S. organizations with more than 500 employees also increased, from 0.26% to 0.34%, while the share of all employees working at those companies jumped from 45.5% to 53.2%.

Average firm size by numbers of employees chart
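The consolidation figures above can be checked with a quick calculation; all of the numbers below are taken directly from the text:

```python
# Verifying the consolidation arithmetic cited in the article:
# average firm size grew from 17.7 to 21.5 employees between 1988 and 2018.
size_1988, size_2018 = 17.7, 21.5
growth = (size_2018 - size_1988) / size_1988
print(f"Average firm size grew {growth:.1%}")           # about 21.5%

# Share of all employees working at firms with 500+ workers.
share_1988, share_2018 = 0.455, 0.532
print(f"Large-firm employment share rose {share_2018 - share_1988:.1%} points")
```

The growth rate works out to roughly 21.5%, confirming that the percentage increase cited above is consistent with the underlying employee counts.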

What can businesses and consumers expect?

It’s no mystery that a one-time jump in low-end wages to $15 per hour will increase operating costs, which will be passed on to consumers. And if labor were to retain its newfound bargaining power, increases in earnings for low-wage occupations are likely to keep up with the inflation rate.


The Bureau of Labor Statistics reported that fewer than 2% of all hourly workers in 2019 earned the federal minimum wage or below. Those workers tended to be young, with two-fifths of minimum-wage workers under 25. They were more likely to work in the service sector, in food preparation and serving, often with pay supplemented by tips.
States with the highest percentages of hourly paid workers earning the minimum wage or below were in the South: South Carolina (about 5%), Louisiana (about 5%) and Mississippi (about 4%).

It’s hard to argue that bringing working families above the poverty level would damage the economy. One reason is that payments to low-wage families—either by increased wages or public assistance—are spent in their entirety on food, shelter and other necessities, with that consumption added to the sum of economic output.

Think of it this way: While the cost of a Big Mac might rise and your tab at your local restaurant might increase to account for the higher cost of paying the janitor, the cost to society of maintaining adequate living standards for low-wage earners through public assistance payments should diminish.

Just as pandemic subsidies brought families out of poverty, increases in wages for the people who process food or clean toilets will bring their pay more in line with their value to society.


More from our November 2021 economic report

The Real Economy: November 2021
