A version of this story appeared in the daily Threat Status newsletter from The Washington Times.
OPINION:
Elected officials, business leaders and other commentators have warned that the widespread deployment of artificial intelligence will lead to severe, lasting, even economically devastating unemployment.
Last year, Sen. Bernard Sanders, Vermont independent, opined that AI will “wipe out tens of millions of decent-paying jobs,” and Ford CEO Jim Farley predicted that half of all white-collar jobs will be lost to the technology. Even AI champion Dario Amodei, CEO of Anthropic, has warned that AI could perpetrate a “white-collar bloodbath” that would push unemployment to 10% to 20% over the next five years.
History refutes these grim predictions.
For at least two centuries, each wave of transformative technology has triggered anxiety about machines wiping out jobs and leaving millions idle. In a 1962 news conference, President Kennedy remarked, “I regard it as the major domestic challenge, really, of the 1960s, to maintain full employment at a time when automation, of course, is replacing men.” In an essay written almost a century ago, economist John Maynard Keynes warned of “technological unemployment … unemployment due to our discovery of means of economizing the use of labor outrunning the pace at which we can find new uses for labor.”
Perhaps this time is different, but on each occasion so far, the alarm about technology creating long-term unemployment has been overstated. Time and again, the story has ended the same way: not with mass unemployment but rather with adaptation, new industries and higher living standards.
Artificial intelligence will eliminate — already has eliminated — some jobs and will change many others. Over the long term, it is unlikely to cause a net increase in unemployment. The historical record demonstrates that people are remarkably adept at finding new forms of work when old ones are automated away.
Concerns about “technological unemployment” are almost as old as industrialization itself. In 1811, weavers calling themselves “Luddites” destroyed textile machinery they believed threatened their livelihoods. The Industrial Revolution did displace certain skilled laborers. Yet within decades, the textile industry employed more people than ever before, as cheaper cloth generated booming demand and new roles in manufacturing, sales and logistics.
A century later, similar anxiety accompanied the proliferation of electricity and the internal combustion engine. In the early 1900s, the majority of Americans worked in agriculture. By midcentury, mechanized farming had reduced that share dramatically. Yet total employment rose. Millions left the fields for factory floors and, later, for offices and service jobs created by an expanding economy.
Even the computing revolution in the latter half of the 20th century, arguably the closest historical analog to today’s AI wave, provoked dire predictions. In 1964, an ad hoc committee of well-known social activists sent a memo to President Johnson decrying the “Cybernation Revolution” in which “the underlying cause of excessive unemployment is the fact that the capability of machines is rising more rapidly than the capacity of many human beings to keep pace.” By the 1980s, however, the increasingly computerized economy had spawned new industries, such as software, information technology services and advanced telecommunications. Employment didn’t shrink; it grew.
When new technology increases productivity, the costs of goods and services fall. Consumers, in turn, have more disposable income to spend, which boosts demand for new products and services. Automation directly eliminates some jobs, but the wealth and efficiency it creates indirectly generate others.
This is why we did not see long-term elevations in unemployment coinciding with, for example, the massive automation that swept through the manufacturing sector in the 1970s and 1980s, nor with the internet revolution that took place from the late 1990s into the early 2000s. To be sure, there were spikes in the unemployment rate above 7.5%, but these followed directly on the heels of recessions: 1973 to 1975, 1981 to 1982 (unemployment rocketed above 10% in late 1982), 1990 to 1991 and the Great Recession of December 2007 to June 2009 (unemployment again hit 10% in late 2009). High unemployment stems from macroeconomic conditions, specifically recessions.
What makes AI distinct is its ability to perform not just manual or routine tasks but also creative ones, such as writing, analysis and design. Early evidence suggests that AI will function less as a replacement for human labor and more as a force multiplier and enabler of higher-value work.
In the longer run, AI is likely to do what previous technological advances have done: shift employment rather than eliminate it. New fields, such as AI safety, data ethics, model training and human-AI collaboration, are already emerging.
Artificial intelligence will reshape the labor market, just as the steam engine, electricity and computing once did. It will be disruptive, but long experience instructs that society and the economy will adapt, as they always have. We are not running out of work; we are inventing new kinds.
• Thomas Beck spent more than a decade as a senior human resources executive with the largest health care system in the United States. Before that, he served as chairman of the Federal Labor Relations Authority and practiced labor and employment law with a global law firm.
