Every automation wave in history created more jobs than it destroyed. Will AI be different?
The pattern has held for mechanized farms, electrified factories, and computerized offices. AI tests its core assumption in new ways.

In 1900, about 41% of the U.S. workforce farmed. That share dropped to 16% by 1945, to 4% by 1970, and to just 2% by 2000, according to the White House Council of Economic Advisers.
The economy did not collapse. It reorganized.
Millions of displaced agricultural workers and their descendants moved into manufacturing, services, and professions that did not exist when the tractor arrived. Economists have long cited this pattern as evidence that technological disruption, however painful in the short term, generates more work than it eliminates.
Today, as AI drives a contraction in white-collar employment without clear precedent, the question is whether that pattern will still hold.
The engine of the cycle: displacement, reinstatement, and new tasks
MIT economists Daron Acemoglu and Pascual Restrepo formalized the mechanism behind this historical regularity in a 2019 paper in the Journal of Economic Perspectives. Automation displaces workers from tasks they used to perform, shifting the task content of production against labor. But the effects of automation are counterbalanced by the creation of new tasks in which labor has a comparative advantage. The introduction of these new tasks reinstates labor demand and raises the labor share.
This displacement-reinstatement cycle has repeated across centuries. Using data from the Federal Reserve Bank of Philadelphia, Acemoglu and Restrepo found that about half of employment growth between 1980 and 2015 took place in occupations where job titles or tasks performed by workers changed.
In other words, the economy did not just shuffle workers into existing roles. It invented new ones.
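The displacement-reinstatement tug-of-war can be sketched numerically. The toy model below is illustrative only; the parameter values and the `labor_tasks` function are hypothetical, not drawn from Acemoglu and Restrepo's paper. Each period, automation removes a share of the tasks labor performs, while new-task creation adds labor demand back.

```python
# Toy illustration of the displacement-reinstatement cycle.
# All parameter values are hypothetical; the formal model is in
# Acemoglu & Restrepo (2019, Journal of Economic Perspectives).

def labor_tasks(periods, displacement_rate, reinstatement_rate, start=100.0):
    """Track the stock of labor tasks when automation removes a share
    each period and new-task creation adds a fixed amount back."""
    tasks = start
    history = [tasks]
    for _ in range(periods):
        tasks -= tasks * displacement_rate   # tasks automated away
        tasks += start * reinstatement_rate  # new tasks where labor has an edge
        history.append(tasks)
    return history

# Balanced case: reinstatement offsets displacement, as in past waves.
balanced = labor_tasks(30, displacement_rate=0.05, reinstatement_rate=0.05)

# Weak-reinstatement case: displacement outruns new-task creation.
weak = labor_tasks(30, displacement_rate=0.05, reinstatement_rate=0.01)

print(f"balanced: {balanced[-1]:.1f} tasks")  # holds at the starting level
print(f"weak:     {weak[-1]:.1f} tasks")      # labor's task base shrinks
```

The point of the sketch is that the same displacement rate produces very different outcomes depending on how fast new tasks arrive, which is exactly the variable the article argues AI puts in doubt.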
The tractor, the dynamo, and the ATM
Each major automation wave followed a recognizable arc. Agricultural mechanization is the starkest case.
According to the USDA Economic Research Service, the number of self-employed and family farmworkers declined from 7.6 million in 1950 to 2.06 million in 2000, a 73% reduction. Hired farmworker employment fell 51% over the same period. Yet the U.S. economy absorbed these workers, creating entire sectors of employment that agriculture's decline made possible.
Electrification followed a similar logic. Between 1900 and 1930, the share of power in American manufacturing coming from electricity grew from 10% to 80%. A study published through the Centre for Economic Policy Research found that in early 20th-century Sweden, electrification led to the creation of new jobs accessible to workers with a primary education, contributing to reduced inequality and inclusive growth. The research showed that new technology in the form of electrification did not result in massive layoffs or technological unemployment. Instead, new opportunities were created.
The ATM may be the most widely cited example. As Boston University economist James Bessen documented in a 2015 analysis for the IMF, ATMs reduced the cost of operating a bank branch. The number of tellers required to operate a branch in the average urban market fell from 20 to 13 between 1988 and 2004. But banks responded by opening more branches, which increased 43% in urban areas. Teller jobs did not disappear. They changed. While ATMs automated some tasks, the remaining tasks that were not automated became more valuable.
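Bessen's figures can be checked with back-of-envelope arithmetic. Using only the numbers above, and ignoring non-urban branches, total urban teller employment per 100 original branches barely moves:

```python
# Back-of-envelope check of Bessen's ATM figures (1988-2004),
# using only the numbers cited in the text.
tellers_per_branch_1988 = 20
tellers_per_branch_2004 = 13
branch_growth = 1.43  # urban branches grew 43%

# Teller jobs per 100 original urban branches, before and after.
before = 100 * tellers_per_branch_1988
after = 100 * branch_growth * tellers_per_branch_2004

print(before)        # 2000 teller jobs
print(round(after))  # 1859 teller jobs: roughly a 7% dip, not a collapse
```

A 35% cut in tellers per branch, multiplied by 43% more branches, nets out to near-flat employment, which is why the ATM became the canonical reinstatement example.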
The computerization precedent and its limits
The personal computer era offers perhaps the closest analog to AI. After the introduction of the IBM PC in 1981, desktop computers became a standard fixture in most workplaces and transformed how work was done. Computerization eliminated routine cognitive tasks: bookkeeping, filing, scheduling. But it created demand for workers who could operate, manage, and build upon those systems.
High computer-use sectors re-organized work and introduced new products and services that disproportionately employed more educated workers. By 1993, almost half the workforce used computer keyboards at work. The pattern held, but with a cost that is instructive now. Computerization hollowed out middle-skill work. As David Autor, professor of economics at MIT, argued in a 2024 NBER working paper, the utopian vision of the Information Age was that computerization would flatten economic hierarchies by democratizing information. The opposite occurred.
Information turned out to be merely an input into decision-making, which remained the province of elite experts. This distinction matters because it reveals the structural assumption underlying every prior wave: Automation targeted specific, bounded tasks. Human judgment, creativity, and the capacity to handle ambiguity remained inputs that machines could not replicate. Workers displaced from automated tasks could move into roles that required those human capacities.
Where AI may break the pattern
AI challenges this assumption in a way no prior technology has. Previous waves of automation expanded the set of tasks performed by machines, but they reliably created new tasks requiring human cognition. Acemoglu and Restrepo warned that automation increases the size of the pie, but labor gets a smaller slice. There is no guarantee that the productivity effect is greater than the displacement effect. That warning intensifies with AI.
Acemoglu and Simon Johnson, professor of entrepreneurship at MIT, wrote in a 2023 piece for the IMF that there is no guarantee that, on its current path, AI will generate more jobs than it destroys. They use the term "so-so automation" to describe technology that displaces workers without producing enough productivity gains to generate new employment elsewhere.
Self-checkout kiosks in grocery stores bring limited productivity benefits because they merely shift work from employees to customers. Fewer cashiers are employed, but there is no major productivity boost to stimulate the creation of new jobs. Groceries do not become cheaper, and shoppers do not live differently. The risk with AI is that a similar dynamic could play out across a broader range of cognitive work.
Acemoglu and Restrepo suggested that slower employment growth over the last three decades is accounted for by an acceleration in the displacement effect, a weaker reinstatement effect, and slower productivity growth. If AI accelerates displacement without a corresponding burst of new tasks, the historical pattern breaks.
Autor, for his part, has argued that AI could go either way. AI can either enhance or undermine the value of human expertise. If AI is used to automate and simplify expert tasks, it may commodify expertise and reduce its economic worth. If AI is designed to complement human skills, it can elevate human capabilities. His thesis is that AI, if used well, can assist with restoring the middle-skill, middle-class heart of the U.S. labor market that has been hollowed out by automation and globalization. But he frames this as an argument about what is possible, not a forecast.
The historical record shows that past automation waves did create more jobs than they eliminated. It also shows that the process is slow, uneven, and painful for the workers caught in transition.
What history cannot show is what happens when the technology targets the very capacities that made human labor indispensable in every prior cycle. That is the experiment now underway.