
Automation vs. Jobs
[June 28, 2012]

Two researchers believe automated technology explains today's high unemployment and low hiring rates. If they're right, where will jobs come from in the future?

Two hundred years ago, back when the Luddites revolted against industrialization by sabotaging mechanical looms, David Ricardo tackled the problem of automation.



Ricardo was one of the founders of modern economic theory. Like most of his contemporaries, he believed technological advances would raise living standards for everyone. Yet when he revisited the topic, he concluded that machines might one day make goods so cheaply that factories could not remain competitive if they paid laborers a living wage. To stay in business, manufacturers would have to invest in machines rather than workers.

The result would be technological unemployment. And ever since Ricardo's day, theorists have predicted that it was just around the corner. Their ranks have included such distinguished economists as John Maynard Keynes and Nobel Prize winner Wassily Leontief, as well as the groundbreaking management consultant Peter Drucker.


Yet despite massive mechanization and automation, the U.S. economy kept creating jobs. Automation eliminated some positions, but new technologies as well as rising prosperity created others.

Now, two Massachusetts Institute of Technology professors believe the economy has turned that corner. They claim that automation, not just in factories but in fields ranging from call centers and airline ticketing services to medical diagnostics and corporate supply chains, is replacing people faster than the economy can create jobs.

They are Erik Brynjolfsson, an economist and director of MIT's Center for Digital Business, and Andrew P. McAfee, the center's associate director and principal research scientist. They present their theory in a new book, Race Against The Machine: How the Digital Revolution Is Accelerating Innovation, Driving Productivity, and Irreversibly Transforming Employment and the Economy.

The authors contend that advanced information and communications technology is transforming our economy by taking over work that only humans could do before. In fact, they argue that the rapid inroads made by technology explain the central paradox of the Great Recession: economic growth without jobs.

The Great Recession officially ended in June 2009. Since then, the United States' gross domestic product has grown at 75 percent of its average rate between 1948 and 2007, reaching an all-time high. Corporate profits have reached record levels.

Ordinarily, such growth would spur companies to hire more workers. Indeed, unemployment among professionals with graduate degrees remains almost vanishingly low. Yet overall unemployment has hovered above 9 percent for most of the past three years and remains stubbornly high. The percentage of working adults is at its lowest level since 1983, when women were still entering the workforce. Once out of a job, the average worker takes 40 weeks to find a new one, almost twice as long as in any other postwar recovery.

Instead of hiring workers, companies are investing in equipment and technology, which rebounded quickly after the recession. According to Brynjolfsson and McAfee, this investment lets companies boost productivity and expand output without hiring additional workers.

They point to the Bureau of Labor Statistics' Job Openings and Labor Turnover Survey (JOLTS), which shows a significant decrease in hiring since 2000. "Lack of hiring, rather than increases in layoffs, is what accounts for most of the current joblessness," they wrote.

New Economics of IT

Because so many thinkers have cried wolf about computerization in the past, it is easy to question this thesis. Many economists claim that the slow recovery is part of the ordinary business cycle. Others believe that innovation has stagnated, leaving us without new, emerging industries to drive the economy the way computers did in the 1980s or telecommunications and the Internet in the 1990s.

In fact, few economists even consider digital technology when looking at the potential causes of unemployment. A 2010 analysis of long-term structural unemployment by the Federal Reserve Bank of Richmond never even mentioned the words "computer," "hardware," "software," or "technology." Two recent reports by the International Monetary Fund were equally silent on the topic.

Yet others have seen technological unemployment coming. In his 1995 book, The End of Work, economist Jeremy Rifkin said humanity is entering "a new phase in world history, one in which fewer and fewer workers will be needed to produce the goods and services for the global population." Economist W. Brian Arthur, currently a visiting researcher with the Intelligent Systems Lab at the Palo Alto Research Center, agrees. He calls the automated services we now see all around us, from voice-activated call centers and ATMs to e-commerce and corporate supply chains, the second economy.

Airline ticketing is an example of this invisible economy. Twenty years ago, passengers presented paper tickets to airline agents, who registered them, confirmed their flights, and checked their luggage. Today, this all takes place when we insert a credit card into a computer. But that is not all, Arthur said: "The moment the card goes in, you are starting a huge conversation conducted entirely among machines. Once your name is recognized, computers are checking your flight status with the airlines, your past travel history, your name with the Transportation Security Administration (and possibly also with the National Security Agency). They are checking your seat choice, your frequent-flier status, and your access to lounges.

"This unseen, underground conversation is happening among multiple servers talking to other servers, talking to satellites that are talking to computers (possibly in London, where you're going), and checking with passport control, with foreign immigration, with ongoing connecting flights. And to make sure the aircraft's weight distribution is fine, the machines are also starting to adjust the passenger count and seating according to whether the fuselage is loaded more heavily at the front or back." This type of conversation underscores the radical changes in technology over the past decade.

"If you look at automation before the computer age, it was about substituting machines for human muscle power," McAfee said in an interview. "If any kind of communication was needed to a customer, supplier, or employee, no machine could do that for you." Humans were essential because they recognized complex patterns. They knew what information to draw on to help customers with a complaint or to diagnose the symptoms of a disease. Equally important, they could communicate their findings.

"The reason to be concerned now is that computers have gone from being really lousy to pretty amazing at recognizing patterns and communicating," McAfee said. Google's autonomous car, IBM's Jeopardy-playing Watson supercomputer, and software that analyzes medical images for cancer are some of the examples he mentions.

"Humans are no longer the best pattern recognizers," he said. "The bundle of things for which you have to hire human beings is shrinking, and I believe that is happening fairly quickly.

"Once computers get better than people, you don't have to hire people to do that job any more. That doesn't mean that people can't find work. There will always be an amount of work to do, but they won't like the wages they are offered," he said.

It's David Ricardo all over again.

The Chessboard

So, why is this happening now? Simply put, computers have grown far more powerful. Over the past half century, microprocessors have grown 1,000 times faster, and the algorithms that run on them have become 43,000 times more efficient. Moreover, processors continue to follow Moore's Law, doubling their performance every 18 months, as integrated circuits have done since their introduction in 1958.

This is only the beginning, said Brynjolfsson and McAfee. To explain the growing power of computers, they turn to an apocryphal tale of a Chinese emperor who decided to reward the inventor of the new game of chess. Allowed to choose his own reward, the inventor asked for a single grain of rice on the first square of a chessboard, with the amount doubling on each successive square.

After 32 squares, the amount equaled the harvest from one large rice paddy. By the end of the board, the pile of rice had grown as large as Mount Everest. (At this point in some versions of the story, the enraged emperor chops off the inventor's head.) Doubling performance every 1.5 years since the invention of the integrated circuit in 1958 filled in the first 32 squares of the chessboard by 2006. At first, each new square yielded incrementally faster computers. The later squares are producing one surprise after another.
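The arithmetic behind the tale is easy to check. Here is a minimal Python sketch; the grains-per-gram figure is an assumed average, used only to give the totals a physical scale:

```python
# Chessboard doubling: one grain on the first square, twice as many on
# each square after that.
GRAINS_PER_GRAM = 50  # assumption: a grain of rice weighs about 0.02 g

total = 0
for square in range(1, 65):
    total += 2 ** (square - 1)
    if square == 32:
        # Half the board: about 4.3 billion grains, roughly 86 tonnes
        print(f"after square 32: {total:.2e} grains, "
              f"~{total / GRAINS_PER_GRAM / 1e6:.0f} tonnes")

# The full board holds 2**64 - 1 grains, about 1.8e19
print(f"after square 64: {total:.2e} grains, "
      f"~{total / GRAINS_PER_GRAM / 1e12:.0f} million tonnes")

# The Moore's Law analogy: doubling every 1.5 years from 1958 reaches
# the 32nd square in 1958 + 32 * 1.5 = 2006
print(1958 + 32 * 1.5)  # 2006.0
```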

Autonomous vehicles are one of them. Only eight years ago, economists Frank Levy and Richard Murnane contrasted the abilities of computers and humans in The New Division of Labor. Humans, they noted, beat machines at pattern recognition tasks whose rules could not be made explicit. An example was driving a truck, which requires reacting in real time to moving objects with variable trajectories.

They assumed it would take decades for computers to master this task. After all, the Defense Advanced Research Projects Agency had just sponsored its first driverless vehicle challenge. The best contender made it only seven miles along the 150-mile route.

Yet three years later, six autonomous cars completed the DARPA Urban Challenge. They followed a 60-mile course through an abandoned military base, obeying traffic laws, stopping for lights and stop signs, and merging with traffic. Then, in 2010, Google revealed that its fleet of autonomous cars had driven 140,000 miles with only minor human intervention (and 1,000 miles without).

Such rapid progress is being made on scores of formerly obstinate problems. Apple's voice recognition software, Siri, for example, understands spoken commands well enough to run iPhone applications by voice alone. Lionbridge's GeoFluent, co-developed with IBM, translates written words fast and accurately enough for companies to use it to provide online service for customers speaking other languages.

Across the Internet, applications analyze and summarize news. At law firms, automated search software unearths paper trails among hundreds of thousands of legal documents. Researchers are getting closer to software that can diagnose tumors and other medical conditions as well as experienced clinicians can.

Such applications are likely to expand. Computers and communications are a general-purpose technology which, like steam power, electricity, and the internal combustion engine, finds many uses throughout the economy.

In fact, most economists believe the widespread introduction of information technology was responsible for the 2.5 percent per year growth in productivity over the past decade. This was far faster than during the 1970s and 1980s, and higher even than the 1990s. Moreover, conventional measures of productivity do not begin to capture the 24/7 convenience of ATMs, online banking, or such free services as Facebook, Wikipedia, and YouTube.
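Compounding makes even that modest-sounding annual figure add up. A quick check of what 2.5 percent a year implies over a decade (the arithmetic only, not a figure from the article):

```python
# Cumulative effect of 2.5 percent annual productivity growth over a decade
rate, years = 0.025, 10
gain = (1 + rate) ** years - 1
print(f"cumulative gain after {years} years: {gain:.1%}")  # ~28.0%
```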

Productivity growth clearly shows there is no stagnation in technological progress, economists assert. Yet in the past, this would have generated profits, business expansion, and job creation. Profits set all-time records in 2011, but hiring has stalled. The MIT researchers believe this is a sign of deeper structural changes in the economy.

Winners and Losers

These changes show up in uneven income distribution. Although GDP and total income have grown steadily, growth in the inflation-adjusted median income of working-age households has been slowing since the 1970s. During the past decade, that median fell to $55,821, from $60,746, the first time it has ever declined over a 10-year span.

The researchers looked at median income because it is the income level that divides the population in half, with half of American families making more and half making less. It has declined because increases in wealth have gone to those at the top.
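The distinction matters because the mean and the median can move in opposite directions. The toy example below uses invented household incomes to show how total (and therefore average) income can rise while the median falls when the gains accrue at the top:

```python
# Illustrative only: hypothetical household incomes, in thousands of dollars
from statistics import mean, median

before = [30, 45, 55, 70, 300]
after = [28, 43, 53, 68, 450]  # the bottom slips a little, the top jumps

print(mean(before), median(before))  # 100.0 55
print(mean(after), median(after))    # 128.4 53
```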

The same concentration of wealth at the top shows up in studies by economist Ed Wolff of family assets. He found that from 1983 to 2009, more than 100 percent of the increase in total U.S. wealth went to the top 20 percent of households. The top 5 percent accounted for more than 80 percent of the increase, and the top 1 percent, more than 40 percent. The lower 80 percent saw a net decrease in wealth over the same period.

Brynjolfsson and McAfee explain the disproportion by considering three sets of winners and losers in today's technology-driven economy. The winners, they assert, have learned to race with machines, rather than against them.

First, they looked at high-skilled and low-skilled workers. They note that robots, numerically controlled machines, computerized supply chain software, automatic transcription, and other technologies are increasingly handling routine and repetitive tasks. As knowledge, such as how to cut a piece of metal or manage a warehouse, is embedded in a machine, the machine eliminates workers or reduces the amount of skill (and pay) needed to fill a job.

"Meanwhile," they wrote, "other technologies like data analytics, high-speed communications, and rapid prototyping have augmented the contributions of more and data-driven reasoning, increasing the value of those jobs." The second group of winners is the superstars. Compared everyone else, they receive a giant share of the rewards. In business, this includes CEOs, key executives, and top professionals. The reason is simple: "Technology increasingly captures business processes, and so the talents, insights, and of an individual can dominate a national or even market," they wrote. Companies that are good but not great are crowded out of their markets.

They point to the CVS retail pharmacy chain and its computerized prescription drug ordering process as an example. Every time CVS improves this system, the results show up in 4,000 stores across the nation, magnifying its value. The value of executives who consistently make those improvements or reach the best business decisions is similarly magnified. No wonder Brynjolfsson, in previous research, linked the rise in CEO pay (300 times that of the average worker in 2005, up from 70 to one in 1990) to greater use of information technology.

A similar division exists between capital and labor. The relative bargaining position of the two depends on their relative contribution to production. When labor is scarce, goes the theory, workers can demand higher pay. As technology replaces labor, though, the contribution of capital increases.

This is exactly what happened. According to data from the U.S. Commerce Department, corporate profits were 23.8 percent of total corporate income in 2010, a full percentage point higher than the previous record, set in 1951. Meanwhile, labor's share of GDP was essentially flat between 1974 and 1983 and has been falling ever since.

Brynjolfsson and McAfee contend that concentrating wealth in fewer families could dampen overall economic demand, since the affluent save more and may opt for greater leisure. They also note that technological unemployment could be with us for a long time to come.

Other economists have countered such claims. They assert that the wealthy invest in new technologies that improve productivity and drive down costs, making an affluent lifestyle more affordable for everyone. They also note that in the past, technology has created more jobs than it has eliminated.

That was true in the past, the authors respond. Over time, a well-functioning economy should adjust to technological unemployment and the reallocation of income. It happened before. In 1800, 90 percent of Americans worked on farms. That fell to 41 percent by 1900 and to just 2 percent by 2000. As workers left farms, emerging industries put them to work.

Yet America had more than 100 years to adjust to agricultural mechanization. Digital technology works faster. "Accelerating technology in the past decade has disrupted not just one sector but virtually all of them," the MIT professors wrote.

At first, those changes looked temporary. During the "jobless recovery" that followed the 2001 recession, good jobs were harder to come by and median wages fell. Yet many families believed these problems were temporary, and borrowed to maintain their lifestyles. This kept consumption rising even as the median wage fell.

Over time, the economy might have adjusted. Families might have reduced their borrowing, and businesses might have adjusted to a shift in income to more affluent savers by gradually automating to save money. Instead, the Great Recession hit, compressing what could have been a decade or two of adjustment into months.

Companies unloaded workers and used the savings to invest in technology. The compressed time frame made it harder for workers "to change their skills, entrepreneurs to invent new business models, and managers to make the necessary adjustments equally quickly," Brynjolfsson and McAfee wrote.

Moving Forward

Their analysis explains why economics is often called the dismal science. Yet not every economist buys it. Dean Baker, co-director of the Center for Economic and Policy Research, noted that the same argument was made in the 1950s and 1960s. "Unemployment is due to bad economic policy," he said. "We shouldn't blame machines." Businesses, he explained, will not hire additional workers until they see greater demand, and capacity utilization in many industries remains below 70 percent, which certainly indicates weak demand. He favors greater stimulus to spark it. Companies have boosted profits by keeping wages down in a slack economy at a time when unions are especially weak. He also argues that productivity growth over the last decade has not been as fast as in the period from 1947 to 1973, when less-skilled jobs were also being eliminated.

Brynjolfsson and McAfee do not disagree that a weak economy has purged jobs and kept wages low. They argue that technology has helped companies boost profits despite weak demand. Moreover, it has done so in ways that have reduced the need to hire as many workers as the economy rebounds.

Even so, they see some sunlight through the clouds. They believe the economy can win when people learn to race with rather than against the machine.

Another example from the chess world shows what they mean. In 1997, IBM's Deep Blue supercomputer beat the world's best chess player, Garry Kasparov. Since then, it has been clear that computers can beat humans. Yet the best chess players today are not computers. Nor are they humans. They are teams of humans using computers.

The winner of one recent competition was a pair of amateurs using three computers. As Kasparov noted in his analysis, "Their skill at manipulating and 'coaching' their computers to look very deeply into positions effectively counteracted the superior chess understanding of their grandmaster opponents and the greater computational power of other participants." In other words, good process overcame raw computational strength.

Brynjolfsson and McAfee believe that this is a blueprint for economic success. Computers excel at repetitive routines, executing them without error, and they are improving at pattern recognition and complex communication. Humans have the intuition and creativity computers lack; they can define their own tasks and work outside predefined domains. Joined by the right processes, people and machines could forge beautiful partnerships.

What will those relationships look like? The authors are not sure. But they do know that digital technology lets users combine and recombine existing innovations in an enormous number of new ways. Somewhere among those variations are the new processes, organizational structures, and business models that will enable people to thrive best with machines.

They note that 600,000 entrepreneurs make a living by selling online through eBay and Amazon. Tens of thousands of others have piggybacked on existing technologies to sell applications for Facebook, Twitter, and smartphones.

Yet not everyone is going to go to college, develop software skills, or become an entrepreneur. So where will the jobs come from? McAfee has no pat response. "We need an answer that doesn't rely on MIT and other elite institutions," he said. "We need to let the job creators figure it out." Perhaps that is why Race Against The Machine's action agenda sounds so much like other societal fixes suggested by engineers, scientists, and professionals. It includes improved education, better support for entrepreneurs, more investment in infrastructure and research, and changes in laws, regulations, and taxes. McAfee believes that with some encouragement, entrepreneurs will find new ways to make money and employ people.

Will it be enough? "We're still in the early stages of a transition," McAfee said, "and it is unclear how it will all play out."

Alan S. Brown is associate editor of Mechanical Engineering magazine.

(c) 2012 American Society of Mechanical Engineers
