Threat or Opportunity?

Keeping Perspective and Redefining Work in the Age of Advanced Automation
Written by Ian McCausland

Recently, the specter of automation has cast a shadow over the manufacturing industry. Alongside gains in productivity, automation has produced widespread fear of job losses and mass unemployment. Statistical evidence based on historical trends in automation reveals a much less alarmist picture, yet advances in the area broadly defined as Artificial Intelligence (AI) are seized upon as evidence that, in the near future, human labour will be made obsolete or, worse, that AI will be the catalyst of humanity’s destruction.
Yet as research into Artificial Intelligence has accelerated in recent years, evidence has also mounted that fully autonomous automation, whether in the form of self-driving cars, generalized artificial intelligence that performs at the cognitive level of humans, or factories that run without human intervention, is in fact a very long way off. In this context, the future of automation presents an opportunity to redefine work: to automate physical or cognitive tasks that are routine, tedious, dangerous, or dull, and to put humans to work in roles that are better suited to their capabilities and skills.

Historically, automation’s impact on employment has never been directly causal: not only are there other factors at play, but job loss in one area may create jobs in others. As far back as 1994, an OECD study observed that technology “both eliminates and creates jobs,” generally destroying “lower wage, lower productivity jobs, while… creat[ing] jobs that are more productive, high-skill and better paid.” More recent studies corroborate this claim, such as a 2017 Canadian study by the C.D. Howe Institute that examined robot density in manufacturing in countries around the world. Its authors hypothesized that if automation were responsible for destroying jobs in the industry, high robot densities should correlate with job losses; their research, however, uncovered no such correlation. They concluded that technological change “does not inevitably lead to a reduction in human labour” because automation either “complements human labour,” which generally “increases productivity” and “should be reflected in higher wages and overall economic gains,” or takes over “specific aspects of a job, rather than replacing it entirely.”
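The logic of the study’s test can be sketched in a few lines of code. The figures below are hypothetical, invented purely for illustration, and the snippet is not the C.D. Howe Institute’s data or methodology; it simply shows what testing for a correlation between robot density and employment change looks like as a calculation.

```python
# A minimal sketch, assuming hypothetical figures (not the study's real data).
# If automation alone destroyed jobs, countries with higher robot density
# should show systematically larger employment declines, i.e. a strong
# negative correlation between the two series.
from statistics import correlation  # available in Python 3.10+

robot_density = [531, 301, 189, 176, 157, 71]               # robots per 10,000 workers (made up)
employment_change_pct = [0.5, -1.0, 0.8, -0.6, 1.1, -0.2]   # % change in jobs (made up)

r = correlation(robot_density, employment_change_pct)
print(f"Pearson correlation: {r:.2f}")  # ~0.04 here: essentially no relationship
```

A coefficient close to zero, which is what the study reports finding in real-world data, is evidence against the claim that robot adoption by itself drives job losses.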

Susan Houseman, Senior Economist at the Upjohn Institute for Employment Research, has argued that manufacturing job losses in the United States have more to do with globalization – outsourcing production overseas – than with automation. She bases her conclusion on the way statistics on U.S. manufacturing are collected and interpreted: they overemphasize the manufacture of computers and semiconductors, industries that are not only already heavily automated but that also traditionally rely on outsourced labour, and consequently paint a misleading picture of output growth in the sector.

The problem is that the computer industry doesn’t fit the usual interpretation of productivity growth, which tends to indicate that “workers are working faster or that automation… is driving the growth” – in other words, that productivity is directly proportional to physical output. Instead, as Houseman explains, productivity growth in computer-related industries “primarily reflect[s] innovations from research and development and innovations in the production process.” The same amount of labour can produce more value because the product is better and more advanced, and consumers are therefore willing to pay more for it.

To illustrate how this works in the computer industry, one need look no further than the price of Apple’s recent iPhone X (not manufactured in the U.S., of course, but that is beside the point of this example): workers are producing more value, measured as output, not because they are working any faster or are otherwise more productive in the traditional sense of producing more units with the same amount of labour, but because one iPhone X is worth roughly 1.4 times the price of the previous-generation iPhone. This evidence suggests, according to Houseman, that while manufacturing in the U.S. is declining, the decline has little to do with automation as such.
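Houseman’s distinction can be made concrete with some rough arithmetic. The labour and unit figures below are hypothetical; only the roughly 1.4-times price ratio comes from the example above.

```python
# A back-of-the-envelope sketch with hypothetical numbers: "productivity" here
# is simply the dollar value of output per hour of labour.
hours_of_labour = 100          # same labour input in both periods (assumed)
units_produced = 50            # same physical output in both periods (assumed)

old_price = 700.0              # illustrative previous-generation price
new_price = old_price * 1.4    # the roughly 1.4x ratio cited above

old_productivity = units_produced * old_price / hours_of_labour   # 350.0 per hour
new_productivity = units_produced * new_price / hours_of_labour   # 490.0 per hour

print(old_productivity, new_productivity)
# Measured productivity rises 40% even though nobody worked faster and no more
# units were made: the product is simply worth more to buyers.
```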

Advances in automation, then, may even help to reinvigorate the industry. Consider, for example, Foxconn’s plan, announced in July of 2017, to build a $10 billion LCD display panel facility in Wisconsin. Two months earlier, in May 2017, Apple, Inc. announced the creation of a $1 billion fund to invest in “advanced manufacturing” in the U.S., going on to invest $200 million of this fund in Corning, Inc., whose facility in Harrodsburg, Kentucky manufactures the glass used for the front (and, in newer models, the back) of the iPhone.

Given that these manufacturing plans are still in the early stages, it’s difficult to know what kind of impact they will have on the U.S. economy as a whole – in 2013, Foxconn announced and then scrapped a plan to open manufacturing facilities in the U.S., so an announcement is no guarantee that a facility will actually be built. Still, it’s worth considering whether advances in automation have helped lay the foundation for this interest in bringing manufacturing facilities (back) to North America, particularly in industries generally considered to rely on a large pool of cheaper labour in countries like China.

Here, a question arises: given the explosive advances in Artificial Intelligence since the beginning of the decade, should we not expect a disruption from new forms of automation that break with the above-mentioned trend, combining advanced artificial intelligence with robotics? Are we facing what Klaus Schwab, founder and executive chairman of the World Economic Forum, calls, in his book of the same name, the fourth industrial revolution, characterized by widespread (internet) connectivity, big data, and artificial intelligence? These technological advances are assumed to enable the automation of cognitive tasks in addition to physical ones.

There is reason to suppose, however, that this trend will continue. In the first place, the field that predominates in artificial intelligence – so much so that it tends to be synonymous with it – is machine learning (ML), which allows computers to isolate and extract patterns within data and to make predictions based on those patterns without being explicitly programmed through typical, procedural instructions. ML powers the voice assistant in your smartphone (natural language processing), image recognition, search engines, and recommendations on services such as Spotify, Netflix, and Amazon.

Fundamentally, machine learning is a form of statistical analysis, and, all other things being equal, it is only as good as the data it has at its disposal. It is no coincidence, then, that ML has developed alongside the massive accumulation of data made possible by the current state of internet connectivity, known colloquially as Big Data; ML is extremely useful for sifting through this data to glean patterns and trends. But it also requires, particularly in the context of supervised learning, training data that reveals the salient features it should be looking for. In other words, implicit in this relationship between machine learning and its data is the need for human intervention.
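A toy supervised-learning example makes the point about human intervention concrete. The spam-filter framing and the use of the scikit-learn library are illustrative choices, not anything discussed above; what matters is that humans supply the labelled examples from which the model extracts its patterns.

```python
# A minimal supervised-learning sketch: no hand-written rules, only patterns
# learned from examples that a human has already labelled.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# The human intervention: people provide both the examples and the labels
# (1 = spam, 0 = not spam).
messages = [
    "win a free prize now",
    "meeting moved to 3pm",
    "claim your free gift card",
    "lunch tomorrow?",
]
labels = [1, 0, 1, 0]

# The model extracts patterns (here, simple word counts) from the labelled data...
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(messages)
model = MultinomialNB().fit(X, labels)

# ...and then predicts labels for messages it has never seen.
new_message = vectorizer.transform(["free prize waiting for you"])
print(model.predict(new_message))  # [1]: classified as spam
```

Swap out or mislabel the training examples and the predictions change accordingly, which is precisely why the quality of the data, and of the human labelling behind it, matters so much.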

Rodney Brooks, former director of the MIT Computer Science and Artificial Intelligence Laboratory and founder and CTO of Rethink Robotics, has recently written a number of essays that temper the current hype around the field of AI, whose perceived rapid progress is assumed to put it on a realistic path toward surpassing human intelligence. His company builds industrial robots that can be ‘trained’ and operated by ordinary people through a combination of hardware and intuitive software. The model of interest here, called Baxter, is something like the iPhone of industrial robots. In a 2013 TED Talk titled “Why We Will Rely On Robots,” Brooks tells the story of Mildred, a long-time factory worker who was able to program and work alongside Baxter within an hour.

Brooks’ vision for robots like Baxter is for factory workers to move from working on the assembly line to becoming robot trainers. A worker like Mildred has deep knowledge of what kinds of actions need to be performed on the assembly line, and Baxter’s user interface facilitates the transfer of this knowledge from worker to robot, not in order to replace her but to take over the more repetitive and tedious aspects of the job. By considering the human context in which the robot will be embedded, Brooks’ creation lays the foundation for a redefinition of work rather than an outright replacement of it, and represents an engineering solution guided by a realistic understanding of the capabilities of AI.

In an October 2017 article for MIT’s Technology Review, “The Seven Deadly Sins of AI Predictions,” Brooks refers to Amara’s Law (coined by Roy Amara, an American researcher and scientist who co-founded the Institute for the Future in Palo Alto): “We tend to overestimate the effect of technology in the short run, and underestimate the effect in the long run.” Brooks illustrates this ‘law’ with GPS, which began in 1978 as a U.S. military project intended to allow the precise delivery of munitions. The system didn’t make good on that original purpose until Operation Desert Storm in 1991, yet it is used today in ways no one would have imagined from the vantage point of the ’80s and early ’90s, when the technology was in its infancy.

We are currently witnessing this “overestimation” of advances in the field of machine learning, so much so that it is difficult, at times, to see the forest for the trees. Rather than getting too far ahead of ourselves, it is worth considering instead how automation can be used to facilitate greater productivity and efficiency by working in collaboration, rather than in competition, with humans. Playing to the strengths of technology and understanding its limits allows for more effective uses of automated and human labour alike. Above all, though, it is worth remembering that while particular jobs may come and go as tools become more sophisticated and refined, there is still work to be done.
