
                                          (1)             (2)              (3)
                                          Labor           Total factor     Utilization-adjusted
                                          productivity    productivity     productivity
                                          growth          growth           growth

Previous ten-year average LP growth       0.0857
                                          (0.177)
Previous ten-year average TFP growth                      0.136
                                                          (0.158)
Previous ten-year average TFPua growth                                     0.158
                                                                           (0.187)
Constant                                  1.949***        0.911***         0.910***
                                          (0.398)         (0.188)          (0.259)
Observations                              50              50               50
R-squared                                 0.009           0.023            0.030

Note: Standard errors in parentheses.
***Significant at the 1 percent level.
**Significant at the 5 percent level.
*Significant at the 10 percent level.

Table 1.2    Regressions with standard errors clustered by decade

                                          (1)             (2)              (3)
Ten-year average productivity growth      Labor           Total factor     Utilization-adjusted
(SEs clustered by decade)                 productivity    productivity     productivity
                                          growth          growth           growth

Previous ten-year average LP growth       0.0857
                                          (0.284)
Previous ten-year average TFP growth                      0.136
                                                          (0.241)
Previous ten-year average TFPua growth                                     0.158
                                                                           (0.362)
Constant                                  1.949**         0.911**          0.910
                                          (0.682)         (0.310)          (0.524)
Observations                              50              50               50
R-squared                                 0.009           0.023            0.030

Note: Robust standard errors in parentheses.
***Significant at the 1 percent level.
**Significant at the 5 percent level.
*Significant at the 10 percent level.


Fig. 1.5 Labor productivity growth scatterplot

Fig. 1.6 Total factor productivity growth scatterplot

faster growth in the following decade. In the TFP growth regression, the R² is 0.023, and again the coefficient on the previous period's growth is insignificant. Similar patterns hold in the utilization-adjusted TFP regression (R² of 0.03). The lack of explanatory power of past productivity growth is also apparent in the scatterplots (see figures 1.5, 1.6, and 1.7).
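For readers who want to see the mechanics, the sketch below shows one way a regression of this kind might be set up. It is not the authors' code: the annual growth series, the construction of the trailing ten-year averages, and the year-indexed input are our own assumptions; only the clustering by decade follows table 1.2.

```python
# A minimal sketch (not the authors' code) of a persistence regression like those
# in tables 1.1 and 1.2: regress the ten-year average of productivity growth on
# the previous decade's average, clustering standard errors by calendar decade.
import pandas as pd
import statsmodels.api as sm

def decade_persistence_regression(growth):
    """`growth` is assumed to be an annual productivity growth series indexed by year."""
    avg10 = growth.rolling(10).mean()      # trailing ten-year average growth
    prev10 = avg10.shift(10)               # the previous decade's average
    df = pd.DataFrame({"avg10": avg10, "prev10": prev10}).dropna()
    X = sm.add_constant(df["prev10"])
    decade = (df.index // 10) * 10         # cluster groups, as in table 1.2
    return sm.OLS(df["avg10"], X).fit(cov_type="cluster", cov_kwds={"groups": decade})

# Hypothetical usage with made-up data:
# growth = pd.Series(values, index=range(1948, 2017))
# print(decade_persistence_regression(growth).summary())
```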

The old adage that “past performance is not predictive of future results” applies well to trying to predict productivity growth in the years to come, especially in periods of a decade or longer. Historical stagnation does not justify forward-looking pessimism.

Fig. 1.7 Utilization-adjusted total factor productivity growth scatterplot

1.5 A Technology-Driven Case for Productivity Optimism

Simply extrapolating recent productivity growth rates forward is not a good way to estimate the next decade's productivity growth. Does that imply we have no hope at all of predicting productivity growth? We don't think so. Instead of relying only on past productivity statistics, we can consider the technological and innovation environment we expect to see in the near future. In particular, we need to study and understand the specific technologies that actually exist and make an assessment of their potential.

One does not have to dig too deeply into the pool of existing technologies or assume incredibly large benefits from any one of them to make a case that existing but still nascent technologies can potentially combine to create noticeable accelerations in aggregate productivity growth. We begin by looking at a few specific examples. We will then make the case that AI is a GPT, with broader implications.

First, let's consider the productivity potential of autonomous vehicles. According to the US Bureau of Labor Statistics (BLS), in 2016 there were 3.5 million people working in private industry as “motor vehicle operators” of one sort or another (this includes truck drivers, taxi drivers, bus drivers, and other similar occupations). Suppose autonomous vehicles were to reduce, over some period, the number of drivers necessary to do the current workload to 1.5 million. We do not think this is a far-fetched scenario given the potential of the technology. Total nonfarm private employment in mid-2016 was 122 million. Therefore, autonomous vehicles would reduce the number of workers necessary to achieve the same output to 120 million. This would result in aggregate labor productivity (calculated using the standard BLS nonfarm private series) increasing by 1.7 percent (122/120 = 1.017). Supposing this transition occurred over ten years, this single technology would provide a direct boost of 0.17 percent to annual productivity growth over that decade.
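As a quick check of the arithmetic above, the short sketch below reproduces the 1.7 percent level gain and the 0.17 percent annual boost from the figures quoted in the text; the code itself is our illustration, not the authors'.

```python
# Back-of-envelope check of the autonomous-vehicle calculation in the text
# (illustrative only; the inputs are the figures quoted above).
employment = 122_000_000        # total nonfarm private employment, mid-2016
drivers_now = 3_500_000         # motor vehicle operators, 2016 (BLS)
drivers_later = 1_500_000       # assumed drivers needed for the same workload

labor_after = employment - (drivers_now - drivers_later)   # 120 million workers
level_gain = employment / labor_after - 1                   # ~0.017, i.e., 1.7 percent
annual_boost = level_gain / 10                               # spread over a decade

print(f"Level gain: {level_gain:.1%}, annual boost over a decade: {annual_boost:.2%}")
```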

This gain is significant, and it does not include many potential productivity gains from complementary changes that could accompany the diffusion of autonomous vehicles. For instance, self-driving cars are a natural complement to transportation-as-a-service rather than individual car ownership. The typical car is currently parked 95 percent of the time, making it readily available for its owner or primary user (Morris 2016). However, in locations with sufficient density, a self-driving car could be summoned on demand. This would make it possible for cars to provide useful transportation services for a larger fraction of the time, reducing capital costs per passenger-mile, even after accounting for increased wear and tear. Thus, in addition to the obvious improvements in labor productivity from replacing drivers, capital productivity would also be significantly improved. Of course, the speed of adoption is important for estimating the impact of these technologies. Levy (2018) is more pessimistic, suggesting that in the near term long-distance truck driver jobs will grow about 2 percent between 2014 and 2024. This is 3 percent less (about 55,000 jobs in that category) than they would have grown without autonomous vehicle technology, and about 3 percent of total employment of long-distance truck drivers.

A second example is call centers. As of 2015, there were about 2.2 million people working in more than 6,800 call centers in the United States, and hundreds of thousands more work as home-based call center agents or in smaller sites.11 Improved voice-recognition systems coupled with intelligent question-answering tools like IBM's Watson might plausibly be able to handle 60–70 percent or more of the calls, especially since, in accordance with the Pareto principle, a large fraction of call volume is due to variants on a small number of basic queries. If AI reduced the number of workers by 60 percent, it would increase US labor productivity by 1 percent, perhaps again spread over ten years. Again, this would likely spur complementary innovations, from shopping recommendation and travel services to legal advice, consulting, and real-time personal coaching. Relatedly, citing advances in AI-assisted customer service, Levy (2018) projects zero growth in customer service representatives from 2014 to 2024 (a difference of 260,000 jobs from BLS projections).
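A similar back-of-envelope check applies to the call-center figure. The sketch below assumes the same 122-million nonfarm private employment base used in the autonomous-vehicle example, which the text does not state explicitly for this case.

```python
# Rough, illustrative check of the call-center figure: removing 60 percent of
# roughly 2.2 million call-center jobs from a 122-million-person nonfarm private
# workforce raises measured labor productivity by about 1 percent.
employment = 122_000_000          # assumed employment base (from the AV example)
call_center_workers = 2_200_000   # call-center employment, circa 2015
automated_share = 0.60            # share of calls assumed handled by AI

labor_after = employment - automated_share * call_center_workers
level_gain = employment / labor_after - 1   # ~0.011, consistent with the text's ~1 percent

print(f"Implied labor productivity gain: {level_gain:.1%}")
```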

Beyond labor savings, advances in AI have the potential to boost total factor productivity. In particular, energy efficiency and materials usage could be improved in many large-scale industrial plants. For instance, a team from Google DeepMind recently trained an ensemble of neural networks to optimize power consumption in a data center. By carefully tracking the data already collected from thousands of sensors tracking temperatures, electricity usage, and pump speeds, the system learned how to make adjustments in the operating parameters. As a result, the AI was able to reduce the amount of energy used for cooling by 40 percent compared to the levels achieved by human experts. The algorithm was a general-purpose framework designed to account for complex dynamics, so it is easy to see how such a system could be applied to other data centers at Google, or indeed, around the world. Overall, data center electricity costs in the United States are about $6 billion per year, including about $2 billion just for cooling.12
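Purely as an illustration, and on the strong assumption (not made in the chapter) that a similar 40 percent cooling reduction could be achieved across all US data centers, the implied saving against the roughly $2 billion annual cooling bill would be on the order of $0.8 billion per year:

```python
# Illustrative back-of-envelope only; the generalization to all US data centers
# is our assumption, not a claim made in the chapter.
cooling_costs_usd = 2_000_000_000   # approximate annual US data center cooling cost
cooling_reduction = 0.40            # reduction DeepMind reported in its own data center

implied_saving = cooling_costs_usd * cooling_reduction   # ~$0.8 billion per year
print(f"Implied annual saving if fully generalized: ${implied_saving / 1e9:.1f} billion")
```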

What's more, similar applications of machine learning could be implemented in a variety of commercial and industrial activities. For instance, manufacturing accounts for about $2.2 trillion of value added each year. Manufacturing companies like GE are already using AI to forecast product demand and future customer maintenance needs, and to analyze performance data coming from sensors on their capital equipment. Recent work on training deep neural network models to perceive objects and achieve sensorimotor control has at the same time yielded robots that can perform a variety of hand-eye coordination tasks (e.g., unscrewing bottle caps and hanging coat hangers; Levine et al. 2016). Liu et al. (2017) trained robots to perform a number of household chores, like sweeping and pouring almonds into a pan, using a technique called imitation learning.13 In this approach, the robot learns to perform a task using a raw video demonstration of what it needs to do. These techniques will surely be important for automating manufacturing processes in the future. The results suggest that artificial intelligence may soon improve productivity in household production tasks as well, which in 2010 were worth as much as $2.5 trillion in nonmarket value added (Bridgman et al. 2012).14

Although these examples are each suggestive of nontrivial productivity gains, they are only a fraction of the set of applications for AI and machine learning that have been identified so far. James Manyika et al. (2017) analyzed 2,000 tasks and estimated that about 45 percent of the activities that people are paid to perform in the US economy could be automated using existing levels of AI and other technologies. They stress that the pace of automation will depend on factors other than technical feasibility, including the costs of automation, regulatory barriers, and social acceptance.

11. https://info.siteselectiongroup.com/blog/how-big-is-the-us-call-center-industry-compared-to-india-and-philippines.

12. According to personal communication, August 24, 2017, with Jon Koomey, Arman Shehabi, and Sarah Smith of Lawrence Berkeley Lab.

13. Videos of these efforts are available here: https://sites.google.com/site/imitationfromobservation/.

14. One factor that might temper the aggregate impact of AI-driven productivity gains is if product demand for the sectors with the largest AI-driven productivity gains is sufficiently inelastic. In this case, these sectors' shares of total expenditure will shrink, shifting activity toward slower-growing sectors and muting aggregate productivity growth à la Baumol and Bowen (1966). It is unclear what the elasticities of demand are for the product classes most likely to be affected by AI.

1.6 Artificial Intelligence Is a General Purpose Technology

As important as specific applications of AI may be, we argue that the more important economic effects of AI, machine learning, and associated new technologies stem from the fact that they embody the characteristics of general purpose technologies (GPTs). Bresnahan and Trajtenberg (1996) argue that a GPT should be pervasive, able to be improved upon over time, and able to spawn complementary innovations.

The steam engine, electricity, the internal combustion engine, and computers are each examples of important general purpose technologies. Each of them increased productivity not only directly, but also by spurring important complementary innovations. For instance, the steam engine not only helped to pump water from coal mines, its most important initial application, but also spurred the invention of more effective factory machinery and new forms of transportation like steamships and railroads. In turn, these coinventions helped give rise to innovations in supply chains and mass marketing, to new organizations with hundreds of thousands of employees, and even to seemingly unrelated innovations like standard time, which was needed to manage railroad schedules.

Artificial intelligence, and in particular machine learning, certainly has the potential to be pervasive, to be improved upon over time, and to spawn complementary innovations, making it a candidate for an important GPT. As noted by Agrawal, Gans, and Goldfarb (2017), the current generation of machine-learning systems is particularly suited for augmenting or automating tasks that involve at least some prediction aspect, broadly defined. These cover a wide range of tasks, occupations, and industries, from driving a car (predicting the correct direction to turn the steering wheel) and diagnosing a disease (predicting its cause) to recommending a product (predicting what the customer will like) and writing a song (predicting which note sequence will be most popular). The core capabilities of perception and cognition addressed by current systems are pervasive, if not indispensable, for many tasks done by humans.

Machine-learning systems are also designed to improve over time. Indeed, what sets them apart from earlier technologies is that they are designed to improve themselves over time. Instead of requiring an inventor or developer to codify, or code, each step of a process to be automated, a machine-learning algorithm can discover on its own a function that connects a set of inputs X to a set of outputs Y, as long as it is given a sufficiently large set of labeled examples mapping some of the inputs to outputs (Brynjolfsson and Mitchell 2017). The improvements reflect not only the discovery of new algorithms and techniques, particularly for deep neural networks, but also their complementarities with vastly more powerful computer hardware and the availability of much larger digital data sets that can be used to train the systems (Brynjolfsson and McAfee 2017). More and more digital data is collected as a byproduct of digitizing operations, customer interactions, communications, and other aspects of our lives, providing fodder for more and better machine-learning applications.15
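The following is a minimal sketch of the supervised-learning setup described in this paragraph: given labeled examples mapping inputs X to outputs Y, the algorithm approximates the mapping itself rather than having a developer code each step. The library (scikit-learn), model choice, and synthetic data are our own illustration and are not drawn from the chapter.

```python
# Minimal supervised-learning sketch: the model recovers an "unknown" function
# from labeled examples alone, without anyone coding the function's steps.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(5_000, 2))            # inputs (synthetic, illustrative)
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2            # the function to be learned from data

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3_000, random_state=0)
model.fit(X_train, y_train)                          # learn the mapping from labeled examples
print(f"Held-out R^2: {model.score(X_test, y_test):.3f}")
```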

Most important, machine-learning systems can spur a variety of complementary innovations. For instance, machine learning has transformed the abilities of machines to perform a number of basic types of perception that enable a broader set of applications. Consider machine vision: the ability to see and recognize objects, to label them in photos, and to interpret video streams. As error rates in identifying pedestrians improve from one per 30 frames to about one per 30 million frames, self-driving cars become increasingly feasible (Brynjolfsson and McAfee 2017).

Improved machine vision also makes practical a variety of factory automation tasks and medical diagnoses. Gill Pratt has made an analogy to the development of vision in animals 500 million years ago, which helped ignite the Cambrian explosion and a burst of new species on earth (Pratt 2015). He also noted that machines have a new capability that no biological species has: the ability to share knowledge and skills almost instantaneously with others. Specifically, the rise of cloud computing has made it significantly easier to scale up new ideas at much lower cost than before. This is an especially important development for advancing the economic impact of machine learning because it enables cloud robotics: the sharing of knowledge among robots. Once a new skill is learned by a machine in one location, it can be replicated to other machines via digital networks. Data as well as skills can be shared, increasing the amount of data that any given machine learner can use.

This in turn increases the rate of improvement. For instance, self-driving cars that encounter an unusual situation can upload that information to a shared platform where enough examples can be aggregated to infer a pat-