Here is an excerpt from an article written by Alex Singla, Alexander Sukharevsky, Elia Berteletti, Lareina Yee, and Michael Chui for McKinsey Quarterly.
* * *
Innovation has driven the extraordinary progress from which humankind has benefited over the past two centuries, but it faces a largely hidden threat: Innovation is becoming harder and more expensive.
It’s instructive here to take the long view. For most of recorded human history, improvements in human welfare from generation to generation have been limited. Take, for example, GDP per capita as a measure of economic prosperity. For most of human history, roughly until the early 1800s, the measure barely moved, hovering around $1,200. But since that time, it has grown more than 14-fold (Exhibit 1). Human health has followed a similar trajectory: stagnant for centuries, improving significantly only in recent generations. In 1900, for example, the average life expectancy of a newborn was 32 years. By 2021, it had more than doubled, to 71 years.
These and many other improvements in our lives have been driven by a set of scientific discoveries and the products engineered from those breakthroughs. These innovations have enabled economies to grow and people’s lives to improve. The steam engine helped power the Industrial Revolution. Vaccines that prevent diseases such as smallpox, measles, and polio continue to save millions of lives each year; infant mortality is estimated to have decreased 40 percent in the past 50 years because of vaccines.3 The invention of the integrated circuit for computing and of lasers for communication through fiber-optic cables helped create the global internet.
But the rate of progress enabled by innovation now faces an under-recognized threat: Innovation is getting more difficult and more expensive.
Even as science advances, R&D productivity is on the wane
By many metrics, and in many fields, each dollar spent on R&D has been buying less innovation over time. In other words, R&D productivity has been declining.
Take the semiconductor industry. With integrated circuits embedded in products that support nearly every part of our lives, this sector has advanced in accordance with “Moore’s Law”—the remarkable observation put forward by Intel cofounder Gordon Moore that the number of transistors on an integrated circuit will double about every two years.4 This is roughly equivalent to an exponential growth rate of 35 percent annually in transistors per dollar.
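The two figures line up under continuous compounding; here is a minimal worked conversion (assuming, as the sentence implies, that the cost of a chip stays roughly flat, so that growth in transistor counts carries over to transistors per dollar). A quantity that doubles every $T$ years grows at a continuous annual rate of $\ln 2 / T$, so

$$g \;=\; \frac{\ln 2}{T_{\text{double}}} \;=\; \frac{\ln 2}{2} \;\approx\; 0.347 \;\approx\; 35\%\ \text{per year.}$$

On a discretely compounded basis, the same doubling time would read as $\sqrt{2} - 1 \approx 41$ percent per year; the 35 percent figure is the continuous (logarithmic) rate.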
But this level of performance increase has been bought at the cost of ever-greater R&D expenditures. In a 2020 paper, Nicholas Bloom, an economics professor at Stanford University, and his research collaborators examined the real (inflation-adjusted) R&D expenditures of semiconductor companies and equipment manufacturers and estimated that annual research effort rose by a factor of 18 between 1971 and 2014.5 In other words, maintaining the performance growth rate of Moore’s Law required 18 times more inflation-adjusted R&D spending in 2014 than it did in 1971 (Exhibit 2).
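A back-of-the-envelope check of that estimate, as a sketch rather than a figure from the paper: an 18-fold rise in research effort over the 43 years from 1971 to 2014 corresponds to an annual growth rate of

$$18^{1/43} - 1 \;\approx\; 0.07 \;\approx\; 7\%\ \text{per year.}$$

If the output of that effort, transistor density growth, held steady at roughly 35 percent per year while the input grew 18-fold, then research productivity, defined as output growth per unit of research effort, fell by that same factor of 18 over the period, or roughly 7 percent per year.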
* * *
This article is a collaborative effort by Alex Singla, Alexander Sukharevsky, Elia Berteletti, Lareina Yee, and Michael Chui, representing views from QuantumBlack, AI by McKinsey, and McKinsey’s Operations Practice.