🤔 The flying cars we never got: Are we wrong about what caused the Great Stagnation?
Maybe the problem isn't that good ideas became harder to find — but that we got worse at turning them into useful innovations
😎 I’ll keep this short and sweet. This is the last day of my End of Summer sale! All my wonderful free subscribers can become paid subscribers at a 25 percent discount from the regular annual price of $70 a year. An annual subscription works out to about $52 a year, or $4.38 a month. (Or about $5 and a bit if you pay month-to-month.)
Click the big blue buttons for more details!
And for that value, you get all essays, Q&As, and podcasts (with transcripts) that put forward an ambitious-yet-practical vision of how to discover, create, and invent a better America and world. I hope you think Faster, Please! is worth it. I sure love bringing it to you.
Melior Mundus
What caused the Great Stagnation, the slowing of measured US productivity and economic growth since the early 1970s? The explanation is almost certainly multicausal. But not all causes are equally important, and one thesis seems to have particularly strong explanatory power: We squeezed all the juice from the great technological ideas of the past, and finding new ones is getting harder and harder (requiring more researchers and resources). All the low-hanging fruit has been picked.
That’s my quick and dirty summary and synthesis of two scholarly efforts. First: In his remarkable 2016 book The Rise and Fall of American Growth, Northwestern University economist Robert Gordon argues that a cluster of great inventions — including electrification (both electric motors and lights), the internal combustion engine, and the industrial processes of molecular rearrangement that gave us chemicals, plastics, and pharmaceuticals — created a “special century” of rapid productivity growth from roughly 1870 to 1970 (especially robust from 1920 through 1970).
But by the time Americans walked on the Moon, the great inventions had all been largely implemented with productivity gains harvested and spread throughout the economy. And the IT revolution, while important, has not yet been as important as that historic cluster of one-off great inventions. As such, he concludes “economic growth was a one-time-only event.” (My apologies to our future AI overlords and their warp-speed growth economy.)
Second: In the 2017 (and since updated) paper, “Are Ideas Getting Harder to Find?” a team of economists (Nicholas Bloom, Chad Jones, John Van Reenen, and Michael Webb) present evidence that although our societal research effort is rising substantially, “research productivity is declining sharply.” Just take a look at Moore’s Law: “The number of researchers required today to achieve the famous doubling of computer chip density is more than 18 times larger than the number required in the early 1970s. More generally, everywhere we look we find that ideas, and the exponential growth they imply, are getting harder to find.” So we better ramp up those STEM classes and R&D dollars.
What both lines of scholarship importantly have in common is a focus on economic growth as the result of idea generation. Good ideas are not enough, though. They must be entrepreneurially processed into new products and techniques. Economist Martin Weitzman explained in 1998 that “the ultimate limits to growth lie not so much in our ability to generate new ideas as in our ability to process an abundance of potentially new ideas into usable form.”
In other words, an important idea is not immediately an important innovation. Alexander Fleming discovered penicillin in 1928, but it then took two decades of development by pharmaceutical firms before becoming available to the US public in 1945. (This is hardly uncommon and is why it’s kind of amazing that just a decade after the CRISPR breakthrough, drugs are being readied for market.) This development and commercialization process is a critical step for turning a great idea into something that boosts total factor productivity growth.
Econ 101 interregnum: TFP is that bit of economic growth that cannot be explained by inputs such as the number of hours worked or the amount of capital used. Presumably, it reflects advances in production technologies and processes. Gains in TFP have accounted “for well over half the growth in measured U.S. labor productivity (output per hour of work) over the past century,” according to the CBO.
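That residual logic can be made concrete with a toy growth-accounting calculation. This is a minimal sketch with made-up numbers (not actual US data), using a standard Cobb-Douglas capital share of roughly one-third:

```python
# Toy Solow growth-accounting sketch. Illustrative numbers only, not real US data.
# TFP growth is the "residual": output growth not explained by capital or labor inputs.

def tfp_growth(output_g, capital_g, labor_g, capital_share=0.35):
    """Solow residual: g_TFP = g_Y - alpha * g_K - (1 - alpha) * g_L.
    All growth rates in percentage points; capital_share is a stylized alpha."""
    return output_g - capital_share * capital_g - (1 - capital_share) * labor_g

# Hypothetical year: output grows 3%, capital stock 4%, hours worked 1%.
residual = tfp_growth(3.0, 4.0, 1.0)
print(f"TFP growth: {residual:.2f}%")  # → TFP growth: 0.95%
```

The point of the exercise: even with healthy input growth, the part of output growth left over after accounting for capital and labor is what TFP captures — and it is that leftover that collapsed after 1970.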
Yet as economists Kevin R. James (LSE, Systemic Risk Centre), Akshay Kotak (LSE, Systemic Risk Centre), and Dimitrios P. Tsomocos (University of Oxford) observe in the new paper “Ideas, Idea Processing, and TFP Growth in the US: 1899 to 2019” (which uses the penicillin example), much of the economic literature about economic growth and the Great Stagnation — with the notable exception of Weitzman, they add — “essentially ignores idea processing altogether.” The researchers, JKT from here on, conclude this neglect is an important oversight. First, the paper’s summary and then what it means:
Innovativity—an economy’s ability to produce the innovations that drive total factor productivity (TFP) growth—requires both ideas and the ability to process those ideas into new products and/or techniques. We model innovativity as a function of endogenous idea processing capability subject to an exogenous idea supply constraint and derive an empirical measure of innovativity that is independent of the TFP data itself. Using exogenous shocks and theoretical restrictions, we establish that: i) innovativity predicts the evolution of average TFP growth; ii) idea processing capability is the binding constraint on innovativity; and iii) average TFP growth declined after 1970 due to constraints on idea processing capability, not idea supply.
This notion of “innovativity” is key, as is the notion that a decline in idea processing capability — not necessarily a shortage of good ideas — is the culprit in the “disastrous fall” of average annual US TFP growth from 1.9 percent in the 1950s and 1960s to 0.8 percent over the 1980-2019 period. And as they add in a commentary on their paper: “If idea supply isn’t the problem, then increasing idea supply by rounding up the usual suspects of higher R&D spending and easing immigration restrictions for STEM workers (etc.) isn’t the solution.”
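The binding-constraint logic here is worth spelling out. A drastically simplified sketch of my own (not JKT's actual model): if innovativity is limited by whichever is scarcer — idea supply or processing capability — then expanding the abundant input does nothing.

```python
# Toy binding-constraint illustration (my simplification, not JKT's formal model).
# Innovativity is capped by whichever is scarcer: idea supply or processing capability.

def innovativity(idea_supply, processing_capability):
    return min(idea_supply, processing_capability)

# Hypothetical units: processing capability (0.8) sits below idea supply (1.9).
base = innovativity(1.9, 0.8)
more_ideas = innovativity(3.8, 0.8)        # double the idea supply: no change
better_processing = innovativity(1.9, 1.6)  # improve processing instead: big gain
print(base, more_ideas, better_processing)  # → 0.8 0.8 1.6
```

That is the intuition behind their claim that piling on R&D spending attacks the wrong constraint.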
So, a big claim. How do they arrive at it? How do they calculate innovativity? These are the key insights:
Longtermism is good. An economy’s innovativity is stronger when more firms choose long-horizon innovation strategies.
Firms will choose such long-term strategies if rewarded by financial markets. This means, as explained by JKT, that “firms can credibly engage with key stakeholders to pursue long-horizon strategies. [But when] markets work poorly, firms lack that ability and so pursue quick-win strategies that demand less commitment from key stakeholders instead.”
Depending on financial market effectiveness, different periods of US history can be divided into High and Low states. JKT: “The financial market reforms of the 1930s/1940s significantly improved financial market effectiveness relative to the de facto unregulated Pre-Reform period, shifting innovativity from its Low state into the High state for 1951/1969. We find that financial market effectiveness declined after the 1960s, pushing innovativity back to its Pre-Reform Low state for the 1980/2019 period. As predicted, average TFP growth tracks innovativity.”
Accelerating TFP growth may require a different set of tools. The bad news here is really kind of good news, as they see it. Fixing our faulty idea processing might be a lot cheaper than fixing our idea generation. They note that doubling R&D intensity from current levels, as suggested by “Are Ideas Getting Harder to Find?,” would cost over $600 billion per year in the US — and that would merely maintain current TFP growth. Helping avoid a “dismal neo-Malthusian future” by improving regulation would certainly cost a lot less and might prove far more effective.
And now a few thoughts and observations by me:
Thinking about TFP as a product of both idea generation and idea processing is undoubtedly helpful. It takes into account the key role of the entrepreneur and firm, as well as the difference between scientific discovery, technological invention, and commercial innovation. It sure seems as if many Washington policymakers do not.
Also helpful: Thinking about how regulation can potentially disrupt idea processing capability. Talk to entrepreneurs and investors who are dealing with atoms rather than bits — say in the emerging energy and biotech sectors — and the role of regulation looms large. The focus here, of course, is on financial regulation. (A brief email exchange with James suggests the researchers see two areas of possible reform as high frequency trading and index investing.) But I wonder — and likely so do many of you subscribers — about the role of environmental and other anti-build regulations, as well as financial ones. I would also imagine the venture capitalists might have their own interesting ideas for financial regulatory reform.
I still favor greater R&D funding, as well as “metascience” reform that addresses funding incentives that result in less efficient and less venturesome research. Indeed, there is a processing-capability issue there as well. But the paper reminds us of the need for a broad-based approach when thinking about moving from the Great Stagnation to a possible Great Acceleration.
Micro Reads
▶ Why even environmentalists are supporting nuclear power today - Uri Berliner, NPR | The turnabout on Diablo Canyon is noteworthy because California is the birthplace of America's anti-nuclear movement. The case against nuclear power stems primarily from fears about nuclear waste and potential accidents as well as its association with nuclear weapons. The two operating generators at Diablo Canyon had been set to shut down by 2025. And for years the momentum to shutter the plant seemed inevitable, with anti-nuclear sentiment in California remaining high. Even the utility that operates Diablo Canyon, PG&E, wanted to pull the plug. So it is striking that the most vehement arguments to keep Diablo Canyon running haven't come from the nuclear industry. Instead, they have been put forward by a most unlikely collection of pro-nuclear advocates. It seemed quixotic, even hopeless, in 2016, when Shellenberger along with the pioneering climate scientist James Hansen and Stewart Brand, founder of the crunchy Whole Earth Catalog, began advocating to save Diablo Canyon.
▶ U.S. life expectancy drops sharply, the second consecutive decline - Kate Sheridan, STAT | This year’s life expectancy figure is 0.9 years lower than last year’s. Covid-19 accounted for about half of the decline, and a category encompassing accidents and unintentional injuries is responsible for another 16%. That category includes overdoses; in fact, about half of the unintentional injury deaths in this analysis were due to overdoses. … Although deaths from heart disease were the third biggest contributor to the decline in life expectancy, the number of people dying from this condition actually decreased.
▶ The first private mission to Venus will have just five minutes to hunt for life - Jonathan O'Callaghan, MIT Tech Review | Rocket Lab has developed a small multipurpose spacecraft called Photon, the size of a dining table, that can be sent to multiple locations in the solar system. A mission to the moon for NASA was launched in June. For this Venus mission, another Photon spacecraft will be used to throw a small probe into the planet’s atmosphere. That probe is currently being developed by a team of fewer than 30 people, led by Sara Seager at MIT. Launching as soon as May 2023, it should take five months to reach Venus, arriving in October 2023. At less than $10 million, the mission—funded by Rocket Lab, MIT, and undisclosed philanthropists—is high risk but low cost, just 2% of the price for each of NASA’s Venus missions. “This is the simplest, cheapest, and best thing you could do to try and make a great discovery,” says Seager.