I wish the article had addressed the other issue in the area: funding. The US stopped lavishly funding scientific research sometime in the 1970s. Private industry has made up the difference, but private industry wants to focus on immediately usable, profitable research rather than the fundamental work that'll be useful for the next century for all of society.
Is it that surprising that the most cited research papers come from the tail end of the federally funded research era?
That "lavish funding" era grew largely out of WWII, with specific focus on technologies such as radar, fire control (computers), and the Manhattan Project. It was inspired strongly by Vannevar Bush's "Science: The Endless Frontier" (itself something of an HN perennial), and was kicked mightily in the keister by the Sputnik scare and the nuclear / missile arms race of the 1960s.
By a decade later, numerous factors had taken much of the steam out of the sails (to mix metaphors): the Vietnam War, foreign-exchange pressures and major changes in global currency arrangements, and the emergence of domestic peak oil in the US (the lower 48, at least), which ceded control over global petroleum production and prices to the Middle East, with numerous consequences there. At the same time, détente and the opening of China, political scandals (most notably Watergate), and the civil rights and anti-war movements changed attitudes toward government (amongst the Left) and toward academia (amongst the Right). The former is well documented through the general counterculture movement, the latter probably best through the Lewis Powell Memorandum.
At the same time, there was what I'd see as a real decline in the pace of both scientific and technological progress in almost all areas, save information technology and some materials science.
TFA actually focuses fairly narrowly on one element, which is the explosion in publishing. I'll address that in a top-level comment, as I feel it's been overlooked by most other comments.
I'd add a growing awareness of and concern for the environment. The United States had its "Moore's Law" era for nuclear technology for about 20 years after WWII. After that, concern about weapons-test fallout and other environmental releases of radionuclides made experiments much slower and more expensive, to the extent that many ideas never left the drawing board.
There are similar stories with chemical technology, manufacturing, even electricity generation. Fossil fuel depletion is one example of overtaxed sources. Strontium 90 in human teeth, acid rain, phosphate driven algal blooms, etc. are emblematic of overtaxed sinks. The US circa 1960 enjoyed a faster-than-sustainable pace of development (scientific and technological) by borrowing from the future on multiple axes.
I didn't want to head down that rabbit hole, but there are a few lines of argument which lead to the conclusion that the end-stage of most technologies involves both ever-diminishing positive returns and an increased concern in dealing with unintended consequences. I call these "hygiene factors", though environmental concerns would certainly be a prime example.
One framing of this looks at the mechanisms by which technologies achieve results. I've identified nine of these: fuels, materials, energy & power transmission and transformation, technical knowledge ("technology"), causal knowledge ("science"), networks, systems, information, and hygiene. These seem reasonably well-defined.
The area of accelerating rates of returns seems specific to network / dendritic structures (physical, conceptual, or both). Even here, growth ultimately slows, probably best considered governed by a logistic function.
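To make the logistic-function point concrete, here's a minimal sketch (my own illustration, not from the comment above): early in a logistic curve, each step's gain grows roughly exponentially, but as the curve approaches its carrying capacity the per-step gains shrink toward zero.

```python
import math

def logistic(t, K=1.0, r=1.0, t0=0.0):
    """Logistic curve with carrying capacity K, growth rate r, midpoint t0.

    Near-exponential growth early on; saturation toward K later.
    """
    return K / (1.0 + math.exp(-r * (t - t0)))

# Per-step gains early in the curve vs. late in the curve.
early_gain = logistic(1.0) - logistic(0.0)   # near the midpoint: large gains
late_gain = logistic(9.0) - logistic(8.0)    # near saturation: tiny gains

print(early_gain > late_gain)  # accelerating returns give way to diminishing ones
```

The same curve fits both halves of the argument: the "accelerating returns" phase of a dendritic network and the diminishing-returns end stage are two regions of one S-shaped function.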