In the first half of the 20th century, America raced ahead of the rest of the world, riding an unprecedented surge in productivity. Understanding why this happened is one of the most consequential tasks in economics.
The most famous answer to date came from Robert Solow in 1957, who claimed that 7/8ths of US productivity growth from 1909 to 1949 was growth in Total Factor Productivity, a concept typically taken to represent (in a very broad sense) technological change. This has led to the common understanding that technology, and smart ways of using technology, triggered the American century.
This working paper from Nicholas Crafts (2017) offers a more robust update to Solow's work, and to that of the various economists who followed Solow (particularly Kendrick (1961)). If you parse the economese, Crafts's answer is basically as follows:
From 1899 to 1941, America's TFP grew 1.3 percent a year on average, much lower than Kendrick's 1961 estimate of 1.7 percent. The reason for the difference is that Crafts estimates labour quality grew 0.8 percent a year over the period, not the 0.3 percent Kendrick estimated, mainly because Crafts takes into account the improved education level of workers. In other words, an input to production improved quite significantly, in this case the quality of America's workers, and that left less to be explained by the TFP residual (which essentially reflects the effectiveness with which an economy turns its inputs of labour and capital into outputs).
The implication is that TFP does not explain 7/8ths of US productivity growth in the period, as Solow estimated, but around 60 percent. Technology wasn’t quite so overwhelmingly important to America’s surge; smarter workers were also key.
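The arithmetic behind this revision can be sketched in a few lines of Python. The growth rates and the roughly 60 percent share come from the figures above; treating the "implied total" as a simple back-of-envelope quantity is an assumption for illustration, not a calculation from Crafts's paper:

```python
# Back-of-envelope growth accounting with the article's figures
# (per cent per year, 1899-1941). TFP is the residual: output growth
# minus the measured contribution of inputs.

tfp_kendrick = 1.7   # Kendrick's (1961) TFP growth estimate
tfp_crafts = 1.3     # Crafts's (2017) revised estimate

lq_kendrick = 0.3    # labour-quality growth, per Kendrick
lq_crafts = 0.8      # labour-quality growth after the education adjustment

# Crediting more growth to a measured input shrinks the residual:
revision = lq_crafts - lq_kendrick
print(f"Extra growth credited to worker quality: {revision:.1f} pp")
# ...which lines up closely with the 0.4 pp fall in TFP (1.7 -> 1.3).

# If TFP explains roughly 60% of productivity growth, the implied
# total (a hypothetical back-of-envelope figure) is:
share_crafts = 0.60
implied_total = tfp_crafts / share_crafts
print(f"Implied labour-productivity growth: {implied_total:.1f}% a year")
print(f"Solow's TFP share, for comparison: {7/8:.0%}")
```

The point the snippet makes concrete: reattributing half a percentage point of annual growth from the residual to worker quality is enough to pull TFP's share of the story down from Solow's seven-eighths to around three-fifths.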