December 28th, 2016
(written by lawrence krubner, however indented passages are often quotes). You can contact lawrence at: firstname.lastname@example.org
I wrote on Hacker News:
In recent months, there has been a large number of posts on Hacker News extolling the coming robot (and/or AI) revolution. I’ve read that we are facing a jobless future because all the jobs will be automated.
All of that might be true, at some point in the future. The future is a very long time. I have no idea how the economy will work 500 years from now. But we can form reasonable opinions about what will happen during the next 10 years.
For all the rhetoric about a productivity revolution thanks to robots and/or AI, we should check in with reality and remember how bleak the present is. Productivity in the USA ran at a high level during the 1940s, 1950s and 1960s, but it stalled out in 1973.
“Labor productivity in the private nonfarm business sector rose by an average of 2.9 percent per year between 1948 and 1973. Beginning in the early 1970s, though, productivity slowed sharply, averaging only 1.5 percent growth between 1973 and 1995. Several factors can help explain the downshift. First, growth in the immediate post-war era benefited from the commercialization of numerous innovations made during World War II, including the jet engine. The early 1970s marked the point at which the wartime innovations became exhausted. Public investment also slowed, and the 1970s oil shocks and collapse of the Bretton Woods system caused dislocations that weighed on growth.”
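The gap between 2.9 percent and 1.5 percent sounds small, but compounding makes it enormous. A quick sketch (using only the figures from the quote above, over a 25-year horizon matching 1948–1973) shows the difference:

```python
# Illustrates how compounding magnifies the gap between 2.9% and 1.5%
# annual productivity growth. The rates come from the quote above; the
# 25-year horizon roughly matches the 1948-1973 period.

def cumulative_growth(annual_rate: float, years: int) -> float:
    """Total growth factor after compounding an annual rate for `years` years."""
    return (1 + annual_rate) ** years

years = 25
fast = cumulative_growth(0.029, years)  # post-war pace
slow = cumulative_growth(0.015, years)  # post-1973 pace

print(f"At 2.9%/yr, output per hour roughly doubles in {years} years: {fast:.2f}x")
print(f"At 1.5%/yr over the same span, it grows only {slow:.2f}x")
```

At the post-war rate, output per hour roughly doubles in a generation; at the post-1973 rate, it grows by less than half.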
For a while, the situation was better in Britain, but since 2008 it has been worse: Britain has seen a complete collapse in productivity growth since 2008.
All of this is difficult to reconcile with talk of a revolution in productivity. Maybe that revolution will happen, but as late as 2016, there was no evidence of it in any government statistic, in any of the advanced economies.
It’s also worth noting that if we do see an uptick in productivity growth (thanks to robots or any other technology), it might simply get us back to the kind of productivity we took for granted during the boom years of the 1940s, 1950s and 1960s. And those were decades of full employment. And that would be awesome.
Somebody offered this bit of nonsense:
Labor Productivity measurement depends on GDP, so it suffers from classic GDP mismeasurement error. GDP only measures $, not value, so it fails to measure technological improvements in a competitive industry — so it utterly fails to measure productivity (hence the charts showing that labor productivity growth is always tiny). Your computer today costs $1000, but it’s 1000x more useful and more complex than that $2000 computer of 30 years ago.
You are completely wrong. At least in the USA, the government does adjust for the increasing power of computers. Indeed, that is one of the arguments that productivity is really lower than what the government says (and therefore inflation is higher).
… the numbers are skewed by huge gains in real output in computer and electronics manufacturing that mainly reflect quality adjustments made by government statisticians, not increases in real-world sales.
But in fact, the author points out that it is altogether appropriate for the government to make those adjustments:
“Multi-factor productivity is simply a ratio of value-added to an index of inputs. Value-added in the manufacturing sector is a measure of the economic value of all the goods produced. Not the physical number, the economic value. And hence manufacturing MFP is a measure of how much economic value that sector produces – not the physical number of goods – per unit of input used.”
And even with those adjustments, productivity growth in the USA has been weak.
If you were to remove the quality adjustments that the US government makes for the computer and consumer electronics sector, then productivity would look even worse, inflation much higher, and growth even weaker, than what we typically think. Source