At the end of this historic and difficult year, one somewhat surprising question has been on my mind.
As hard as this past year has been, is it possible that a new “Roaring Twenties” is in our near future? A time when the economy hums with productivity (and perhaps too, markets continue to soar)?
Two prominent economists seem to think so.
Recently, George Mason Professor of Economics Tyler Cowen described what is termed Total Factor Productivity (TFP) – the portion of economic output growth not explained by inputs of labor and capital, and a rough proxy for technological and organizational progress. He suggests that, in 2021 and beyond, we may see the highest TFP ever.
To understand how, Stanford Professor of Economics Erik Brynjolfsson points to the “Productivity J-curve.” The first section of the “J” represents the introduction of new technologies, investments that are often costly. A period of flattening or even declining productivity typically follows. But as time moves along the “J,” those technologies take ever greater effect, and processes tend to become more efficient. Productivity swings upward.
Both economists feel that technologies and methods ranging from artificial intelligence (AI) to data analytics and automation may be poised to finally come into their own and pull us out of what has been a long period of stagnation in the realm of productivity. What Brynjolfsson and co-author Andrew McAfee, a research scientist at the Massachusetts Institute of Technology (MIT), have called the “Great Restructuring” may be about to move into a new phase.
As a new piece in Axios puts it, while 2020 certainly has not been an easy year, “the economy may be close to consolidating years of technological advances – and ready to take off in a burst of productivity growth.”
The Rules of the World
The types of influential innovations the likes of Brynjolfsson, Cowen, and McAfee are thinking about are best illustrated by looking at the latest iterations of some key technologies. All of them were announced during this, our Covid year.
First consider DeepMind’s AI agent MuZero, a “deep reinforcement learning” system built on many-layered neural networks, in which the machine teaches itself via trial and error and is rewarded for success rather than told what to do. It assesses its environment and executes “tree searches” – examining many steps ahead to determine how to proceed toward a best outcome.
Part of what makes MuZero unique is that it only models glimpses of the environment essential to its decision-making process, rather than a more sweeping approach (DeepMind offers an apt example – “knowing an umbrella keeps one dry is more useful to know than modelling the pattern of raindrops in the air”). In essence, MuZero can extract key insights from less data than other AI techniques. It seems to better intuit the rules of the world than earlier tools. Potential practical uses of such capabilities include video data compression and next generation virtual assistants.
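The lookahead idea behind such tree searches can be sketched in miniature. The toy below is my own hypothetical illustration, nothing like MuZero’s learned model or scale: an agent simulates a few moves ahead in a simple number game and picks the immediate action whose best reachable outcome is highest.

```python
# Toy illustration of lookahead tree search (not MuZero itself):
# simulate a few moves ahead and pick the action leading to the
# best reachable score.

def best_score(state, depth):
    """Best score reachable from `state` within `depth` further moves."""
    if depth == 0:
        return state  # score the position as-is
    # Hypothetical actions in this toy game: add 1, or double.
    successors = [state + 1, state * 2]
    return max(best_score(s, depth - 1) for s in successors)

def choose_action(state, depth=3):
    """Pick the immediate action whose lookahead value is highest."""
    actions = {"add": state + 1, "double": state * 2}
    return max(actions, key=lambda a: best_score(actions[a], depth - 1))

print(choose_action(3))  # doubling opens the path to the largest score
```

The point of the sketch is only the shape of the computation: evaluate candidate futures several steps deep, then act on the comparison, which is the “examining many steps ahead” the article describes.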
A second example struck me this past summer, when I wrote about AI lab OpenAI’s release of GPT-3, the most powerful language-generation software yet created. It digests huge amounts of text from the web, analyzing which letters and words tend to follow one another, “learning” how to produce text.
Potential applications of GPT-3 include everything from chatbot improvement to website design and medical prescription. OpenAI offers businesses paid access to GPT-3 via the cloud. Trevor Callaghan, a former employee at rival DeepMind, wondered, “If you assume we get NLP (natural language processing) to a point where most people can’t tell the difference, the real question is what happens next?”
Finally, 2020 also saw news of the release of a new computer chip from Apple called the M1. The M1’s integrated graphics deliver a significant increase in graphics performance combined with lower power consumption, and the chip features a unified memory architecture. A November article on the M1’s release insists that:
This is the most astonishing move in technology in about 45 years…. This is all being manufactured by Apple on a 5nm silicon die…. Apple has successfully moved a dozen chips to a single chip with substantial space savings, power savings, and speed increases. In many ways this has surpassed Moore’s Law.
As a piece in The Verge put it, “The conversation has flipped instantly: it’s no longer ‘why would you take a gamble on Apple’s new, unproven processor’ but ‘how will competitors like Intel, AMD, and Qualcomm respond?’”
Better Days Ahead?
As the aforementioned Axios piece also notes, according to an October World Economic Forum survey, more than 80% of global firms plan to accelerate the digitization of business processes and expand remote work. Fully 50% of those firms plan to accelerate automation. And some 43% expect such initiatives may well reduce their workforces. That last point doesn’t sound like good news, yet such a result would imply an increase in productivity, which in turn may eventually create completely new opportunities for workers.
It’s also important to note that the surveyed firms are global, that the digital revolution is a global phenomenon, and that a number of emerging economies may well benefit more and more from this fact over time. As Morgan Stanley strategist Ruchir Sharma writes in The New York Times:
From the steam engine to cars, the economic effect of tech revolutions has tended to gain momentum over time and peak decades after the original invention. The digital revolution is young; its biggest influence on the growth of emerging economies is most likely still to come…. Lacking the means to spend, poorer countries are pushing reforms that, while often unpopular, should boost productivity and promote growth.
Still, back here in the United States, it’s important to remember that it’s going to be a long, tough few months as the holidays come to a close and winter takes hold. The ravages of Covid-19 will continue for some time.
To avoid exposure to the virus, many Americans are sure to remain homeshored for months to come.
Which brings me back to the Roaring Twenties, and a scene inside the lavish home of Jay Gatsby, protagonist of F. Scott Fitzgerald’s classic 1925 novel, The Great Gatsby. During a booze-filled party at Gatsby’s mansion, narrator Nick Carraway wanders into Gatsby’s library, where he encounters a spectacled man he refers to as “Owl Eyes.”
The mysterious owl-eyed man has been drinking and appears to be pleasantly surprised at all the books on the shelves around him.
Upon further inspection, however, the owl-eyed man discovers that the books, though authentic, have uncut pages. The books are impressive to look at but remain unread. Fitzgerald seems to suggest that Gatsby’s carefully crafted image might be an illusion, that the American Dream might not be all that it seems.
It’s enough to make one wonder – if Erik Brynjolfsson, Tyler Cowen, and others are correct, and we someday look back on the coming decade as a new kind of Roaring Twenties characterized by a productivity boom and a 21st century brand of techno-consumerism, what will eventually be revealed about the dazzling technological innovations of our time?
Will they spawn vast new opportunities for workers as well as enterprises, or will that dream prove illusory? What will be the reality of our new Gatsby, our updated American Dream?
Nobody can see into the future, not even an owl-eyed man. But Hard Times don’t last forever. Should a new Roaring Twenties indeed come to pass, I’m betting it ends better than the last one.
Happy New Year.
Image: from the 2013 movie, ‘The Great Gatsby’