You Decide: Is it really 2030?
Published 9:02 am Wednesday, August 5, 2020
By Dr. Mike Walden
The calendar says 2020, but some say it’s really 2030. Huh? Did we suddenly lose a decade? I, for one, certainly hope not, because that would make me 79 instead of 69.
Actually, no one is saying it really is 2030. What they mean is the ongoing trends in the economy have accelerated so rapidly that the world we are looking at now is closer to what it would have been in 2030. In other words, the future is on us sooner than we thought.
What is the cause of this time travel? It’s the COVID-19 pandemic. As economists look at how businesses, households and workers have coped with the virus, many of us see outcomes we wouldn’t have expected until many years in the future.
Here’s a good example. Meat processing plants use large numbers of individuals working in close proximity to convert cattle, hogs and poultry into products supermarkets and restaurants can use. In fact, meat processing is an important economic sector in North Carolina.
When some of these plants had virus outbreaks, several economists – including me – speculated that down the road we would see the processing plants begin to replace workers with machines and technology. The logic was that machines and technology are immune to virus outbreaks, and thus when a future pandemic occurred, these high-tech food processing plants could continue operating.
I thought such a conversion was years away. Then a couple of weeks ago I read that some meat processing plants have already begun to introduce robots for some of their work. The article said that the robots weren’t yet ready to do all the processing work, but over time the robots would be refined and their tasks expanded.
Another example is remote working. Prior to the pandemic, remote working was expanding, but it was still relatively small, accounting for under 10 percent of the workforce. Futurists thought it would gradually expand, perhaps doubling between 2020 and 2030.
However, today there are estimates that perhaps 30 percent of employees are working remotely, and in the next decade that number could expand to as high as 40 percent. Once again, the trend was already there; the pandemic has simply pressed the accelerator.
The commonality of these two examples is technology. For years economists have talked about “technological unemployment” as a trend shaping the labor force. Indeed, in 2013 two Oxford University economists estimated almost half of today’s occupations could be susceptible to downsizing due to the substitution of technology for humans in doing work. While not all economists agree with those predictions, it looks as if the COVID-19 pandemic could make them more likely.
Technological unemployment is not new. It goes back at least as far as the 18th century when English textile workers opposed factory owners replacing them with machines. Once perfected, machines can usually produce more output in a given period of time than can humans. Plus, the machines don’t need rest or vacations.
Today there’s an additional reason for companies to consider replacing workers with technology. Technology and machines don’t get sick for long periods of time like people infected with COVID-19. Technology and machines also don’t spread sickness from machine to machine, and machines aren’t subject to stay-at-home orders during a pandemic.
Now, before you think I’m unaware of spreadable “computer viruses,” rest assured I’m not! I know that users of modern technology must run protective computer programs and be cautious about opening unknown attachments. Maybe someday, hopefully soon, we’ll have a similar safeguard for people, such as a vaccine, to protect us against human viruses. Unfortunately, just like computer viruses, human viruses can be totally different each year, thereby requiring an entirely new vaccine.
Therefore, until we have better protection from infections like COVID-19, I expect people and businesses won’t let their guard down. If they can, more workers will consider working remotely. Also, if they can, more businesses will look for ways to use fewer people and more machines and technology as a means of protecting against disease and pandemics.
A new study from two MIT economics professors raises an additional and important worry. If the technological unemployment spurred by COVID-19 occurs, it may dramatically reduce the number of jobs available for those without post-high school training. In one way, this is a plus, because most of those jobs pay low wages. However, such a situation also creates the challenge of retraining displaced workers for other, preferably higher-paying, occupations.
This raises the important question of how this retraining will occur. Will businesses do it on their own through “on-the-job training”? If so, what strings might be attached to prevent retrained workers from moving to other companies?
Or, will we need to rely mostly on our public educational system, including community colleges and four-year colleges and universities? If these institutions do carry the bulk of the retraining, then they will need to provide quick, inexpensive and focused education in specific work tasks. Workers losing their jobs to technological unemployment, especially those with families and dependents, won’t be able to spend multiple years in new learning.
The COVID-19 pandemic has been more than a health event. It has had a profound impact on our economy by pushing existing economic trends ahead faster than we could have ever imagined. So, if 2020 really is like 2030, do you like what you see? You decide!
Dr. Mike Walden is a William Neal Reynolds Distinguished Professor and Extension Economist in the Department of Agricultural and Resource Economics at North Carolina State University who teaches and writes on personal finance, economic outlook and public policy.