Work is set for a big shakeup. Not just different industries or new technology, but a new world of work defined by artificial intelligence.
It raises many questions about what we value, about education, and about how we might spend our time when work may be only for the very few. These and other visions of the future, and how we prepare for it, are being studied by technologists, social scientists, economists, and philosophers who are starting to imagine the post-AI world.
The future is a lot closer than we think, as several studies and reports just this month have affirmed.
- In a workplace study, 60 percent of full-time employees admit to feeling stressed at work “most of the time.” Forty-three percent blame the fear of losing their job to artificial intelligence systems.
- Being worried about AI systems is not frivolous. Researchers from Oxford and Yale released a report this month analyzing “When Will AI Exceed Human Performance?” Gathering responses from more than 350 scientific researchers, they conclude that by 2062 there is a 50 percent chance that every task humans now perform can be done better by AI. If that sounds like a long way off, consider that your 5-year-old will be 43; your grandchildren may just be graduating from high school.
- Then what will the jobs of the future look like? Ask some respondents to a study conducted by the Pew Research Center and Elon University’s Imagining the Internet Center. 1,408 respondents were asked to discuss their expectations of what would evolve by 2026 – just nine years from now. “A noteworthy share of respondents … look into the future and see a world where most of the work is done by robots and automated processes, as humans are replaced by algorithm-driven work solutions. Some of these people dismiss the idea that any kind of training ecosystem is likely to matter in a world where they believe fewer and fewer people will work,” the study’s researchers reported.
- And just last week, Amazon purchased Whole Foods, setting up a realistic scenario that could change the face of grocery shopping while eliminating thousands of jobs, from shelf-stockers to baggers to checkout-line operators.
With so much modern art, culture, and film focusing on visions of a post-apocalyptic future where humans are enslaved by technology, it’s easy – but risky – to write off the very real prospect of a major change in our societal compact about humanity and work as just some sci-fi plot.
In the Oxford-Yale study, 67 percent of the researchers surveyed said that AI progress had accelerated in recent years.
Although AI is certainly an aspect of robotics, it will not be robots that take over jobs. It will be systems of computers and data that eventually achieve HLMI (high-level machine intelligence).
For years, we have talked about “trickle-down,” and it is finally in this arena that we might actually see it work.
Pew outlines this scenario:
Auto-drive vehicles eliminate the need for paid drivers, from school buses to semis. Although this throws many people out of work, it has the positive effect of reducing accidents. That, in turn, reduces the need for police (fewer tickets and accidents) and for segments of the health care industry (ER doctors, triage specialists, and emergency rooms).

3-D printing may finally help realize the dream of “Star Trek” fans everywhere. OK, we won’t be beaming things around the galaxy, but we may be printing large-scale materials for construction and manufacturing. That drastically impacts imports and exports and shifts the balance of economic power and influence.
The Pew report points out the importance of preparing now for what we should consider inevitable changes. The timeline may shorten or extend, but the end result seems unavoidable.
A major subtheme of the study is the role of education, and a significant view is that education as we see it today “will not meet 21st-century needs by 2026.”
“Technology (particularly AI) is moving faster than rational thinking about our future workforce,” said a respondent.
“How to learn and how to lead in online and offline contexts and how to translate those ideas to practical problems must be placed at the core of new programs. Success will require huge public investment and a reimagining of what we value in education,” noted MIT Ph.D. and researcher Erhardt Graeff. “This is hard; the problem and our responses cannot be reduced to pushing STEM or vocational training at scale. We can’t throw out the important societal and civic role played by liberal education by chasing technical skills that might be obsolete in a few years.”
Workers of the future will need to focus on what makes humans unique: “Workers of the future will learn to deeply cultivate and exploit creativity, collaborative activity, abstract and systems thinking, complex communication, and the ability to thrive in diverse environments,” Pew states.
That also means mastering the technology to harness its power, not to be yoked to it. Today, we put an enormous emphasis on learning to code. That’s a good skill to have, as far as it goes. But coders are the coal miners of the 22nd century: masters of an outdated skill that, if the Oxford-Yale study is on target, will be performed by AI itself by 2104.
Even today, armies of coders are employable, but advancement beyond a cubicle and a computer is contingent on a number of higher-level analytical and collaborative skills. Our emphasis on learning to code as the culmination of our education is misplaced. To stake out a place — today and in the future — learn to think.