This piece was originally published on LinkedIn
"Homo sapiens will be split into a handful of 'gods,' and then the rest of us."
"Back to the Future" Day was roughly three weeks ago. As fans around the globe donned Marty McFly and Doc Brown costumes, many compared the 2015 technology imagined in the sci-fi classic to where our technology stands today. From the Lexus Hoverboard to Microsoft's HoloLens, we can see that our reality isn't too far off from the future predicted in the 1980s trilogy. However, not all sci-fi films were so optimistic. The first "Terminator" film was released in 1984, just one year before the first "Back to the Future." The film paints a grim picture: malicious artificial intelligence systematically attacking and destroying the human race.
Stephen Hawking, Bill Gates, and Elon Musk have all warned us about artificial intelligence. "Success in creating AI would be the biggest event in human history," wrote English theoretical physicist Hawking in an op-ed, which appeared in a 2014 issue of The Independent. "Unfortunately, it might also be the last, unless we learn how to avoid the risks. In the near term, world militaries are considering autonomous-weapon systems that can choose and eliminate targets." Then, in a 2014 BBC interview, Hawking added, "Humans, limited by slow biological evolution, couldn't compete and would be superseded by AI."
In spite of these warnings, the technology sector speeds forward, creating fantastical machines that could easily be mistaken for Hollywood movie props (The robot in the photo above is not from an upcoming action sci-fi movie -- it's Japan's Kurata mech). While robots and artificial intelligence lead to discussions of morality and the human pursuit of power and immortality, such developments present economic concerns as well. A new report from Bank of America Merrill Lynch forecasts a catastrophic number of jobs being eliminated by AI: up to 35% of all workers in the UK and 47% of those in the US, including white-collar jobs.
Are these numbers a cause for panic? Is this the advent of a mechanized takeover? Perhaps; but history has a different tale to tell. From the Industrial Revolution in 19th-century England to the print unions protesting computers in the 1980s, there have always been people fearful of technological advancements. An even more fascinating trend is that the economy continues to produce new jobs in the wake of these developments.
"The poster child for automation is agriculture," said Calum Chace, author of Surviving AI as well as the novel Pandora's Brain. "In 1900, 40% of the US labor force worked in agriculture. By 1960, the figure was a few percent. And yet people had jobs; the nature of the jobs had changed.
"But then again, there were 21 million horses in the US in 1900. By 1960, there were just three million. The difference was that humans have cognitive skills -- we could learn to do new things. But that might not always be the case as machines get smarter and smarter."
"Humans, limited by slow biological evolution, couldn't compete and would be superseded by AI."
What if we're the horses to AI's humans? The combination of robotics and artificial intelligence is advancing at incredible speeds. MIT recently released a video of an autonomous drone flying at 30 miles per hour, avoiding obstacles -- all without a pilot, using only its onboard processors, essentially learning its environment throughout the course of its flight. MIT also built bipedal robots designed to soften their impact when falling over as well as a "robot cheetah," which can jump over obstacles of up to 40 centimeters without help.
Earlier this year, Toshiba released Aiko Chihira, an android, on the floor of a Tokyo department store. She was so lifelike that many shoppers confused her for a human being. "She's 165 centimeters [5 feet 5 inches] tall ... and she's supposed to be 32 years old," designer Hitoshi Tokuda said. "Her movement is done by 30-times-per-second data [transfers]," and she is powered by 43 motors, making her movements so subtle that she appears "90 percent" human-like.
Add to that Moore's Law, an observation made by Intel co-founder Gordon Moore in 1965, which states that the power of microprocessor technology doubles and its costs of production halve every 18 months, and you can see why fear of a robot revolution isn't so far-fetched.
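To put that compounding in perspective, here is a minimal sketch of the growth rate implied by the 18-month doubling period quoted above (the figure is the article's paraphrase of Moore's Law, not Moore's original wording, which concerned transistor counts):

```python
def moore_growth(years, doubling_period_years=1.5):
    """Return the multiplicative growth in processing power after `years`,
    assuming power doubles every `doubling_period_years` (1.5 by default,
    per the 18-month figure cited in the article)."""
    return 2 ** (years / doubling_period_years)

# Compounding at this rate yields roughly a 100-fold increase per decade
# and roughly a 10,000-fold increase over two decades.
print(f"10 years: ~{moore_growth(10):.0f}x")
print(f"20 years: ~{moore_growth(20):.0f}x")
```

A steady doubling like this is exponential, which is why a capability that looks like a toy today can look like a labor-market disruption a decade later.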
However, the invasion of AI into our daily lives started well before autonomous drones and lifelike androids. From cooking systems with vision processors that can determine how cooked a burger is to robo-advisors that provide automated, algorithm-based portfolio management advice without the use of human financial planners, artificial intelligence has been an active component of human life for a long time.
The Associated Press has sports and business news stories written automatically by a system developed by Automated Insights. Even doctors and lawyers may be under threat. About 570,000 "robo-surgery" operations were performed last year. Oncologists at the Memorial Sloan-Kettering Cancer Center in New York have used IBM's Watson supercomputer, which can read one million textbooks in three seconds, to help them with diagnosis. Meanwhile, advanced databases can sort through giant files faster than any lawyer. The fact of the matter is that computers and mechanization are continuously displacing work.
So how will robotics and AI impact our jobs, our economy, and our society? In a 2013 paper, "The Future of Employment: How Susceptible Are Jobs to Computerization?", Carl Benedikt Frey and Michael Osborne point out that even while some jobs are replaced, new ones are quick to spring up, often reallocating labor to focus more on service and interaction with and between people. In an interview with the Observer, Frey said, "The fastest-growing occupations in the past five years are all related to services."
However, Frey finds that technology is leading to a scarcity of leading-edge employment. Fewer and fewer people have the skills needed to work in the front line of its advances. "In the 1980s, 8.2% of the US workforce were employed in new technologies introduced in that decade," he writes. "By the 1990s, it was 4.2%. For the 2000s, our estimate is that it's just 0.5%. That tells me that, on the one hand, the potential for automation is expanding -- but also that technology doesn't create that many new jobs now compared to the past."
This trend worries people like Chace. "There will be people who own the AI, and therefore own everything else," he says. "Which means Homo sapiens will be split into a handful of 'gods,' and then the rest of us.
"I think our best hope going forward is figuring out how to live in an economy of radical abundance, where machines do all the work, and we basically play."
Might we already be part of the way there? As automation and AI become more accessible, wouldn't productivity gains lead to increased leisure time? Chace warns that a work-less lifestyle also means "you have to think about a universal income" -- a basic, unconditional level of state support.
Perhaps it is still too early to properly assess the social effects of AI. Technology moves fast; figuring out what happened in the past is difficult enough, let alone what the future will bring.
In business and academic circles, productivity is often hailed as the driver of economic growth. The 18th-century scholar Thomas Malthus notoriously predicted that a rapidly rising human population would result in war, plague, and famine. But Malthus failed to take into account the drastic technological changes -- from steam-powered transportation to enhancements in agricultural technology -- that would allow the production of food and other staples to expand even more rapidly than the number of hungry mouths. The key to economic progress is the ability to do more with the same investment of capital and labor.
The introduction of robots has reduced the amount of time and resources needed in the production process. Yet as workers are laid off from production lines, new jobs are created elsewhere. To date, fears of mass unemployment as a result of a machine takeover are as unfounded as those that have always accompanied other great technological revolutions.
This may all sound hunky-dory, but there is an important caveat. The relatively low-skilled factory workers who have been displaced by robots are rarely the same people who miraculously become app developers or analysts. Technological progress is already suspected of exacerbating inequality, a trend Bank of America Merrill Lynch maintains may continue in the future.
Massive economic benefits may be reaped from the rise of machines and AI; but unless the transition is carefully managed, those very benefits may be monopolized by wealthier, upper-class members of our society, exacerbating inequality and perpetuating class-based issues.
- Patrick Lin