4th Industrial Revolution: can lessons from the past teach us how to prepare for the future of disruption?
“The more you know of history, the more liberated you are.” – Maya Angelou
Digital disruption has brought the world to the verge of another major technological and societal shift, one that we’re still grappling with. As they say, history repeats itself – so what can it tell us about the future of mankind?
A brief history lesson
- The 1st industrial revolution (18th century) was ushered in by the invention of the steam engine. Major urbanisation followed, shifting economies in the developed world from agricultural to manufacturing-based. Machines like the steam engine and the water-powered spinning frame dehumanised physical labour. Humans, unable to compete with machines on productivity, moved to more intellectual forms of work.
- The 2nd industrial revolution (19th century) occurred with the rise of steel, oil, electricity and innovations like the telephone. With machines making physical labour vastly more efficient, humans continued to engage in mentally intensive work, such as engineering and the sciences.
- The 3rd industrial revolution (end of the 20th century) saw the rise of digital technologies, including the PC and the internet. This is significant because it set the ball rolling for the dehumanisation of not only physical labour but intellectual labour as well.
The 4th industrial revolution builds on this digital revolution and is marked by emerging technologies: AI, nanotechnology, 3D printing, quantum computing and autonomous vehicles. These technologies are able to outperform humans where intelligence is the key component of production.
As Kenneth Rogoff, Professor of Economics and Public Policy at Harvard University, said, “Since the dawn of the industrial age, a recurrent fear has been that technological change will spawn mass unemployment. By and large, neoclassical economists’ prediction that people would find other jobs, though possibly after a long period of painful adjustment, has been proven correct – but for how much longer?”
The difference this time is the sheer pace and scale at which it’s happening. In an article on The Conversation, Moshe Y. Vardi, professor of computer science at Rice University, points out that when industrial machinery arrived in the 18th century, the economic growth of the following two centuries vastly exceeded that of the preceding millennia. The 4th industrial revolution is likely to usher in a similar tipping point, where the economic growth of the past seems insignificant compared to the productivity potential of the future.
Unlike in the 19th century, however, the effects of globalisation and automation are spreading across the developing world. A recent report from the International Labour Organization found that more than two-thirds of Southeast Asia’s 9.2 million textile and footwear jobs are threatened by automation.
Google’s director of engineering and notable “future teller”, Ray Kurzweil, says, “While there will be jobs lost, newer ones will be created. What these are, I obviously don’t know since they haven’t been invented yet.”
The question is: where do we move once our brains can no longer compete with machines? Where do we shift our efforts? Andrew Fursman and Georgia Frances King suggest an answer in an article in Quartz:
“Perhaps we are not in a fourth industrial revolution that will simply progress the roles of humans in production. Perhaps we are in the final stages of a grand process to create and automate all the tasks necessary to sustain a stable society. Perhaps jobs are not the source of human dignity. Perhaps escape from the burden of labour is not unemployment, but freedom.”
Beware the hype
Hilary Sutcliffe, Director of SocietyInside, states in a World Bank article that we must be cognisant of the hype around disruptive technologies. New technologies need new names and metaphors to explain them. Often this language leans heavily on military, engineering or IT-based metaphors of control and dominance over nature, or of scientific precision, which don’t reflect reality, certainly not in the early days. “This love of the macho, domineering metaphor brings with it unsettling comparisons and is not shared by everyone. The way funding works is that in order to get the money, scientists and businesses have to massively exaggerate the potential benefit of their ‘ology’ – the media love it, funders get excited and the money flows.”
Descriptions such as ‘post-human’ or ‘bio-hybrid human’, used to refer to someone with a prosthetic limb or a new type of artificial heart, are exaggerations, more fun for Avengers fans than a reflection of reality. So are terms like the ‘singularity’ (predicted by macho scientists to happen in 30 years), the point at which robots and people will merge and become post-human. “This hype is distracting the focus of what good robots can do for humankind now,” says Sutcliffe. “What will be the result of the hype about the precision of CRISPR and gene editing when it becomes clear it is not a panacea for every genetic disease? Are we wasting time debating the ethics of science fiction when we could be discussing the not inconsiderable impact of the reality?”
Our increased access to (sometimes fake) information often makes it difficult to see where real evidence of potential benefit or harm lies. It’s human nature to cherry-pick what we choose to believe from the sheer quantity of information out there.
As Sutcliffe says, “It is important that as a society we do debate the potential benefits and the acceptable risks of technology thoughtfully and wisely. The history of innovation shows us that, whilst we are brilliant and inventive, for every act of creation and innovation there exists the potential, also, for our undoing.”
The best we can do is prepare ourselves to seize the opportunities that emerging technology brings. LCIBS offers access to renowned thought leaders and business practitioners on the cusp of future trends. Our Emerging Tech executive programme equips future leaders to become captains of industry and chart a path of societal change for good.
Author: Brett Kilpatrick