At the start of the 20th century, the majority of Americans were farmers; today, less than 2% are. We've created new jobs. Most jobs that exist today didn't exist 100 years ago. In fact, in 1910, service jobs and agriculture together accounted for 70% of the US labor market. Today, service jobs account for almost 80% of jobs, with industry making up the remaining 20%.
Image Source: US Bureau of Labor Statistics
In the past 10 years alone, we’ve seen the advent of new jobs which are now widespread, including app developer, social media manager, cloud computing specialist, data scientist, sustainability manager and YouTube content creator, amongst a plethora of other roles.
A recent World Economic Forum report built on this and found that 65% of children entering primary school today will end up working in completely new job types that don’t exist yet. This aligns with what Kevin Kelly, founder of Wired Magazine, told me in a recent chat for the Future Squared podcast, in which he reiterated that by the end of the century, 70% of today’s jobs will have been replaced. Optimists like X-Prize founder Peter Diamandis point to the past as a predictor of the future insofar as the creation of new jobs goes. The idea is that technology continually creates new and higher-functioning jobs than it displaces, ultimately benefiting humanity.
This has been the case so far.
However, US labor market statistics in the 21st century paint a different picture.
Image Source: US Bureau of Labor Statistics
Some economists, such as Tyler Cowen (Future Squared ep. 124), suggest that we’re in the middle of a productivity paradox: just as with our transition from steam to electricity over a century ago, it takes decades for industry and people to reorganise around a new technology and master its efficient use.
One thing Kevin Kelly pointed to in our conversation was the concept of long-term thinking, or what he calls the ‘long now’. This is the opposite of what influential bitcoin personality Andreas Antonopoulos calls the ‘illusion of permanence’ - framing our beliefs about the world on the circumstances and conditions we’ve grown up with or become accustomed to. Just because we have known things to be a certain way throughout our short lifetimes (a pinprick in the evolution of humanity) doesn’t mean they will always be that way.
The modern company has been in existence for less than 400 years, the global financial system less than 150, the commercial internet less than 30, the smartphone less than 10.
It’s one thing to point to the recent past and suggest that the future will be a reflection of it, but some might say that’s just short-sighted.
While humanity adjusted to changes in technology throughout the 20th century, creating more jobs than it destroyed, there might be one fundamental difference this time around.
We didn’t have sophisticated, general artificial intelligence in the 20th century - we’ve never had it.
And while many like to dismiss sophisticated AI that can do all manner of things well (general AI), rather than just one thing well (narrow AI), as being ‘decades away’, they often seem to underestimate the rate at which technology is accelerating.
In the period from 2010 to 2016, we added 100 times more transistors to a microchip than we did throughout the entire 1990s. This is Moore’s Law in action - the doubling of computing power every 18 to 24 months. The aggressive rate at which technology and AI are accelerating prompted the popular website Wait But Why to publish a piece called The Road to Superintelligence, best encapsulated by the following graph.
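To get a feel for how quickly this compounding adds up, here is a rough back-of-the-envelope sketch (illustrative arithmetic only, not actual chip data): repeated doubling every 18 to 24 months multiplies capacity by roughly 100x per decade at the faster end of that range.

```python
# Back-of-the-envelope sketch of Moore's Law style compounding:
# capacity doubles once every fixed period, so over a span of
# `years` it grows by 2 ** (years / doubling_period_years).

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Overall multiplier after `years` of repeated doubling."""
    doublings = years / doubling_period_years
    return 2 ** doublings

# A decade of 18-month (1.5-year) doublings is roughly a 100x gain;
# 24-month (2-year) doublings give 32x over the same decade.
print(round(growth_factor(10, 1.5)))  # ~102
print(round(growth_factor(10, 2.0)))  # 32
```

The point of the sketch is that exponential doubling, even at a fixed modest rate, produces two-orders-of-magnitude gains within a single decade - which is why linear intuitions about the pace of AI progress tend to fall short.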
Image Source: WaitButWhy
A recent Harvard Business Review article examined what AI can and can’t do - right now.
It ultimately concluded that “the biggest harm that AI is likely to do to individuals in the short term is job displacement, as the amount of work we can automate with AI is vastly bigger than before”.
A survey of experts at a recent AI conference concluded that by 2050, AI will be able to do anything and everything a person has ever done in all of history - that’s a big, bold claim. This means everything - physical work, academic, philosophical and creative.
This echoes what Ray Kurzweil, author of The Singularity is Near, thinks when he suggests that by 2045, AI will surpass the brainpower of all human beings combined.
So, assuming this holds true and AI can do ‘everything a person has ever done’, to sit back and say that “technology continually creates new and higher functioning jobs than it displaces” doesn’t seem to take into account the realities of artificial intelligence.
The new jobs we created needed humans to do them. The new jobs that we, or our ‘robot overlords’ create, might not need us.
While we can prophesy about the future, what we can say with certainty right now is that disruptive technology is already empowering companies to do much more with much less.
A decade ago, Blockbuster had a market cap of US$5B, 9,000 physical stores and 60,000 employees worldwide.
Today, Netflix has a market cap of US$70B, 5,000 employees, over 100 million customers and no physical stores to speak of.
Then there’s the oft-quoted fact that in the year that Kodak filed for bankruptcy with thousands of employees, Instagram was acquired by Facebook for a billion dollars and had only 13 employees.
So with this in mind, I’m pondering the following questions - on the proviso that we don’t replace today’s jobs, and that the number of jobs dwindles to a fraction of what exists today as AI takes on more and more of what humans can do, only does it better and at a fraction of the cost:
What do you think?