Article by Chris Dodds, Icon Co-Founder and Managing Director, Growth & Innovation
At a trial in January 1813, 60 men were prosecuted for breaking machines with sledgehammers in a York factory. They were part of the Luddite movement protesting the automation of textile manufacturing, in which businesses replaced highly skilled workers with less skilled labourers who merely tended the machines.
This was the mid-point of the Industrial Revolution, when human and animal labour was replaced by machines running on steam and, eventually, electricity. Entire industries and job types collapsed and were replaced with the new. It was a disruptive and worrisome time for industrialising nations.
More recently, the Digital Revolution swept aside a century of established ways of working and, in turn, created yet another wave of new jobs, services and media.
We are now on the growing crest of the next wave, and it will bring a tsunami of change.
How did we get here?
In the internet age, “If you’re not paying for the product, you are the product”. Since the birth of the web, almost everything we publish and consume online has been scraped, synthesised, mixed and matched – creating an information treasure trove designed to engage, entertain and sell to us.
If we’d bothered to read the T&Cs before uploading our stuff to the cloud, we might have noticed the platforms’ reuse terms. Here’s Instagram’s:
When you share, post or upload content that is covered by intellectual property rights (such as photos or videos) on or in connection with our Service, you hereby grant to us a non-exclusive, royalty-free, transferable, sublicensable, worldwide licence to host, use, distribute, modify, run, copy, publicly perform or display, translate and create derivative works of your content.
“Derivative works” is the gateway to generative AI. It feasts on our data to create something new (or newish). We’ve given this data away freely. And now, with the emergence of large language models (LLMs), decades of sharing have birthed the internet’s first truly native construct.
Like many, I am nervous about what AI will do to our work and, more profoundly, about the meaning of work when much of it can be automated. There are also ethical issues surrounding source material, i.e. the scraping of artists’ work to train generative AIs like Midjourney, DALL-E and numerous music generators.
I’m old enough to have witnessed multiple waves of technologically charged change. When I was a kid, old print media said TV was set to create square-eyed, dumbed-down zombies.
I’ve seen concern as infinitely replicable digital artefacts overtook the analogue world, sampled music was decried as theft, and desktop publishing was framed as the death of design.
I’ve watched newspaper editors dismiss digital platforms and ignore a new generation’s ability to self-publish content at unbelievable scale and speed.
Before my time, critics considered radio a brainless diversion that eroded the listeners’ ability to think, inquire, and judge.
In the 18th century, when newspapers became common, people rallied against the printed page, arguing that socially isolated readers would be distracted from the spiritually uplifting group practice of getting news from the pulpit.
In the 1800s, people worried that travelling at speed would make rail passengers unable to breathe, or that they would be shaken unconscious by the vibrations, and that women riding bicycles would gain untold freedoms destined to destabilise the family unit.
And most famously, Socrates warned against writing because it would “create forgetfulness in the learners’ souls, because they will not use their memories.”
My point is that with every significant period of change comes great concern, and eventually, new things become routine. That doesn’t mean we shouldn’t monitor and place safeguards around new technologies, but we should remember that change is continual and inevitable.
Adopt and adapt with understanding
So what does this mean for the work we do? How do we protect creative copyright and jobs while leaning into change? What defines 'work' when knowledge and expertise become commodities available to anyone with a computer and internet connection?
Governments have a long history of playing catch-up with industry, but there are beginnings of legislation to protect creators, as well as grassroots action. The Nightshade tool interferes with how image-generating AI models process training data, making it far harder to copy an artist’s work. The Hollywood AI protests demand rights over one’s image and voice. Some educational institutions have given up on trying to identify AI-generated essays and are instead teaching students how to use AI responsibly and focusing on learning how to learn. Government-backed, global governance principles are being created, but it only takes one country to see opportunity and back newer, more powerful technologies.
We cannot afford to be passive at this point. We must be curious and actively involved in implementing change to help guide and adapt to a different future. We are heading into a period where people, businesses and governments are being destabilised by technology. Navigating the issues of disinformation, misinformation, AI hallucinations, deep fakes, ethics, and copyright is complex and requires understanding and action.
Icon is successful because we’re curious. We haven’t shied away from change and have remained useful because we continually adopt new skills. Our shared legacy won’t just be our impactful work – it’s ensuring our team has the skills and knowledge to remain relevant to our fields, clients, and careers.
Predicting the future is difficult, if not impossible. But there are converging trends and signals. I’m confident we are at the start of one of the most significant upheavals recent humanity has experienced. That’s a big statement, but when the majority of AI engineers and theorists predict a 50% chance that artificial general intelligence (AGI) will arrive within 30 years, we need to take notice. (AGI is a hypothetical type of intelligent agent that, if realised, could learn to accomplish any intellectual task that human beings or animals can perform.)
We’ll experience thousands of small and large technological steps leading to a currently theoretical singularity event. AI isn’t hypothetical anymore, and the hype is real. What started as automating discrete tasks such as content production, legal analysis, cancer diagnosis, driving and manufacturing will explode into fully autonomous workflows and workplaces.
Start-ups will replicate entire business models with AI. Wars and elections will be fuelled by AI-generated misinformation and subterfuge. Governments that slow AI down with policy will be overtaken by other nations, which will take jobs and income from competitors by being first to market. Like it or not, change is inevitable.
The future may feel dystopian and unfair, but it’s also impossible to stop. Hopefully, our data, content, and creative outputs will be legally recognised as a ‘resource’, with taxes or revenue-sharing models distributing a share of the profits. There’s also the idea of a Universal Basic Income (UBI) as a potential answer to job displacement from artificial intelligence.
Maybe there are lessons we can take from the Luddite protest of the early 1800s. Should we try to break the knitting machines to save our jobs or learn how to use and work with them instead? With universal governance rules and a global AI code of conduct, a middle ground is possible, coupled with the certainty that change is happening faster than anyone can predict.
Image credit: Bing Image Creator, powered by DALL-E 3