Artificial intelligence (AI) is all the rage, and anyone who has used ChatGPT will attest to why. Whilst AI technology dates back to the 1950s, recent advancements have placed its benefits into the hands of everyday users. Once seen as a dystopian dream, AI has proven to be a capable, efficient and innovative disrupter of industries.
However, for Web3 aficionados, the hype feels all too familiar. And just like Web3, every project and its uncle is trying to shoehorn AI in, clouding the core capabilities of this remarkable technology.
This article was prepared in collaboration with Franklin Templeton Digital Assets.
AI is a big deal; let's not overlook that. Decades after its debut, AI is making a resurgence as a result of:
- Advancements in machine learning
- Increased computing power
- Big data
- Cloud computing
- Open source ecosystem
- Investment and funding
- Practical applications, such as automating repetitive tasks and supporting healthcare
AI has reached a turning point for humanity. In Franklin Templeton's report, Quick Thoughts: Artificial intelligence — a Primer, chief market strategist Stephen Dover suggests AI could serve as a bookmark in human history.
Inventions and achievements such as Eli Whitney's cotton gin, Johannes Gutenberg's printing press, the Wright brothers' first flight and even Sputnik's space launch might "ultimately prove to be inconsequential compared to the arrival of artificial intelligence (AI)," Dover writes.
Dover caveats this bold statement by noting that it is shared by "many fans and foes alike of AI", but there are certainly elements of truth to its potential.
The progression of humanity has accelerated over the past century in large part due to harnessing technology to refine efficiencies and productivity. AI is the ultimate tool for this.
Although there are concerns about job displacement, AI's potential to catalyze transformative change across industries is undeniable. As painful as it is for us to admit, AI even has the potential to make (certain) writers redundant.
But disruption doesn't necessarily only lead to displacement. As Dover says, "Not all earlier innovations replaced workers." Indeed, AI offers a tool that increases productivity. Work smarter, not harder.
Dover points towards innovations such as the internet which allowed workers to improve their skills and productivity, and how the wheel "enabled people and goods to be transported to where they were most needed." The evolution of the telegraph to the telephone to the internet vastly improved communication capabilities that led to job creation too.
These echo what Web3 once pledged. Blockchain technology, too, held the promise of revolutionising industries.
Hype vs. Reality
If we've learned anything over the last year it's that the Web3 world fell victim to hype mentality. Whether it was the countless NFT projects overpromising and underdelivering or celebrities blindly endorsing FTX, it's safe to say we all got a bit too excited about Web3 and paid dearly for it.
Now, the Web3 industry is picking up its broken pieces, trying to (re)convince the Web2 world that it is still the future.
AI must not make the same mistake.
Artificial Intelligence Requires Intelligence
Just like Web3, AI is not a solution for everything. Whilst we're all excited about AI, we must be realistic in our ambitions for the tech.
A lesson we can learn from Web3 is to define business problems before implementing AI rather than forcefully trying to apply AI to unsuitable problems. Throughout last year, NFTs were seen as the latest, greatest tech. Consequently, brands from almost every industry were trying to capitalise on it.
Coca-Cola, Lamborghini, Nike, TIME Magazine, Air Europa, Pinkfong, McDonald's, Hasbro and even the Golden State Warriors are among the many firms that launched their own NFT projects. Despite the enthusiasm, none of them proved a genuine global success.
Companies have yet to embrace AI at the same scale. So far, AI has been adopted and leveraged by the typical tech titans (Amazon, Google, Apple, Alibaba, Meta, Microsoft, etc.) but not by household brands like Coca-Cola.
Speaking to Blockhead, Google Cloud's head of Web3, James Tromans, highlighted how the internet giant has been in the AI game for over a decade. "Google's been an AI-first company for 10 years plus," he said. "Ever since deep learning progressions were made in 2013, we've been looking to see how this technology can be leveraged in our own business successfully."
Google is now exploring how Web3 can help generative AI (Gen AI), which Tromans says "might be provenance."
"It might be the security of training data, it might be making sure the training data is correctly linked to certain models in an immutable way."
The firm is also looking at how AI can help Web3, specifically for code writing. "It's actually the same way AI will help Web2, which is helping us write better quality code," Tromans explains.
Google's use of AI in this respect is a result of years of thinking, understanding and development; processes that were too easily overlooked in Web3.
AI's widespread adoption will be successful if companies thoroughly consider its benefits, rather than simply jumping on the bandwagon.
The Dystopian View of AI
AI has repeatedly been portrayed as a threat to humanity, evoking images of robot uprisings. As Dover explains, "Captured in literature, movies and the arts, those vast dislocations, and the human pain and suffering they induced, are etched into the consciousness of peoples around the world. Modernity has always had a rough edge and has probably invited more trepidation than utopian dreaming over the several centuries of its existence."
As disruptive as it promised to be, no one was scared of Web3. The thought of pitching a blockbuster feature to Netflix about a blockchain-based uprising is laughable in itself.
Therefore, AI has another hurdle to overcome that its Web3 cousin skipped. Ethical concerns including biases, privacy, autonomy, accountability and manipulation have arisen. Again, these weren't issues Web3 faced.
AI thus has more barriers to break down before it is fully welcomed by society. That said, humanity has taken far bigger leaps to embrace technology than new-age innovations such as self-driving cars demand. Consider the transition from horse and cart to the car, or scarier still, the first commercial flights.
Society will take time to adjust to AI and to calm its fears. Tesla has been refining its self-driving capabilities for years, yet people are still wary of getting into a driverless car. If anything, society's cautious perception of AI is slowing its growth. But this could be a good thing: companies will have to seriously consider the impact of AI on their customers, preventing the kind of blind rush into the space that Web3 experienced.
As Dover quite rightly says, the test will be whether we can adapt to ensure AI is beneficial, rather than harmful.