The Intersection of AI and Cryptocurrency: This article is based on “AI <> Crypto Projects That Aren’t Complete Bullsh*t” by a former Bankless researcher, compiled and translated by TechFlow.
Table of Contents:
A High-Level Overview of AI Stacks
Open Source Empowering Crypto
Decentralized Physical Infrastructure Networks (DePINs)
Data Networks: The Case of Grass
GPU Networks: The Case of io.net
Utilizing Incentive Structures
Building AI Networks: Exploring Bittensor
Remembering Incentives: Exploring Morpheus
How to Tell If a Project Is Completely Useless
Type 1 – Crypto Assisting AI
Type 2 – AI Assisting Crypto
Where Are We Headed?
In the search for new alpha, we inevitably wade through some garbage. When a project can quickly raise six to seven figures with little more than a semi-coherent description and decent branding, speculators pile into every new narrative. With traditional finance joining the AI trend, the “Crypto AI” narrative has made this problem worse.
The majority of these projects have two main issues:
Most crypto projects don’t need AI.
Most AI projects don’t need cryptocurrencies.
Not every decentralized exchange (DEX) needs a built-in AI assistant, and not every chatbot needs an accompanying token to accelerate its adoption curve. The forced marriage of AI and cryptocurrency nearly drove me to despair when I first dug into this narrative.
The bad news? If we continue down the current path and further centralize this technology, it will ultimately fail, and a flood of fake “AI x Crypto” projects will hinder the turnaround.
The good news? There is light at the end of the tunnel. Sometimes AI genuinely benefits from cryptoeconomics, and in some crypto use cases, AI solves real problems.
In today’s article, we will explore these critical intersections. The overlap of these niche innovative ideas creates a whole that is greater than the sum of its parts.
Here is my perspective on the different verticals in the “Crypto AI” ecosystem (for a deeper understanding, refer to Tommy’s article). Note that this is a highly simplified view, but I hope it helps lay the foundation.
From a high-level perspective, this is how it works together:
Massive data collection.
Processing this data to enable machines to understand how to ingest and apply it.
Training models on this data to establish a general model.
Then fine-tuning these models to handle specific use cases.
Finally, these models are deployed and hosted for applications to query them for useful implementations.
All of this requires a significant amount of computing resources, which can be executed locally or obtained from the cloud.
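The five steps above can be sketched as a toy pipeline. Everything here is illustrative: the function names and the stub “models” are my own invention, not any real framework’s API.

```python
# Hypothetical sketch of the five-stage AI stack described above.
# All names are illustrative; the "models" are plain dicts, not real ML.

def collect(sources):
    """Stage 1: massive data collection from raw sources."""
    return [doc for src in sources for doc in src]

def preprocess(raw_docs):
    """Stage 2: clean and structure data so a model can ingest it."""
    return [doc.strip().lower() for doc in raw_docs if doc.strip()]

def pretrain(corpus):
    """Stage 3: train a general-purpose base model (stubbed)."""
    return {"type": "base", "seen_docs": len(corpus)}

def fine_tune(base_model, task_data):
    """Stage 4: specialize the base model for one use case."""
    return {**base_model, "type": "fine-tuned", "task_docs": len(task_data)}

def deploy(model):
    """Stage 5: host the model behind a query interface."""
    def query(prompt):
        return f"[{model['type']} model] response to: {prompt}"
    return query

sources = [["  The Cathedral and the Bazaar ", ""], ["GPU markets 101"]]
corpus = preprocess(collect(sources))
model = fine_tune(pretrain(corpus), task_data=corpus[:1])
ask = deploy(model)
print(ask("what is DePIN?"))
```

Each stage consumes the previous one’s output, which is why compute (local or cloud) is the common denominator across the whole stack.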
Let’s explore each of these areas, focusing specifically on how different cryptoeconomic designs can actually improve standard workflows.
The debate between “closed-source” and “open-source” development methods dates back to the Windows-Linux debate and Eric Raymond’s famous “The Cathedral and the Bazaar” theory. While Linux is widely used among enthusiasts today, about 90% of users choose Windows. Why? Because of incentives.
At least from an outsider’s perspective, open-source development has many benefits. It allows the most people to participate in and contribute to the development process. But in this headless structure, there is no unified direction: no CEO actively pushing for maximum adoption to maximize the bottom line. An open-source project can splinter into a “hodgepodge” of forks at every disagreement over design philosophy.
What is the best way to align incentives? Build a system that rewards behaviors that advance the goal. In other words, put money in the hands of the actors who bring us closer to it. With cryptocurrencies, these incentives can be hard-coded into the protocol itself.
Let’s look at some projects that are doing this.
“Oh, come on, not this again?” Yes, I know the DePIN narrative is almost as overused as AI itself, but please bear with me for a moment. I am willing to believe that DePINs are a genuinely transformative use case for cryptocurrencies. Think about it.
What are cryptocurrencies really good at? Removing intermediaries and incentivizing activities.
The original vision of Bitcoin was peer-to-peer currency designed to exclude banks. Similarly, modern DePINs aim to exclude centralized powers and introduce provably fair market dynamics. As we will see, this architecture is ideal for crowdsourced AI-related networks.
DePINs use early token issuance to increase the supply side (providers), hoping to attract sustainable consumer demand. This aims to solve the cold start problem of new markets.
This means that early hardware/software providers (“nodes”) earn mostly tokens and a little cash. As users (in our example, machine learning builders) pay to use these nodes and bring in cash flow, that revenue gradually offsets the declining token issuance until a fully self-sustaining ecosystem emerges (which may take years). Early movers like Helium and Hivemapper have shown this design works.
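A minimal sketch of that handoff, with made-up numbers (no real protocol’s schedule): token emissions decay geometrically while fee revenue compounds, and the network becomes self-sustaining once fees alone exceed emissions.

```python
# Toy model of DePIN bootstrapping. All parameters are illustrative
# assumptions, not any real protocol's emission schedule.

def provider_income(epoch, initial_emission=1000.0, decay=0.85,
                    initial_fees=50.0, fee_growth=1.25):
    """Provider income at a given epoch: decaying token emissions
    plus compounding fee revenue from real usage."""
    emissions = initial_emission * decay ** epoch
    fees = initial_fees * fee_growth ** epoch
    return emissions, fees

def crossover_epoch(max_epochs=100, **kwargs):
    """First epoch where fees alone exceed token emissions."""
    for t in range(max_epochs):
        emissions, fees = provider_income(t, **kwargs)
        if fees > emissions:
            return t
    return None

print(crossover_epoch())  # with these toy parameters: epoch 8
```

The interesting design question is tuning the decay and growth rates so the subsidy lasts long enough to attract real demand but not so long that mercenary supply farms the token.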
GPT-3 is said to have been trained on 45TB of pure text data, equivalent to about 90 million novels (but it still can’t draw a circle). GPT-4 and GPT-5 require even more data than the surface web has to offer, so calling AI “data-hungry” is the understatement of the century.
Getting access to this data is extremely challenging if you are not a top player (OpenAI, Microsoft, Google, Facebook). The common strategy for most people is web scraping, and that works fine until you try to scrape a large number of websites with an Amazon Web Services (AWS) instance, which quickly hits rate limits. This is where Grass comes in.
Grass coordinates over two million devices that scrape websites from users’ IP addresses; the network collects, structures, and sells that data to AI companies desperate for it. In return, participants in the Grass network earn a steady income from the AI companies using their data.
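Grass’s actual protocol is not public at this level of detail, but the core idea, spreading scrape jobs across many residential IPs so that no single address hits a rate limit, can be sketched like this:

```python
# Hedged sketch, not Grass's real implementation: round-robin scrape
# jobs across a pool of node IPs, capping the load per IP so that no
# single address trips a site's rate limiter (the problem a lone AWS
# instance runs into).
import itertools
from collections import Counter

def assign_jobs(urls, node_ips, per_ip_limit):
    """Assign each URL to a node IP, respecting a per-IP job cap."""
    assignments = {}
    pool = itertools.cycle(node_ips)
    load = Counter()
    for url in urls:
        for _ in range(len(node_ips)):
            ip = next(pool)
            if load[ip] < per_ip_limit:
                assignments[url] = ip
                load[ip] += 1
                break
        else:
            raise RuntimeError("all nodes are at their rate limit")
    return assignments

jobs = assign_jobs([f"site-{i}" for i in range(6)],
                   ["ip1", "ip2", "ip3"], per_ip_limit=2)
```

With two million nodes, the effective throughput ceiling is the sum of every node’s individual limit, which is exactly what a centralized scraper cannot replicate.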
Currently, there are no tokens, but future $GRASS tokens might incentivize users to be more willing to download their browser extension (or mobile app). They have already attracted a large number of users through an extremely successful referral campaign.
Perhaps even more important than data is computational power. Did you know that in 2020 and 2021, China invested more money in GPUs than in oil? It’s insane, but it’s just the beginning. Goodbye petrodollar, make way for computecoin.
There are now many GPU DePINs on the market, and they operate roughly as follows:
– On one side, machine learning engineers and companies that need computation.
– On the other side, data centers, idle mining rigs, and hobbyists with idle GPUs/CPUs.
Despite the massive global supply, coordination is lacking. Contacting 10 different data centers to bid for their usage is not easy. A centralized solution would create a rent-seeking intermediary whose incentive is to extract maximum value from each party, but cryptocurrency technology can help.
Cryptocurrencies are excellent at building market layers that efficiently connect buyers and sellers. A code snippet doesn’t need to be accountable to shareholders’ financial interests.
io.net stands out because it introduces some genuinely new technology crucial to AI training: its cluster stack. Traditional clusters physically connect a bunch of GPUs in the same data center so they can train a model together. But what if your hardware is spread across different locations? io.net built cluster middleware on top of Ray (the framework used to train ChatGPT) that can connect GPUs in different locations.
Registering with AWS can take days, while clusters on io.net can be launched permissionlessly in 90 seconds. For these reasons, I can see io.net becoming the hub that other GPU DePINs plug into via its “IO engine,” unlocking built-in clustering and a smooth onboarding experience. All made possible by cryptocurrency technology.
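io.net’s scheduler is proprietary, so the following is only a conceptual sketch of the clustering idea: assemble a “cluster” from geographically scattered GPU nodes, preferring the lowest-latency ones first. Node names, counts, and latencies are invented for illustration.

```python
# Conceptual sketch only -- io.net's actual middleware is not public.
# Greedily pick scattered GPU nodes for a job, lowest latency first.

def form_cluster(nodes, gpus_needed):
    """nodes: list of (node_id, gpu_count, latency_ms) tuples.
    Returns the chosen node ids, or raises if supply is short."""
    chosen, total = [], 0
    for node_id, gpus, latency in sorted(nodes, key=lambda n: n[2]):
        if total >= gpus_needed:
            break
        chosen.append(node_id)
        total += gpus
    if total < gpus_needed:
        raise ValueError("not enough GPUs in the pool")
    return chosen

pool = [("tokyo-1", 8, 90), ("ny-3", 4, 20), ("berlin-2", 8, 45)]
print(form_cluster(pool, 10))  # -> ['ny-3', 'berlin-2']
```

A real scheduler would also weigh inter-node bandwidth, price, and reliability, but the point stands: the market layer, not a sales call with ten data centers, does the coordination.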
You will notice that most ambitious decentralized AI projects (such as Bittensor, Morpheus, Gensyn, Ritual, Sahara) have a clear “computational” requirement – this is where GPU DePINs come in, as decentralized AI needs permissionless computation.
Back to the Bitcoin revelation. What if miners were building AI instead of solving useless math problems? That’s what you get with Bittensor.
Bittensor aims to establish a set of experimental ecosystems, each testing the production of “commodified intelligence.” One ecosystem (called a subnetwork, or SN) may focus on developing language models, another on financial models, and others on speech synthesis, AI detection, or image generation (see the currently active projects).
For the Bittensor network, what you want to do is not important. As long as you can prove that your project is worth funding, incentives will flow. This is the goal of subnetwork owners who register subnetworks and adjust the rules of the game.
The participants in this “game” are called miners: the ML/AI engineers and teams who build models. They are locked in a perpetual evaluation “Thunderdome,” competing with each other to earn the most rewards.
Validators are the other side of the game, responsible for reviewing and scoring the miners’ work. If collusion between validators and miners is detected, they are expelled.
Miners earn more by beating other miners within the subnetwork, which drives the development of AI.
Validators earn more by accurately identifying high-performing and low-performing miners, which maintains the fairness of the subnetwork.
Subnetwork owners earn more when the AI models generated in their subnetwork are more useful than those in other subnetworks, which drives subnetwork owners to optimize their “game”.
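A heavily simplified sketch of that reward flow (the real mechanism, Yuma Consensus, is considerably more involved): validators score miners, the scores are weighted by validator stake, and the epoch’s emission is split pro rata. The numbers and names below are invented for illustration.

```python
# Simplified, illustrative sketch of a Bittensor-style reward split.
# The production mechanism (Yuma Consensus) adds clipping and
# collusion penalties that are omitted here.

def miner_rewards(scores, validator_stakes, emission):
    """scores: {validator: {miner: score in [0, 1]}}.
    Returns {miner: token reward}, pro rata by stake-weighted score."""
    total_stake = sum(validator_stakes.values())
    weighted = {}
    for validator, miner_scores in scores.items():
        weight = validator_stakes[validator] / total_stake
        for miner, score in miner_scores.items():
            weighted[miner] = weighted.get(miner, 0.0) + weight * score
    total = sum(weighted.values())
    return {m: emission * w / total for m, w in weighted.items()}

rewards = miner_rewards(
    scores={"v1": {"m1": 1.0, "m2": 0.0}, "v2": {"m1": 0.5, "m2": 0.5}},
    validator_stakes={"v1": 2.0, "v2": 1.0},
    emission=90.0,
)
```

Note how the incentives line up: a miner maximizes income by outscoring peers, and a validator whose scores diverge from the stake-weighted consensus loses influence.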
You can think of Bittensor as a permanent reward machine for AI development. Emerging ML engineers can try to build something, pitch to VCs, and try to raise funds. Or they can join one of the Bittensor subnetworks as miners, showcase their skills, and earn a large amount of TAO. Which one is easier?
Some top teams are building on the network:
– Nous Research is the king of open source. Their subnetwork reinvents the fine-tuning of open-source LLMs: models are ranked against a continuous stream of synthetic data, which makes the leaderboard hard to game, unlike static benchmarks such as HuggingFace’s.
– Taoshi’s Proprietary Trading Network is essentially an open-source quant firm. They ask ML contributors to build trading algorithms that predict asset price movements. Their API provides quant-grade trading signals to both retail and institutional users, and they are moving quickly toward meaningful revenue.
– Cortex.t, developed by the Corcel team, serves two purposes. First, it incentivizes miners to provide API access to top models such as GPT-4 and Claude 3, ensuring continuous availability for developers. It also provides synthetic data generation, useful for model training and benchmarking (which is why Nous uses it). Check out their tools, Chat and Search.
Bittensor reaffirms the power of incentive structures, all achieved through cryptoeconomics.
Now, let’s take a look at the two aspects of Morpheus:
– Cryptoeconomic structures are building AI (cryptocurrencies help AI).
– AI-enabled applications enable new use cases in cryptocurrencies (AI helps cryptocurrencies).
“Intelligent agents” are AI models trained to interact with smart contracts. They understand the inner workings of all the top DeFi protocols, know where to find yield, where to bridge, and how to spot suspicious contracts. They are the future “automated routers” and, in my opinion, will be how everyone interacts with blockchains in 5-10 years. In fact, once we reach that point, you may not even realize you are using blockchain technology. You will simply tell a chatbot that you want to move some savings into another investment, and everything will happen in the background.
Morpheus embodies the “incentivize them, and they will come” ethos. The goal is a platform where intelligent agents can propagate and thrive, each agent building on the success of the last in a minimally extractive ecosystem.
The token inflation structure highlights the four main contributors to the protocol:
– Code: The builders of agents.
– Community: Building front-end applications and tools to attract new users to the ecosystem.
– Computation: Providing computational power for executing agents.
– Capital: Contributing yield to drive Morpheus’ economic machine.
Each category receives an equal share of the MOR inflation rewards (with a small portion set aside as an emergency fund), incentivizing them to:
– Build the best agents: Builders are rewarded when their agents are consistently used. This is different from providing free plugins to OpenAI, as this way, builders are paid instantly.
– Build the best front-end/tools: Builders are rewarded when their creations are consistently used.
– Provide stable computational power: Providers are rewarded when they lend out their computational power.
– Provide liquidity for the project: Earn their MOR share by maintaining the liquidity of the project.
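The four-way split described above can be sketched as follows. The 4% emergency-fund slice is an assumption for illustration, not necessarily Morpheus’s published schedule.

```python
# Illustrative sketch of the MOR emission split across the four
# contributor buckets. The 4% emergency-fund figure is an assumption.

def split_emissions(epoch_emission, emergency_fund_pct=0.04):
    """Reserve a small emergency-fund slice, then split the rest
    equally across Code, Community, Compute, and Capital."""
    reserved = epoch_emission * emergency_fund_pct
    per_bucket = (epoch_emission - reserved) / 4
    return {
        "code": per_bucket,       # agent builders
        "community": per_bucket,  # front-ends and tools
        "compute": per_bucket,    # hardware providers
        "capital": per_bucket,    # yield/liquidity providers
        "emergency_fund": reserved,
    }

shares = split_emissions(1000.0)
```

The equal split is the interesting design choice: no single contributor class can dominate emissions, so builders, operators, and capital all stay at the table.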
Despite the presence of many other AI/intelligent agent projects, Morpheus’ token economic structure is particularly clear and effective in designing incentive mechanisms.
These intelligent agents are the ultimate example of how AI can remove friction from crypto applications. The user experience of dApps is notoriously bad (despite significant improvements in recent years), and the rise of LLMs has fired up every Web2 and Web3 entrepreneur. While many projects are simply chasing the trend, outstanding ones like Morpheus and Wayfinder (see the demonstration below) show how simple on-chain transactions can become.
(See related tweets)
When all of these systems come together, the interactions between them may look something like this. Note that this is an extremely simplified view.
Remember our two broad categories of “crypto x AI”:
– Cryptocurrencies help AI.
– AI helps cryptocurrencies.
In this article, we mainly focus on the first category. As we can see, a well-designed token system can lay the foundation for the success of the entire ecosystem.
The DePIN framework can help kickstart the market, and creative token incentive structures can coordinate open-source projects towards previously unattainable goals. Yes, there are several other legitimate intersections that I did not cover due to space limitations:
– Decentralized storage.
– Trusted Execution Environment (TEE).
– Real-time data access (retrieval-augmented generation, RAG).
– Zero-knowledge x machine learning for inference/origin verification.
When determining whether a new project is truly valuable, ask yourself:
– If it’s a derivative of another mature project, are its differences enough to make it stand out?
– Is it just a repackaging of open-source software?
– Does this project truly benefit from cryptocurrency technology, or is cryptocurrency technology being forced in?
– Do we really need 100 cryptocurrency projects like HuggingFace (a popular open-source machine learning platform)?
In this category, I personally see more noise than genuine projects, but there are indeed some cool use cases. For example, AI models can eliminate barriers in the user experience of crypto applications, especially intelligent agents. Here are some interesting categories to watch in the field of AI-supported crypto applications:
– Enhanced intent systems – Automating cross-chain operations.
– Wallet infrastructure.
– Real-time alert infrastructure for users and applications.
If it’s just a “chatbot with tokens,” it’s a garbage project in my opinion. Please stop hyping these projects to maintain my sanity. Also:
– Adding AI to your project won’t magically make your failed application/chain/tool fit the market.
– No one will play a bad game just because it has AI characters.
– Adding the label “AI” to your project won’t make it interesting.
Despite the noise, there are serious teams working hard to realize the vision of “decentralized AI,” and that’s worth striving for.
In addition to incentivizing open-source model development, decentralized data networks open new doors for emerging AI developers. When most of OpenAI’s competitors cannot strike large-scale data deals with Reddit, Tumblr, or WordPress, decentralized data scraping can bridge the gap.
A company’s computing power may never exceed the total computing power of all other companies in the world, but with a decentralized GPU network, anyone has the ability to compete with top companies. All you need is a crypto wallet.
We are at a crossroads today. If we focus on the truly valuable “crypto x AI” projects, we have the ability to decentralize the entire AI stack.
The vision of cryptocurrency is to use the power of cryptography to create hard money that no one can interfere with. Yet just as this emerging technology is gaining ground, an even more formidable challenger has emerged.
In the worst-case scenario, centralized AI will not only control your finances but also inject bias into every piece of information we encounter in our daily lives, enriching a handful of tech leaders in a self-perpetuating loop of data collection, fine-tuning, and model deployment.
It will know you better than you know yourself. It knows which buttons to press to make you laugh more, get angrier, and consume more. Despite appearances, it is not accountable to you.
Cryptocurrency technology was conceived as a way to coordinate decentralized individuals toward a common goal, originally in opposition to central banks. Now that same ability faces an even more powerful adversary: centralized AI. This time, time is of the essence, and we need to act quickly to resist AI’s centralizing pull.