AI+Web3 in Business: The Necessity
In this article, we will discuss the combination of AI and Web3 in the business world and the necessity and challenges of decentralized compute services. We will also focus on analyzing the key information about IO.NET, a representative project in decentralized AI compute. This article was written by Alex Xu of Mint Ventures and compiled by PANews.
Commercial Logic: The Combination of AI and Web3
2023: The New “Miracle Year” Created by AI
The Combination of AI and Crypto
Example A: Solving Randomness, AI Agents Based on Cryptoeconomics
Example B: Resource Shaping Through Token Incentives
Example C: Open Source Code, Introducing ZK to Distinguish Between Human and Machine
The Business Necessity of Decentralized Computing
Decentralized AI Computing Platform: IO.NET
Project Positioning
Product Mechanism and Business Data
Team Background and Financing Situation
Valuation Calculation
In the previous article, it was mentioned that this cryptocurrency bull market cycle lacks influential new business and asset narratives compared with previous cycles, and that AI is one of the rare new narratives in Web3 this cycle. In this article, the author uses this year's hot AI project IO.NET as a case study to think through two questions:
The necessity of AI + Web3 in business
The necessity and challenges of decentralized compute services
Furthermore, the author will lay out the key information about IO.NET, a representative project in decentralized AI compute, including its product logic, competitive landscape, and project background, and offer a preliminary valuation estimate for the project.
The section of this article about the combination of AI and Web3 is partially inspired by “The Real Merge” written by Delphi Digital researcher Michael Rinko. Some viewpoints in this article are based on the digestion and citation of the original article. Readers are recommended to read the original text.
This article represents the author’s phased thinking at the time of publication and may be subject to change in the future. The viewpoints are highly subjective and may contain errors in facts, data, and logical reasoning. Please do not use them as investment references. Criticism and discussion from peers are welcome. The following is the main text.
2023: The New “Miracle Year” Created by AI
Looking back at the history of human development, once technology achieves breakthrough progress, it will lead to earth-shaking changes from individual daily life to various industry patterns, and even the entire human civilization.
There are two important years in human history, namely 1666 and 1905, which are now known as the two “miracle years” in the history of technology.
1666 was considered a miracle year because Newton’s scientific achievements emerged in a concentrated manner that year. He pioneered the field of optics, established the branch of calculus in mathematics, and formulated the law of gravity, which became the foundation of modern natural science. Each of these contributions laid the foundation for the future development of human science and greatly accelerated the overall scientific progress.
The second miracle year was 1905, when the 26-year-old Einstein published four papers in quick succession in Annalen der Physik (Annals of Physics), covering the photoelectric effect (laying the foundation for quantum mechanics), Brownian motion (which became an important reference for analyzing stochastic processes), special relativity, and the mass-energy equation (the famous formula E=mc^2). In the judgment of later generations, each of these four papers was above the average level of a Nobel Prize in Physics (Einstein himself won the Nobel Prize for the photoelectric-effect paper), and together they pushed the course of human civilization several steps forward.
And 2023, thanks to the emergence of ChatGPT, is very likely to be remembered as another "miracle year."
We regard 2023 as a "miracle year" in the history of human technology not only because of GPT's great progress in natural language understanding and generation, but also because GPT's evolution has revealed a scaling law for large language models: by expanding model parameters and training data, a model's capabilities can be improved exponentially, and this process shows no bottleneck so far (as long as there is enough computing power).
This capability goes far beyond understanding language and generating conversations and can be widely used in various technological fields. Taking the application of large language models in the field of biology as an example:
In 2018, Nobel laureate Frances Arnold said at the award ceremony: "Today we can read, write, and edit any DNA sequence for all practical purposes, but we cannot yet compose one."
Just five years later, in 2023, researchers from Stanford University and Salesforce Research in Silicon Valley published a paper in Nature Biotechnology. Using a large language model fine-tuned from GPT-3, they generated 1 million new proteins from scratch and identified two proteins with completely different structures that both exhibited antibacterial properties, making them potential alternatives to antibiotics in the fight against bacterial resistance. In other words, with the help of AI, the bottleneck in protein "creation" has been broken.
Before that, the AI algorithm AlphaFold had predicted the structures of nearly all of the 214 million proteins known on Earth within 18 months, a result hundreds of times greater than the cumulative output of all human structural biologists before it.
As various AI-based models are applied across fields, from hard sciences such as biotechnology, materials science, and pharmaceutical research to humanities such as law and art, earth-shaking changes will follow, and 2023 is the beginning of all this.
We all know that humanity's capacity for wealth creation has grown exponentially over the past century, and the rapid maturing of AI technology will further accelerate this process.

In addition, Worldcoin has also recently open-sourced the code for its iris-scanning hardware, the Orb, providing assurance for the security and privacy of users' biometric data.
Overall, the advantages of the cryptocurrency economy, including the determinism and resource circulation brought by code and cryptography, the advantages of non-permission and token mechanisms, and the trustless attributes based on open-source code and public ledgers, have become an important potential solution for the challenges that human society faces in the AI era.
Furthermore, the most urgent and intense demand on the commercial side is AI products' extreme hunger for computing resources, centered on chips and computing power. This is also the main reason why decentralized computing power projects have become the dominant AI sub-sector in this bull market cycle.
Further reading:
"DePIN + AI Wave: 3 GPU Mining Projects Retail Investors Can Participate in, Building a Decentralized Computing Power Network"
AI requires a large amount of computing resources, whether it is for training models or inference.
In the practice of training large language models, one fact has been confirmed: as long as the scale of parameters and training data is large enough, large language models exhibit new capabilities that were not present before. Each generation of GPT has improved exponentially over the previous one, and behind this lies exponential growth in the computation used for model training.
Research by DeepMind and Stanford University has shown that for different large language models facing different tasks (arithmetic, Persian question answering, natural language understanding, etc.), as long as the scale of the model parameters is increased (and with it the amount of training computation), performance on any of these tasks remains close to random guessing until training compute reaches about 10^22 FLOPs (FLOPs here denotes the total number of floating-point operations, a measure of the amount of computation performed); but once the scale crosses that critical value, task performance improves dramatically, regardless of which language model is used.
Source: Emergent Abilities of Large Language Models
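To give an intuitive sense of scale, the following back-of-the-envelope Python calculation converts a 10^22-FLOP training budget into GPU time. The hardware figures (an NVIDIA A100's roughly 312 TFLOP/s peak and a 40% sustained utilization) are illustrative assumptions, not numbers taken from the article or the cited research.

```python
# Rough conversion of a 10^22-FLOP training budget into GPU time.
# Hardware numbers below are illustrative assumptions, not figures from the article.

TOTAL_FLOPS = 1e22            # training compute around the "emergence" threshold
PEAK_FLOPS_PER_GPU = 312e12   # assumed A100 BF16 peak, in FLOP per second
UTILIZATION = 0.40            # assumed sustained fraction of peak throughput

seconds_on_one_gpu = TOTAL_FLOPS / (PEAK_FLOPS_PER_GPU * UTILIZATION)
days_on_one_gpu = seconds_on_one_gpu / 86400

print(f"One GPU:    ~{days_on_one_gpu:,.0f} days")        # roughly 930 days
print(f"1,000 GPUs: ~{days_on_one_gpu / 1000:.1f} days")  # under a day
```

Under these assumptions, a single GPU would need roughly two and a half years, while a thousand-GPU cluster finishes in under a day, which is why access to large, coordinated pools of chips matters so much.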
It is precisely this rule of "great computing power produces miracles," and its verification in practice, that prompted OpenAI founder Sam Altman to propose raising $7 trillion to build an advanced chip factory more than ten times the size of TSMC (this part estimated to cost $1.5 trillion) and to use the remaining funds for chip production and model training.
In addition to the computational power required for training AI models, the inference process of the model itself also requires significant computational power (although less than training). Therefore, the thirst for chips and computing power has become the norm for participants in the AI race.
Compared to centralized AI computing providers such as Amazon Web Services, Google Cloud Platform, and Microsoft Azure, the main value propositions of decentralized AI computing include:
Accessibility: It usually takes several weeks to obtain access permission to the computational chips through cloud services such as AWS, GCP, or Azure, and popular GPU models are often out of stock. In addition, to obtain computational power, consumers often need to sign long-term, inflexible contracts with these large companies. Distributed computing power platforms can provide flexible hardware options and greater accessibility.
Lower pricing: By using idle chips and adding token subsidies from network protocols to chip and computing power suppliers, distributed computing power networks may be able to provide more affordable computing power.
Censorship resistance: At present, the supply of cutting-edge computational chips is monopolized by large technology companies, and governments, led by the United States, are stepping up their scrutiny of AI computing services. The ability to obtain AI computing power in a decentralized, flexible, and unrestricted manner is becoming an explicit demand, and it is also a core value proposition of Web3-based AI computing service platforms.
If fossil energy is the lifeblood of the industrial age, then computing power may be the lifeblood of the new digital age opened by AI. The supply of computing power will become the infrastructure of the AI era. Just as stablecoins have become a thriving branch of fiat currency in the Web3 era, will the decentralized computing power market become a branch of the rapidly growing AI computing power market?
Since this is still a relatively early market, everything remains to be observed. However, the following factors may stimulate the narrative or adoption of decentralized computing power:
Continued supply shortage of GPUs. The continued shortage of GPU supply may drive some developers to try decentralized computing power platforms.
Expansion of regulation. Obtaining AI computing power services from large cloud platforms such as AWS usually requires KYC and multiple layers of review. This may promote the adoption of decentralized computing power platforms, especially in restricted and sanctioned regions.
Stimulus of token prices. The price increase of tokens during a bull market cycle increases the subsidy value of the platform to the GPU supply side, thereby attracting more suppliers to enter the market, increasing the market size, and reducing the actual purchase price for consumers.
However, the challenges of decentralized computing power platforms are also quite apparent:
Technical and engineering challenges.
Work verification problem: Because of the hierarchical structure of deep learning models, where the output of each layer serves as the input of the next, verifying that a computation was performed correctly requires re-executing all of the preceding work; results cannot be checked easily or efficiently. To solve this, distributed computing platforms need to develop new algorithms or use approximate verification techniques that provide probabilistic guarantees of result correctness rather than absolute certainty (a simplified sketch of such probabilistic spot-checking appears after this list of technical challenges).
Parallelization challenge: Distributed computing platforms aggregate long-tail chip supplies, which inevitably limits the computing power that a single device can provide. It is almost impossible for a single chip supplier to independently complete AI model training or inference tasks within a short period of time. Therefore, tasks must be decomposed and assigned through parallelization methods to shorten the overall completion time. Parallelization faces a series of problems such as task decomposition (especially for complex deep learning tasks), data dependencies, and additional communication costs between devices.
Privacy protection problem: How to ensure that the purchaser’s data and models are not exposed to the recipient of the task?
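As referenced above, here is a minimal, illustrative sketch of probabilistic spot-checking: the worker commits to a hash of every layer's output, and a verifier re-executes a few randomly chosen layers and compares the results. This is not IO.NET's actual verification mechanism; the toy "layer," the hashing scheme, and the sampling rule are all assumptions made for illustration.

```python
# Minimal sketch (not IO.NET's protocol) of probabilistic spot-checking:
# the worker commits to every layer's output, and the verifier re-executes
# a few randomly chosen layers and compares hashes. Checking k of n layers
# misses a single falsified layer with probability (n - k) / n.

import hashlib
import random
import numpy as np

def commit(array: np.ndarray) -> str:
    """Hash a layer's output so the worker can commit to it up front."""
    return hashlib.sha256(array.tobytes()).hexdigest()

def run_layer(weights: np.ndarray, x: np.ndarray) -> np.ndarray:
    """Toy 'layer': a matrix multiply followed by ReLU."""
    return np.maximum(weights @ x, 0.0)

def spot_check(weights_per_layer, layer_inputs, claimed_hashes, k=2) -> bool:
    """Re-execute k random layers and compare against the worker's commitments."""
    n = len(weights_per_layer)
    for i in random.sample(range(n), k):
        recomputed = run_layer(weights_per_layer[i], layer_inputs[i])
        if commit(recomputed) != claimed_hashes[i]:
            return False  # mismatch: reject the worker's result
    return True  # all sampled layers match; accept with probabilistic confidence

# Example: an honest worker runs a 4-layer toy model and commits to each output.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((8, 8)) for _ in range(4)]
x = rng.standard_normal(8)
inputs, hashes = [], []
for w in weights:
    inputs.append(x)
    x = run_layer(w, x)
    hashes.append(commit(x))

print(spot_check(weights, inputs, hashes, k=2))  # True for an honest worker
```

The trade-off is explicit: checking k of n layers catches a single falsified layer only with probability k/n, so a platform must balance verification cost against the economic penalty (for example, stake slashing) that makes cheating unattractive.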
Regulatory compliance challenges.
Due to the non-permissioned nature of decentralized computing platforms and the bilateral market of supply and procurement, they can be attractive to some customers. However, they may also become the target of government regulation as AI regulatory standards become more complete. In addition, some GPU suppliers may be concerned about whether the computing resources they rent out are provided to businesses or individuals subject to sanctions.
In general, the consumers of decentralized computing power platforms are mostly professional developers or small and medium-sized organizations. Unlike crypto investors who purchase cryptocurrencies and NFTs, these users have higher requirements for the stability and continuity of the services provided by the protocols, and price may not be their main decision-making factor. Currently, decentralized computing power platforms still have a long way to go to gain the recognition of these users.
Next, we will analyze and evaluate the project information of the new decentralized computing power project IO.NET in this cycle and estimate its potential valuation after its listing, based on similar AI projects and decentralized computing power projects in the current market.
Project positioning:
IO.NET is a decentralized computing network that builds a bilateral market centered around chips. The supply side consists of distributed computing power from GPUs (mainly GPUs, but also CPUs and Apple’s iGPUs) distributed globally, while the demand side consists of artificial intelligence engineers who want to complete AI model training or inference tasks.
On the official website of IO.NET, it is written as follows:
Its mission is to integrate millions of GPUs into its DePIN network.
Compared to existing cloud AI computing service providers, its main selling points are:
Flexible combination: AI engineers can freely select and combine the chips they need to form “clusters” to complete their computational tasks.
Rapid deployment: No need for weeks of approval and waiting (as is the case with centralized vendors such as AWS), deployment can be completed within seconds to start the task.
Low-cost service: The cost of the service is 90% lower than mainstream vendors.
In addition, IO.NET plans to launch services such as an AI model store in the future.
Product mechanism and business information:
Product mechanism and deployment experience:
Similar to Amazon Web Services, Google Cloud, and Alibaba Cloud, the computing service provided by IO.NET is called IO Cloud. IO Cloud is a distributed, decentralized chip network that can execute Python-based machine learning code and run AI and machine learning workloads.
The basic business module of IO Cloud is called Clusters. Clusters are GPU clusters that can self-coordinate to complete computational tasks. AI engineers can customize the clusters they want based on their needs.
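For context on what "Python-based machine learning code" on such a cluster might look like, below is a minimal, generic sketch of a multi-GPU training script using PyTorch's DistributedDataParallel. It is not IO.NET-specific code; the model, the random data, and the environment variables expected by the launcher (RANK, WORLD_SIZE, MASTER_ADDR, MASTER_PORT, LOCAL_RANK) are assumptions for illustration.

```python
# Generic sketch of a multi-GPU training job; not IO.NET-specific.
# Assumes a launcher (e.g. torchrun) sets the usual distributed env variables.

import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # Each worker in the cluster joins the same process group.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ.get("LOCAL_RANK", 0))
    torch.cuda.set_device(local_rank)
    device = f"cuda:{local_rank}"

    # A toy model; in practice this would be the user's own model definition.
    model = torch.nn.Linear(1024, 10).to(device)
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

    for step in range(100):
        x = torch.randn(32, 1024, device=device)          # stand-in for real batches
        y = torch.randint(0, 10, (32,), device=device)
        loss = torch.nn.functional.cross_entropy(model(x), y)
        optimizer.zero_grad()
        loss.backward()            # gradients are all-reduced across the cluster
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

A launcher such as torchrun would typically start one copy of this script per GPU in the cluster, and the cluster's job is to provide the coordinated hardware on which such scripts run.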
IO.NET’s product interface is very user-friendly. If you want to deploy your own chip cluster to complete AI computing tasks, you can start configuring your desired chip cluster after entering its Clusters product page.
Page information:
https://cloud.io.net/cloud/clusters/create-cluster