Author: Haotian
After reviewing several popular projects in the Crypto+AI track over the past month, I noticed three notable trend shifts. Brief project introductions, highlights, and personal comments follow in the appendix:
1) Project technical paths are becoming more pragmatic, focusing on performance data rather than pure conceptual packaging;
2) Vertical niche scenarios are becoming the expansion focus, with generalized AI giving way to specialized AI;
3) Capital cares more about business-model validation: projects with real cash flow are clearly favored.
Appendix: Project Introduction, Highlight Analysis, Personal Comments:
1) @yupp_ai
Project Introduction: A decentralized AI model evaluation platform that completed a $33 million seed round in June, led by a16z, with Jeff Dean participating as an investor.
Highlight Analysis: It applies the strengths of human subjective judgment to the weak spots of AI evaluation. By crowdsourcing ratings for 500+ large models, it converts user feedback into cash (1,000 points = $1), which has attracted data purchases from companies like OpenAI; the cash flow is real.
Personal Comments: A project with a relatively clear business model rather than a pure cash-burning play. However, preventing sybil attacks is a major challenge, and the anti-sybil algorithm will need continuous optimization. Judging by the $33 million funding scale, capital clearly favors projects with verified monetization.
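To make the sybil problem concrete, here is a minimal sketch of one common mitigation: reputation-weighted ratings, where freshly created accounts carry almost no weight. Every field and threshold here is a hypothetical illustration, not Yupp's actual algorithm:

```python
from dataclasses import dataclass

@dataclass
class Rater:
    account_age_days: int   # older accounts are harder to mass-produce
    audited_ratings: int    # past ratings that survived quality audits
    score: float            # this rater's 1-5 rating for a model answer

def reputation(r: Rater) -> float:
    """Hypothetical reputation: grows with age and audited history, capped at 1.0."""
    age_factor = min(r.account_age_days / 90, 1.0)
    history_factor = min(r.audited_ratings / 100, 1.0)
    return age_factor * history_factor

def weighted_rating(raters: list[Rater]) -> float:
    """Reputation-weighted mean; a swarm of new accounts barely moves it."""
    total_weight = sum(reputation(r) for r in raters)
    if total_weight == 0:
        return 0.0
    return sum(reputation(r) * r.score for r in raters) / total_weight

# A sybil swarm of 50 day-old accounts vs. 5 established raters:
sybils = [Rater(account_age_days=1, audited_ratings=0, score=5.0)] * 50
honest = [Rater(account_age_days=365, audited_ratings=200, score=2.0)] * 5
print(weighted_rating(sybils + honest))  # 2.0 -- the swarm is neutralized
```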
2) @Gradient_HQ
Project Introduction: A decentralized AI computing network that completed a $10 million seed round in June, led by Pantera Capital and Multicoin Capital.
Highlight Analysis: Through its Sentry Nodes browser plugin, it has already built some market consensus in the Solana DePIN field. The team includes former Helium members, and the newly launched Lattica data transmission protocol and Parallax inference engine are substantive explorations of edge computing and data verifiability, reducing latency by 40% and supporting heterogeneous device access.
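For a feel of what "heterogeneous device access" means in practice, here is a minimal, hypothetical sketch of latency-aware request routing across uneven edge nodes; the node fields and numbers are invented, and this is not Lattica or Parallax code:

```python
from dataclasses import dataclass

@dataclass
class EdgeNode:
    node_id: str
    latency_ms: float     # measured round-trip latency to the client
    free_vram_gb: float   # remaining GPU memory on the device
    online: bool

def pick_node(nodes: list[EdgeNode], model_vram_gb: float) -> EdgeNode | None:
    """Route an inference request to the lowest-latency node that can fit the model."""
    candidates = [n for n in nodes if n.online and n.free_vram_gb >= model_vram_gb]
    return min(candidates, key=lambda n: n.latency_ms, default=None)

nodes = [
    EdgeNode("laptop-eu", latency_ms=18.0, free_vram_gb=6.0, online=True),
    EdgeNode("phone-us", latency_ms=42.0, free_vram_gb=3.0, online=True),
    EdgeNode("desktop-asia", latency_ms=95.0, free_vram_gb=24.0, online=False),
]
print(pick_node(nodes, model_vram_gb=4.0))  # -> laptop-eu: closest node with room
```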
Personal Comments: The direction is right, squarely capturing the trend of AI workloads "sinking" to local, edge devices. However, it still faces efficiency challenges on complex tasks compared with centralized platforms, and edge node stability remains an issue. That said, edge computing is a new demand born of web2 AI's intensifying competition, and it plays to web3 AI's strength in distributed frameworks, so I look forward to products driven by real-world performance.
3) @PublicAI_
Project Introduction: A decentralized AI data infrastructure platform that uses token incentives to encourage global users to contribute data across domains (medical, autonomous driving, voice, etc.), with cumulative revenue exceeding $14 million and a contributor network at the million-user scale.
Highlight Analysis: Technically, it combines ZK verification with a BFT consensus algorithm to ensure data quality and uses Amazon Nitro Enclaves privacy computing to meet compliance requirements. Interestingly, it has also launched HeadCap, a brainwave-collection device, expanding from software into hardware. The economic model is well designed: users earn $16 plus 500,000 points for 10 hours of voice annotation, and enterprises' data-service subscription costs drop by 45%.
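As a rough illustration of how a BFT-style quorum can gate crowdsourced data quality, here is a minimal sketch: a label is accepted only when at least 2f+1 annotators agree, tolerating up to f faulty or malicious ones. The code is hypothetical, not PublicAI's implementation; only the $16 / 500,000-point payout figures come from the text above:

```python
from collections import Counter

def accept_label(annotations: list[str], f: int) -> str | None:
    """Accept a label only if at least 2f+1 annotators agree,
    tolerating up to f faulty annotators (BFT-style quorum)."""
    label, count = Counter(annotations).most_common(1)[0]
    return label if count >= 2 * f + 1 else None

# 7 annotators, tolerate up to f=2 bad ones -> need 5 matching labels:
print(accept_label(["cat"] * 5 + ["dog"] * 2, f=2))  # "cat" (quorum met)
print(accept_label(["cat"] * 4 + ["dog"] * 3, f=2))  # None (no quorum)

# Payout arithmetic from the figures above: 10 hours of voice annotation
# pays $16 plus 500,000 points, i.e. $1.60/hour in cash before points.
hours, cash_usd = 10, 16
print(cash_usd / hours)  # 1.6
```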
Personal Comments: The project's greatest value lies in addressing real needs in AI data annotation, especially in fields like medical and autonomous driving that demand high data quality and strict compliance. However, its 20% error rate is still higher than traditional platforms' 10%, and fluctuating data quality is an ongoing issue to resolve. The brain-computer interface direction has interesting potential, but implementation will be hard.
4) @sparkchainai
Project Introduction: A distributed computing network on the Solana chain that completed $10.8 million in funding in June, led by OakStone Ventures.
Highlight Analysis: It uses dynamic sharding to aggregate idle GPU resources, supporting inference for large models such as Llama3-405B at costs 40% lower than AWS. The tokenized data-trading design is interesting: it turns computing-power contributors directly into stakeholders, incentivizing broader network participation.
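"Dynamic sharding" in this context usually means splitting a model's layers across whatever GPUs happen to be free. Below is a minimal, hypothetical sketch (toy numbers, not sparkchainai's actual scheduler) that greedily assigns contiguous layer blocks by available VRAM:

```python
def shard_layers(num_layers: int, layer_gb: float, gpus: dict[str, float]) -> dict[str, range]:
    """Greedily assign contiguous blocks of layers to GPUs by free VRAM.
    Returns {gpu_id: layer_range}; raises if aggregate capacity is short."""
    plan, start = {}, 0
    for gpu_id, free_gb in gpus.items():
        fit = int(free_gb // layer_gb)          # how many layers this GPU can hold
        end = min(start + fit, num_layers)
        if end > start:
            plan[gpu_id] = range(start, end)
        start = end
        if start == num_layers:
            return plan
    raise RuntimeError("not enough aggregate VRAM for the model")

# Toy numbers only: a 126-layer model at ~1.9 GB/layer over idle mixed GPUs.
print(shard_layers(126, 1.9, {"rtx4090-a": 24, "rtx4090-b": 24, "a100-x": 80,
                              "rtx3090-a": 24, "rtx3090-b": 24, "a100-y": 80}))
```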
Personal Comments: A typical "aggregate idle resources" model, and the logic holds. However, the 15% cross-chain verification error rate is genuinely high, and technical stability needs further work. Still, it has an edge in scenarios with looser real-time requirements, such as 3D rendering. The key is whether the error rate can be brought down; otherwise even the best business model will be dragged down by technical issues.
5) @olaxbt_terminal
Project Introduction: An AI-driven cryptocurrency high-frequency trading platform that completed a $3.38 million seed round in June, led by @ambergroup_io.
Highlight Analysis: Its MCP technology dynamically optimizes trading paths to reduce slippage, with a measured efficiency improvement of 30%. Riding the #AgentFi trend, it has found an entry point in the still largely vacant DeFi quantitative-trading niche, filling a real market demand.
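To ground the slippage discussion, here is a minimal sketch of how a router might compare expected slippage across two constant-product (x*y=k) pools and pick the cheaper path; the pools and reserves are invented, and this is not olaxbt's MCP logic:

```python
def amm_out(amount_in: float, reserve_in: float, reserve_out: float, fee: float = 0.003) -> float:
    """Output of a constant-product (x*y=k) swap after the pool fee."""
    amount_in *= 1 - fee
    return reserve_out * amount_in / (reserve_in + amount_in)

def slippage(amount_in: float, reserve_in: float, reserve_out: float) -> float:
    """Relative shortfall vs. the no-impact spot price reserve_out/reserve_in."""
    ideal = amount_in * reserve_out / reserve_in
    return 1 - amm_out(amount_in, reserve_in, reserve_out) / ideal

# Two hypothetical ETH/USDC pools; deeper liquidity means less price impact.
pools = {"pool-deep": (10_000, 30_000_000), "pool-thin": (500, 1_500_000)}
trade = 50  # sell 50 ETH
for name, (r_in, r_out) in pools.items():
    print(name, f"slippage={slippage(trade, r_in, r_out):.2%}")
print("route via:", min(pools, key=lambda p: slippage(trade, *pools[p])))
```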
Personal Comments: The direction is correct; DeFi does need smarter trading tools. However, high-frequency trading demands extremely low latency and high accuracy, and the real-time coordination between AI prediction and on-chain execution still needs to be proven. MEV attacks also pose a significant risk, so technical protections must keep pace.