Near Protocol to Build Record 1.4T Parameter Open-Source AI Model
Image Source: ChatGPT-4o
Near Protocol has announced plans to build the world’s largest open-source artificial intelligence model: a 1.4 trillion-parameter system unveiled during the Redacted conference in Bangkok, Thailand. At roughly 3.5 times the size of Meta’s largest open-source Llama model (Llama 3.1, at 405 billion parameters), the project represents a pioneering step in the decentralized AI landscape.
A Crowdsourced Approach to AI Development
Near Protocol’s ambitious project will rely on crowdsourced research and development from a vast community of contributors on the new Near AI Research hub. Beginning Nov. 10, participants start by training a smaller 500-million-parameter model; only top contributors advance, working on progressively larger and more complex models across seven stages. To protect privacy and motivate contributors, Near will use encrypted Trusted Execution Environments (TEEs), secure enclaves in which participants can earn rewards as the model develops.
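To make the staged structure concrete, here is a minimal Python sketch of how such a contributor ladder could work. The geometric scaling of stage sizes and the top-10% advancement rule are illustrative assumptions, not published details of Near’s program.

```python
# Hypothetical sketch of the seven-stage contributor ladder described above.
# The scaling schedule and advancement rule are illustrative assumptions.

def stage_sizes(start=500e6, end=1.4e12, stages=7):
    """Interpolate model sizes geometrically from 500M to 1.4T parameters."""
    ratio = (end / start) ** (1 / (stages - 1))
    return [start * ratio**i for i in range(stages)]

def advance(contributors, scores, keep_fraction=0.1):
    """Keep only the top-scoring contributors for the next stage (assumed rule)."""
    ranked = sorted(contributors, key=lambda c: scores[c], reverse=True)
    return ranked[: max(1, int(len(ranked) * keep_fraction))]

if __name__ == "__main__":
    for i, size in enumerate(stage_sizes(), start=1):
        print(f"Stage {i}: ~{size / 1e9:.1f}B parameters")
```

Under this assumed schedule, each stage roughly quadruples the model size, which matches the article’s framing of progressively larger and more complex models.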
Funding and Monetization Strategy
The substantial costs associated with training such a large AI model—estimated around $160 million—will be funded through token sales. Near Protocol’s co-founder, Illia Polosukhin, highlighted the sustainable business model supporting this project at the Redacted conference, explaining, “Tokenholders get repaid from all the inferences that happen when this model is used...people can actually reinvest back into the next model as well.”
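The quote implies a simple revenue loop: inference fees accrue to a pool and are paid out to tokenholders in proportion to their stake, who can then reinvest in the next model. The sketch below illustrates such a pro-rata split; the balances, fee amounts, and flat pro-rata rule are hypothetical, since Near has not published the exact mechanism.

```python
# Illustrative sketch of the token-funded revenue loop Polosukhin describes.
# All figures and the pro-rata split are hypothetical.

def distribute_inference_fees(fee_pool: float, holdings: dict) -> dict:
    """Split a pool of inference fees across tokenholders pro rata."""
    total = sum(holdings.values())
    return {holder: fee_pool * stake / total for holder, stake in holdings.items()}

holdings = {"alice": 400_000, "bob": 100_000}   # hypothetical token balances
payouts = distribute_inference_fees(50_000.0, holdings)
print(payouts)  # {'alice': 40000.0, 'bob': 10000.0}
```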
This self-sustaining approach is designed to fund continuous development, letting contributors and tokenholders alike participate in the project’s long-term vision.
A Technical Feat: Distributed Computing for AI
Training a model of this magnitude presents significant challenges, particularly in computing power. Near AI co-founder Alex Skidanov, previously associated with OpenAI, acknowledged that assembling a decentralized network of GPUs for large-scale distributed training is a major hurdle: traditional distributed training depends on high-speed interconnects between machines, which a decentralized network spread across the public internet cannot provide. However, Skidanov pointed to promising research from DeepMind suggesting that these challenges may soon have solutions.
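The DeepMind research Skidanov alludes to is likely in the vein of DiLoCo (Distributed Low-Communication training), which reduces synchronization frequency so that slow links matter less: each worker takes many local optimization steps, and parameters are averaged only occasionally. The NumPy toy below (least-squares regression across four simulated workers) sketches that idea under those assumptions; it is not Near’s or DeepMind’s actual method.

```python
# Toy illustration of low-communication distributed training:
# many local steps per worker, with infrequent parameter averaging.

import numpy as np

rng = np.random.default_rng(0)
X, y = rng.normal(size=(256, 8)), rng.normal(size=256)

def local_steps(w, Xs, ys, steps=50, lr=0.01):
    """Run many SGD steps on a worker's local shard before any sync."""
    for _ in range(steps):
        grad = Xs.T @ (Xs @ w - ys) / len(ys)
        w = w - lr * grad
    return w

w_global = np.zeros(8)
shards = np.array_split(np.arange(256), 4)  # four simulated workers
for _ in range(10):                         # one sync per round, not per step
    locals_ = [local_steps(w_global.copy(), X[s], y[s]) for s in shards]
    w_global = np.mean(locals_, axis=0)     # infrequent parameter averaging
print("final loss:", np.mean((X @ w_global - y) ** 2))
```

Because workers communicate once per round rather than once per gradient step, bandwidth requirements drop by orders of magnitude, which is what makes this family of methods attractive for training over a decentralized GPU network.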
A Push for Decentralized AI in a Centralized World
Polosukhin emphasized the need for decentralized AI, arguing that centralized AI poses a risk of monopolizing control over the technology. “If AI is controlled by one company, we effectively are going to do whatever that company says,” he stated. This philosophy aligns with the principles of Web3, which seeks to avoid single-entity control over crucial technologies.
Adding to this perspective, guest speaker Edward Snowden described a future where centralized AI could transform society into a surveillance state. He advocated for digital sovereignty, emphasizing the need for systems “enforced through math” to protect individual rights online.
What This Means
Near Protocol’s initiative to build a 1.4 trillion-parameter AI model showcases the potential of decentralized technology to challenge the dominance of centralized AI. By leveraging blockchain’s core principles of decentralization and community-driven development, Near aims to prevent any single entity from controlling AI’s future. If successful, this project could set a precedent for collaborative AI development, empowering diverse contributors to help shape the direction of AI innovation.
Editor’s Note: This article was created by Alicia Shapiro, CMO of AiNews.com, with writing, image, and idea-generation support from ChatGPT, an AI assistant. However, the final perspective and editorial choices are solely Alicia Shapiro’s. Special thanks to ChatGPT for assistance with research and editorial support in crafting this article.