With the relaunch of Subnet 9 on the Bittensor network — think of a subnet as a mobile app on Bittensor's app store — Macrocosmos is making the first significant move toward a more democratic future for artificial intelligence. The key to this milestone lies in the subnet's new acronym, IOTA: Incentivized Orchestrated Training Architecture, a framework that enables anyone with a graphics processing unit (GPU), regardless of its size, to help train advanced AI models.
Macrocosmos' breakthrough is built on a novel "swarm" pre-training strategy that addresses critical challenges in data transfer and model compression, as detailed in the white paper the company released on Friday. At its heart is a vision that reimagines how intelligence is constructed and who participates in that process.
"We are focused and passionate about creating competitive decentralized technologies capable of rivaling centralized labs," wrote Macrocosmos CTO Steffen Cruz in a post on X.
Before diving into swarm training, let's grasp the main differences between traditional AI and decentralized AI. Simply put, decentralized AI means AI model training doesn't occur in one location or under one company's control. Instead, it's spread globally across homes, labs, campuses, and servers worldwide. Just like Bitcoin decentralized money from centralized banks, Bittensor and Macrocosmos aim to democratize intelligence itself.
This matters because AI is increasingly influencing our lives. It determines the news we see, the products we're offered, how we shop, interact, work, and even get hired. Centralizing this power in a few secretive computing systems poses risks to privacy, fairness, and innovation. By opening these systems to public involvement, decentralized AI provides a new form of alignment—where users become co-creators.
"Not only is this a new research endeavor for Macrocosmos and Bittensor, but it's something deeper and more personal to us," Cruz noted. "We are scientists, researchers, and developers."
Understanding The AI Swarm And Pre-Training
Swarm training, implemented by Macrocosmos via IOTA, draws inspiration from nature. Much like swarms of bees, schools of fish, or flocks of birds navigating without central control, this innovative subnet allows thousands of independent machines to collaborate on training a single large AI model.
Rather than requiring each network participant to download and run the entire model — a costly and impractical demand — Macrocosmos employs model parallelism. Each subnet member — referred to as a miner, since their work "mines" monetary incentives that benefit the whole network — trains just a segment of the model, usually a few layers of the neural network. As data moves through these layers, each miner processes its part and forwards the output to the next. A reverse pass then measures how far off the model's predictions are, and miner payouts are adjusted accordingly.
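The model-parallel idea described above can be illustrated with a minimal sketch. Note this is an illustrative toy, not IOTA's actual implementation: the `MinerShard` class, the layer split, and all sizes here are hypothetical.

```python
import numpy as np

# Hypothetical sketch of model parallelism: each "miner" holds only a few
# layers of the network, processes incoming activations, and forwards the
# result to the next miner. Names and sizes are illustrative, not IOTA's API.

rng = np.random.default_rng(0)
HIDDEN = 16

class MinerShard:
    """One miner's segment of the full model: two dense layers with ReLU."""
    def __init__(self):
        self.w1 = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))
        self.w2 = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))

    def forward(self, x):
        h = np.maximum(x @ self.w1, 0.0)      # first layer + ReLU
        return np.maximum(h @ self.w2, 0.0)   # second layer + ReLU

# Three miners together form one six-layer model.
miners = [MinerShard() for _ in range(3)]

# Forward pass: activations flow miner -> miner, like a pipeline. No single
# participant ever needs to hold the whole model in memory.
x = rng.normal(size=(4, HIDDEN))
for miner in miners:
    x = miner.forward(x)

print(x.shape)
```

In a real deployment the hand-off between miners happens over the network, and the backward pass flows the error signal back through the same chain so each miner only ever updates its own slice of parameters.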
This method isn't merely more efficient than centralized approaches—it's more inclusive. Unlike traditional methods requiring top-tier hardware, this architecture lets both low- and high-compute participants contribute meaningfully. This dismantles barriers keeping open-source communities on the fringes of AI model training.
Centralized Versus Decentralized AI Training
To comprehend the distinction between conventional AI model training and what Macrocosmos is doing, consider a side-by-side comparison:
In centralized training, one model splits into layers tightly interconnected across GPUs within a single data center. Everything optimizes for high-speed local connections. However, this setup is costly, exclusive, and closed.
Conversely, decentralized swarm training disperses different model layers across a global network of contributors or miners. Each participant handles a portion of the workload and shares results with others. The swarm system frequently integrates all parts into a unified model. Instead of relying on massive compute clusters, it harnesses a diverse range of connected devices—from a personal desktop GPU to larger industrial setups.
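The "frequently integrates all parts into a unified model" step can be sketched as averaging whatever updates actually arrive in a round. This is a deliberately simplified toy, not IOTA's merge procedure; the contributor count, shapes, and dropout pattern are all hypothetical.

```python
import numpy as np

# Hedged sketch: merging one shard's updates from many contributors while
# tolerating dropouts. The swarm averages only the updates that arrived, so
# a contributor going offline never stalls the round.

rng = np.random.default_rng(1)

# Ten contributors each computed an update for the same model shard...
updates = [rng.normal(size=(8, 8)) for _ in range(10)]

# ...but three went offline before reporting back (None = dropout).
received = [None if i in (2, 5, 7) else u for i, u in enumerate(updates)]

# Average only the updates that actually arrived.
arrived = [u for u in received if u is not None]
merged = np.mean(arrived, axis=0)

print(len(arrived), "of", len(received), "updates merged")
```

The design choice here is the important part: the merge depends on who showed up, not on a fixed roster, which is what makes a loosely connected global fleet workable at all.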
The result? Lower costs, greater transparency, and an AI model crafted by many, not the few.
What Makes This AI Alchemy Possible
Yet, training models this way presents challenges. Internet bandwidth lags behind fiber optics inside data centers. Decentralized participants may disconnect, attempt to exploit the incentive system, or go offline unexpectedly.
While some issues lie beyond Macrocosmos' control, they've devised an elegant solution for potential problems concerning miner incentives and rewards. Their new IOTA network tackles three major hurdles:
- Bandwidth Bottlenecks: A specialized compression technique reduces data transmission by up to 128 times, enabling synchronization even on home internet speeds.
- Fault Tolerance: A strategy named Butterfly All-Reduce averages and verifies model updates across numerous independent contributors, so no single dropout can halt the system.
- Fair Rewards: A contribution-accounting system measures how much each participant (person or machine) helps the network reach its goals, so honest effort receives transparent, proportional compensation.
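The bandwidth point above is quantified (up to 128x less data transmitted). IOTA's actual compression scheme is described in the white paper; as a generic illustration of how a 128x reduction can work, here is a top-k sparsification sketch, where only the largest 1/128 of gradient values (plus their indices) are sent. Function names and the ratio are illustrative assumptions.

```python
import numpy as np

# Generic sketch of gradient compression via top-k sparsification. This is
# NOT IOTA's scheme; it only shows how transmitting the largest 1/128 of
# gradient entries shrinks network traffic roughly 128x.

def compress_topk(grad, ratio=128):
    """Keep only the largest-magnitude 1/ratio of entries."""
    flat = grad.ravel()
    k = max(1, flat.size // ratio)
    idx = np.argpartition(np.abs(flat), -k)[-k:]  # indices of top-k entries
    return idx, flat[idx]   # transmit indices + values, not the full tensor

def decompress_topk(idx, values, shape):
    """Rebuild a sparse gradient; untransmitted entries default to zero."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = values
    return flat.reshape(shape)

rng = np.random.default_rng(0)
grad = rng.normal(size=(256, 512))           # 131,072 values

idx, values = compress_topk(grad, ratio=128)  # only 1,024 values sent
restored = decompress_topk(idx, values, grad.shape)

print(values.size, "of", grad.size, "values transmitted")
```

Schemes like this trade a small amount of update fidelity for a large drop in bandwidth, which is exactly the trade that makes synchronization over home internet connections plausible.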
Why Decentralized AI Pre-Training Matters Now
The firm's co-founders, Cruz and Will Squires, have explained why decentralized training matters and how it opens a new chapter for AI.
"The time has arrived for us to advance as a community and confront new challenges in model training. This is crucial for Bittensor. Competitors are closing in," Cruz stated.
"We outperformed nation-states, rigorously benchmarked our progress, and detailed our findings in our white paper. It was an extraordinary experiment, pushing it far beyond its initial design."
This initiative to distribute AI's computation and ownership to everyone via swarm training envisions a future where AI isn't monopolized by elite entities while scraps trickle down to the masses. It's a collective endeavor we build together.
Macrocosmos is transitioning decentralized training from Big Tech's enclosed gardens into the open. If successful, the next groundbreaking AI model might not originate from OpenAI, Google, or Meta—but rather from a swarm of us.