It all began with a disruptive desk. The desk belonged to a young AI researcher who was just starting his PhD at a university in northern England in 2015. The researcher, Ben Fielding, had assembled a sizable machine packed with early GPUs to advance his AI work. The machine was so noisy it bothered his lab colleagues, so Fielding squeezed it under his desk, where its bulk forced him to sit with his legs angled awkwardly to the side.
Fielding had some unconventional theories, focusing on how “swarms” of AI — groups composed of many different models — could interact and learn from each other, improving the system as a whole. The challenge? He felt constrained by the limitations of that noisy machine beneath him. He knew he was at a disadvantage. “Google was also pursuing this research,” Fielding reflects today. “They had thousands of GPUs in data centers. Their experiments weren’t outlandish. I understood the methodologies… I had multiple proposals, but I couldn’t execute them.”
It occurred to Fielding back in 2015 that compute limitations would always pose a challenge: if compute was a significant constraint in academia, it would certainly be a barrier once AI reached a broader audience.
What was the answer?
Decentralized AI.
In 2020, Fielding co-founded Gensyn with Harry Grieve, years before decentralized AI became a trendy topic. Initially, the project focused on creating decentralized computing – a subject I’ve discussed with Fielding multiple times in interviews and panels at various events – but the vision reaches further: “The network for machine intelligence.” They aim to develop solutions across the entire tech spectrum.
Fast forward ten years from Fielding’s bothersome desk, and the initial tools from Gensyn are now available to the public. Recently, Gensyn introduced the “RL Swarms” protocol (which has its roots in Fielding’s PhD research) and just launched its Testnet, integrating blockchain technology into the mix.
In a recent conversation leading up to the AI Summit in Toronto, Fielding provides an overview of AI Swarms, discusses how blockchain fits into the equation, and emphasizes that all innovators – not just large tech firms – “should retain the right to develop machine learning technologies.”
This interview has been lightly edited for brevity and clarity.
Congratulations on the Testnet launch. Can you outline what it entails?
Ben Fielding: It’s the release of our first minimum viable product (MVP) blockchain features, integrated with everything we have rolled out so far.
What were the original features before incorporating blockchain?
We launched the RL [Reinforcement Learning] Swarm a few weeks back, focusing on post-training reinforcement learning within a peer-to-peer network.
To simplify it, think of it this way: when a pre-trained model undergoes reasoning training – like DeepSeek-R1 – it critiques its own thought processes and improves over time based on its objectives. This allows it to enhance its responses.
We expand upon this concept by suggesting, “Model self-critique is valuable, but what if they could also interact and critique each other’s reasoning?” When numerous models connect and communicate, they can begin to share insights, ultimately leading to an overall improvement of the entire swarm.
That clarifies the rationale behind the name “Swarm.”
Exactly. This training approach lets many models work together in parallel, improving the results of a meta-model constructed from them while each individual model keeps improving itself. For instance, if someone connected a model on a MacBook to a swarm for an hour and then disengaged, their local model would have absorbed knowledge from the swarm while also contributing to the other models’ improvement. It’s a collaborative training environment open to any model. That’s essentially what RL Swarm is.
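The dynamic Fielding describes can be caricatured in a few lines of Python. This is purely an illustrative toy, not Gensyn’s protocol: each “model” is reduced to a single number guessing a target, a shared reward stands in for reasoning quality, and in each round every peer pulls toward the best answer in the swarm while still exploring locally.

```python
import random

random.seed(0)
GOAL = 10.0  # stand-in task: guess this number

def reward(answer):
    # Higher is better; all peers share this objective.
    return -abs(answer - GOAL)

def swarm_round(peers):
    """One communication round: every peer sees the others' answers,
    moves halfway toward the best one, and keeps exploring locally."""
    best = max(peers, key=reward)
    return [p + 0.5 * (best - p) + random.gauss(0, 0.1) for p in peers]

peers = [random.uniform(0.0, 20.0) for _ in range(8)]
for _ in range(30):
    peers = swarm_round(peers)
# The swarm clusters near the best answer any single peer found.
```

Each peer benefits from the swarm’s best discovery while its own exploration occasionally improves the collective best, which is the “join for an hour and keep the gains” property Fielding describes, in miniature.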
So that’s what you launched recently. How does blockchain factor into this?
The blockchain aspect allows us to advance some foundational elements within the system.
For someone unfamiliar with the term “foundational elements,” could you elaborate?
Certainly. By foundational elements, I mean the pieces that sit closest to the underlying resources themselves. If you look at the software stack, you have the GPU infrastructure in a data center, the drivers running on top of the GPUs, and so forth.
So foundational elements are essentially the core, lowest levels in the tech architecture. Am I correct?
Yes, that’s right. The RL Swarm showcases what can be achieved, serving as a somewhat rudimentary demonstration of large-scale and scalable machine learning. For over four years, Gensyn has been dedicated to building the necessary infrastructure. Presently, we’re at a stage where this infrastructure is in its beta phase, ready for implementation. We just need to illustrate to the world the transformative possibilities that can arise when rethinking the approach to machine learning.
It seems you are doing much more than just decentralized computing or even basic infrastructure.
We have three core elements within our infrastructure. The first is execution: we have consistent execution libraries, our own compiler, and reproducible libraries for any hardware target.
The second component is communication. Assuming a model can operate on any compatible device, how can they effectively communicate? If everyone adheres to a common standard, communication can resemble how the TCP/IP protocol functions on the internet. We create those communication libraries, and RL Swarm is an example of that.
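To make the TCP/IP analogy concrete, here is a hypothetical sketch (not Gensyn’s actual wire format) of what a shared message standard between peers could look like: a JSON envelope carrying a tensor update, with a checksum so any peer can verify the bytes arrived intact.

```python
import json
import hashlib

def encode_message(sender, round_id, tensor):
    """Pack a tensor update into a JSON wire format any peer can parse."""
    body = {"sender": sender, "round": round_id, "tensor": tensor}
    payload = json.dumps(body, sort_keys=True)
    digest = hashlib.sha256(payload.encode()).hexdigest()
    return json.dumps({"body": body, "sha256": digest})

def decode_message(wire):
    """Parse a message and reject it if the checksum doesn't match."""
    msg = json.loads(wire)
    payload = json.dumps(msg["body"], sort_keys=True)
    if hashlib.sha256(payload.encode()).hexdigest() != msg["sha256"]:
        raise ValueError("corrupted message")
    return msg["body"]

wire = encode_message("peer-a", 3, [0.1, -0.2, 0.7])
assert decode_message(wire)["tensor"] == [0.1, -0.2, 0.7]
```

The point of a standard like this is the same as TCP/IP’s: as long as every peer agrees on the envelope, any two devices can exchange updates without knowing anything else about each other.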
Finally, there’s verification.
Ah, this is where blockchain plays a role…
Picture a scenario where every device globally executes tasks reliably and can link models together. However, can they trust one another? If I connect my MacBook with yours, they can carry out the same functions and exchange tensors, but can they confirm that the received data was correctly processed on the other device?
In today’s world, we would typically formalize such agreements via contracts to ensure compliance. In the realm of machines, this must occur programmatically. Thus, we develop cryptographic proofs, probabilistic proofs, and game-theoretic proofs to automate that trust process.
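One common family of techniques here is probabilistic spot-checking: the worker commits to a hash of every intermediate state, and the verifier re-executes only a random sample of steps rather than the whole job. The sketch below is a generic illustration of that idea, not Gensyn’s actual proof system.

```python
import hashlib
import random

def step(x):
    # One deterministic unit of "work" both parties can reproduce.
    return (x * 31 + 7) % 1_000_003

def run_and_commit(x0, n):
    """Worker: run n steps, committing a hash of each intermediate state."""
    states, commits, x = [], [], x0
    for _ in range(n):
        x = step(x)
        states.append(x)
        commits.append(hashlib.sha256(str(x).encode()).hexdigest())
    return states, commits

def spot_check(x0, states, commits, k):
    """Verifier: re-execute k randomly chosen steps instead of all n."""
    for i in random.sample(range(len(states)), k):
        prev = x0 if i == 0 else states[i - 1]
        expected = hashlib.sha256(str(step(prev)).encode()).hexdigest()
        if expected != commits[i]:
            return False
    return True
```

Checking k of n steps catches a single falsified step with probability k/n, so the verifier can trade cost for confidence; commitments like these are also the kind of artifact a ledger can anchor for later dispute resolution.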
This is where blockchain comes into play. It offers all the benefits associated with it, such as persistent identities, payment mechanisms, consensus, etc. With our Testnet, we integrate RL Swarm and other foundational elements along with blockchain components, establishing that when you join a swarm, you possess a persistent identity recorded on a decentralized ledger.
Moving forward, there will be functionality for payments, but for now, you’ll benefit from a trust consensus mechanism that resolves disputes. This represents an MVP for the future Gensyn infrastructure, which will evolve with new components over time.
Can you give us a sneak peek of what’s on the horizon?
Once we transition to mainnet, the entire software and infrastructure stack will be operational, using blockchain for trust, payments, consensus, and identity. This first stage introduces identity: upon joining a swarm, you register as an identifiable entity that all participants can recognize without checking a centralized server.
Now, let’s dream a bit. What’s your vision for one year, two years, or five years from now? What is your ultimate objective?
Certainly. The grand vision is to make all the resources that support machine learning seamlessly and programmatically accessible to everyone. Machine learning is heavily constrained by access to core resources, and today that access is concentrated in centralized AI companies, but such limitations need not persist. If we can build the right software, it can be open-source. We believe Gensyn’s responsibility is to construct the underlying infrastructure that makes this possible, ensuring that everyone retains the right to build machine learning technologies.