The future of intelligence will be distributed, not centralized.
Artificial Super Intelligence will emerge from a civilization of models, not a single one.
Sovereignty, modularity, and federation are the path forward.
It will be distributed: billions of nodes, each learning in context, collaborating securely, and refining together. This is how we get to Artificial Super Intelligence: not from a single monolithic model in a datacenter, but from a living network of sovereign, resilient models acting as a civilization.
Most of the work we do to build this future happens quietly.
We’ve been developing runtimes, distributed networks, AI frameworks, and modular tooling designed to make AI sovereign and edge-ready. Day by day, the pieces are coming together.
This is a glimpse into how we’re thinking at webAI — and every so often, we share a piece of that progress publicly.
Last week, we released one such glimpse: "Federated Learning with Ad-hoc Adapter Insertions: The Case of Soft-Embeddings for Training Classifier-as-Retriever."
It’s a technical paper, but the essence is simple: we found a way to let federated retrieval models adapt locally without breaking privacy budgets or communication constraints. The mechanism, ad-hoc soft-embedding adapters, is lightweight and surprisingly powerful.
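To make the idea concrete, here is a minimal sketch of how a soft-embedding adapter works in general: a small block of trainable "virtual token" embeddings is prepended to the frozen base model's input embeddings, so only those few parameters need to be trained locally and exchanged in federation. The class name, shapes, and initialization below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

class SoftEmbeddingAdapter:
    """Illustrative sketch (not the paper's code): k trainable soft-embedding
    vectors prepended to a frozen model's input embeddings. Only these k * d
    parameters are updated locally and communicated between federated nodes;
    the base model stays frozen."""

    def __init__(self, num_virtual_tokens: int, embed_dim: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        # The only trainable (and federated) parameters.
        self.soft = rng.normal(scale=0.02, size=(num_virtual_tokens, embed_dim))

    def __call__(self, token_embeds: np.ndarray) -> np.ndarray:
        # token_embeds: (seq_len, embed_dim) rows from the frozen embedding table.
        # The adapter simply prepends its virtual tokens to the sequence.
        return np.concatenate([self.soft, token_embeds], axis=0)

# Hypothetical usage: 4 virtual tokens over an 8-dim embedding space.
adapter = SoftEmbeddingAdapter(num_virtual_tokens=4, embed_dim=8)
frozen_embeds = np.zeros((10, 8))   # stand-in for real token embeddings
augmented = adapter(frozen_embeds)  # shape (14, 8)
```

Because the communicated payload is just the small `soft` matrix, each node can adapt to its local data while keeping bandwidth and privacy exposure minimal.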
It’s not the whole story, but it’s one of the steps.
We’ve seen strong results: accuracy improving from 12% to 99.9% and 2.6× faster training in distributed settings, backed by formal convergence and privacy guarantees.
But more importantly, this work validates a direction we’ve been investing in for years: modular, federated, sovereign AI.
This paper is not the complete picture; it’s a milestone on a much larger journey.
We’ve made a series of internal breakthroughs — in retrieval, federation, modular runtimes, and distributed orchestration — that together point to one conclusion:
Distributed AI is how we reach ASI.
What excites us is not just this result, but what comes next: connecting these components into a network that can learn faster, adapt locally, and operate everywhere, from a smartphone to a datacenter to an IoT node.
We’ll keep most of our work under wraps until it’s ready. But from time to time, we’ll continue to share pieces of the journey publicly.
The full paper is here: arXiv:2509.16508