Pluralis Research

May 2025
Beyond Top-K: Pipeline Parallelism Over Slow Networks
A progress update on the core problem in Protocol Learning.
May 2, 2025 • Sameera Ramasinghe
SWARM Parallel with Asynchronous Updates
We significantly improve the reliability, robustness, and speed of asynchronous pipeline-parallel training.
May 1, 2025 • Yan Zuo and Gil Avraham

April 2025

Efficient Asynchronous Low-Bandwidth Training on Heterogeneous GPUs
We show that asynchronous methods can surpass synchronous DiLoCo training while supporting heterogeneous GPUs.
Apr 30, 2025 • Thalaiyasingam Ajanthan

March 2025

A Third Path: Protocol Learning
Developing truly open-source AI
Mar 19, 2025 • Alexander Long

October 2024

Article 2: Protocol Learning, Protocol Models and the Great Convergence
Two enormous, previously disparate fields converge, opening a path toward the largest models ever trained.
Oct 13, 2024 • Alexander Long

July 2024

Decentralized Training Looms
Collaborative training of foundation models is closer to actualization than broadly understood. The popular view that low-bandwidth node-to-node…
Jul 8, 2024 • Alexander Long
© 2026 Pluralis