NVIDIA Expands Spectrum-X With Open MRC Protocol for AI Scale



Ted Hisokawa
May 06, 2026 12:21

NVIDIA’s Spectrum-X Ethernet integrates the new MRC protocol, optimizing AI network performance for hyperscale operators such as OpenAI and Microsoft.





NVIDIA has unveiled a major upgrade to its Spectrum-X Ethernet platform, introducing the Multipath Reliable Connection (MRC) protocol as an open standard. Designed for gigascale AI, MRC optimizes network performance and resilience, addressing the growing need for robust data center infrastructure in AI development. The announcement comes as Spectrum-X gains traction with industry leaders like OpenAI, Microsoft, and Oracle.

At its core, Spectrum-X Ethernet is built to handle the massive bandwidth and low-latency demands of advanced AI workloads. With the addition of MRC, the platform can now route data dynamically across multiple network paths. This improves throughput, load balancing, and fault tolerance in AI factories, where thousands of GPUs must operate in lockstep.

“Deploying MRC in the Blackwell generation was very successful,” said Sachin Katti, head of industrial compute at OpenAI. He highlighted how MRC’s design significantly mitigates network slowdowns, enabling efficient training of large-scale AI models. OpenAI’s own AI training clusters rely on this technology to keep workloads running smoothly, even under extreme demands.

Microsoft’s AI data center, Fairwater, and Oracle’s Abilene infrastructure have also integrated MRC, further underscoring its growing industry adoption. These hyperscale environments are purpose-built for training leading-edge large language models (LLMs) and benefit from the Spectrum-X Ethernet platform’s ability to provide consistent, high-performance networking at scale.

Why MRC Matters for AI Factories

MRC’s innovation lies in its ability to distribute network traffic intelligently. By load-balancing data across multiple paths, it ensures that GPUs maintain high utilization rates without bottlenecks. If a network path becomes congested or fails, MRC reroutes traffic in microseconds, preventing disruptions that could derail long training runs. This level of resilience is critical for AI factories where even brief downtimes can have significant operational and financial costs.
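The behavior described above can be illustrated with a small sketch. This is not NVIDIA's MRC implementation, and the path names, capacity weights, and failover logic below are invented for explanation only; it simply shows the general idea of spreading traffic across several paths in proportion to capacity and instantly shifting load away from a path that fails.

```python
import random

# Illustrative sketch of multipath load balancing with fast failover.
# NOT NVIDIA's MRC protocol -- path ids and weights are hypothetical.

class MultipathSender:
    def __init__(self, paths):
        # paths: dict mapping path id -> capacity weight (higher = preferred)
        self.paths = dict(paths)

    def pick_path(self):
        # Weighted random choice spreads packets across all healthy paths,
        # keeping per-path load roughly proportional to capacity.
        healthy = {p: w for p, w in self.paths.items() if w > 0}
        if not healthy:
            raise RuntimeError("no healthy paths available")
        ids, weights = zip(*healthy.items())
        return random.choices(ids, weights=weights, k=1)[0]

    def mark_failed(self, path):
        # On congestion or failure, zero the weight so subsequent
        # traffic immediately shifts to the remaining paths.
        self.paths[path] = 0

sender = MultipathSender({"plane-0": 4, "plane-1": 4, "plane-2": 2})
sender.mark_failed("plane-1")
chosen = [sender.pick_path() for _ in range(1000)]
assert "plane-1" not in chosen  # failed path receives no traffic
```

A production protocol would of course do this per-packet in hardware with real congestion signals rather than static weights, but the principle is the same: no single path is a point of failure, and rerouting requires no connection teardown.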

The protocol also incorporates advanced telemetry and control features, giving administrators granular visibility into network behavior. This simplifies troubleshooting and ensures smooth operation in environments that often span tens of thousands of GPUs.

NVIDIA’s Ethernet Advantage

NVIDIA has been positioning Spectrum-X Ethernet as a powerful alternative to InfiniBand, which has traditionally dominated AI networking. The platform leverages hardware like Spectrum-4 Ethernet switches and BlueField-3 SuperNICs to deliver high bandwidth and low latency, tailored specifically for AI workloads. Recent advancements, such as the introduction of Spectrum-XGS in February 2026, have further expanded its capabilities, making it a viable choice for hyperscale data centers.

Unlike InfiniBand, which operates as a proprietary system, Spectrum-X Ethernet embraces open standards. The release of MRC as part of the Open Compute Project signals NVIDIA’s commitment to fostering a more collaborative ecosystem. This could accelerate adoption across industries and reinforce Ethernet’s role in AI infrastructure.

Scaling to Gigascale AI

NVIDIA’s multiplane network design, enabled by Spectrum-X’s hardware, is another key differentiator. By supporting independent network planes with hardware-accelerated load balancing, it allows data centers to scale without sacrificing performance. This architecture is critical as the industry moves toward AI super factories, where infrastructure must grow to accommodate increasingly complex models and datasets.

Looking ahead, the integration of Spectrum-X Ethernet and MRC positions NVIDIA as a leader in AI networking. As demand for AI training infrastructure surges, technologies that deliver not just speed but also resilience and intelligence will become indispensable. For organizations like OpenAI and Microsoft, which are at the forefront of AI development, this combination could prove to be a game-changer.

With Spectrum-X Ethernet, NVIDIA is not just keeping pace with AI’s rapid evolution—it’s setting the standard for what’s next.

Image source: Shutterstock


