
The Race to Build AI-optimized Data Center Networks


Where will the market ultimately settle? According to a recent Dell'Oro Group forecast, the answers differ for front-end and back-end data center networks. The analyst firm expects that network ports providing front-end connectivity to AI clusters will remain Ethernet, with data-ingestion requirements initially driving the transition to next-generation speeds. By 2027, Dell'Oro expects one-third of front-end Ethernet ports to be 800G or higher.

Back-end infrastructures will evolve more quickly, with operators adopting next-generation speeds at a triple-digit compound annual growth rate (CAGR). By 2027, nearly all back-end network ports will be 800G or higher. Here though, where lossless transmission is essential, Dell’Oro forecasts that interface technologies will remain mixed, with Ethernet and InfiniBand coexisting for the foreseeable future.
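To put that growth rate in perspective: a compound annual growth rate is multiplicative, so even the lowest triple-digit rate (100%) doubles volume every year. The sketch below uses purely illustrative numbers (the 100% rate and the normalized starting volume are assumptions for arithmetic only, not figures from the Dell'Oro forecast):

```python
# Illustrative compounding of a triple-digit CAGR.
# A 100% CAGR and a normalized base volume are assumed here
# purely to show the arithmetic; they are not forecast figures.

def project(base: float, cagr: float, years: int) -> float:
    """Compound `base` forward at `cagr` (1.0 = 100%) for `years` years."""
    return base * (1 + cagr) ** years

base_ports = 1.0          # normalized starting shipment volume
for year in range(1, 5):  # project four years out
    print(f"year +{year}: {project(base_ports, 1.0, year):.0f}x base volume")
# -> 2x, 4x, 8x, 16x
```

Even at the bottom of the "triple-digit" range, volumes grow sixteenfold in four years, which is why the forecast can plausibly put nearly all back-end ports at 800G or above by 2027.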

Weighing Infrastructure Options

Given the pace at which customers are adopting AI applications, and the lead time needed to build new data centers, operators have little choice but to move forward with new fabric strategies now. However, while uncertainty remains regarding the best approaches to support future AI workloads, we can draw one big conclusion now: one size will not fit all. Different operators will follow different paths depending on a variety of factors unique to their business and technology strategies.

The size of a given deployment, the number of clusters it will support, and of course, price, will all influence infrastructure decisions. Yet operators will also need to consider many other factors. What kinds of AI applications and workloads does the operator plan to focus on? For example, will they cater to compute- and time-intensive model-training or outsource that phase of AI applications? How complex do they expect their customers’ applications and workloads to be? What will those workloads’ bandwidth and load-balancing requirements look like, and how important will low and deterministic latency be for their processing? Where does a given operator fall on the question of standardized versus proprietary interface technologies? How important is it to maintain a diversified multivendor supply chain? And how comfortable are they with the future roadmaps of the technologies they’re considering?

Many of these questions don’t have objectively correct answers. And beyond the variability of individual data center strategies, AI applications themselves, along with the technologies supporting them, will continue to evolve. So, even the best-laid plans of today will likely change considerably over the next several years and beyond.

Looking Ahead

Even with some questions still unanswered, data center operators have little choice but to push ahead with new network infrastructures. The market for AI applications has grown too large, too quickly, for them to do anything else. To deliver on the customer expectations that accompany exploding AI demand, however, rigorous testing and validation become essential.

Data center operators and their vendors must be able to validate next-generation Ethernet products, assure multi-vendor interoperability, and test timing and synchronization with a precision that legacy tools simply can’t support. Vendors will also need the ability to emulate the unique network behavior and traffic patterns in AI clusters with lifelike accuracy. Fortunately, a new generation of testing and emulation solutions designed for the speeds and scale of AI network infrastructures has already emerged to reflect these changing requirements. With a bit of foresight and informed decision-making from data center operators and vendors, the transformative potential of AI applications really can be practically limitless.


