As artificial intelligence (AI) continues to reshape industries, the role of AI in network planning has become a pertinent question for enterprises. While the use of AI promises to drive additional network traffic, many organizations may be uncertain whether this necessitates a major overhaul of their existing data center networks. Drawing from industry insights and enterprise experiences, this article examines the nuanced impact of AI on network planning and infrastructure, particularly in the context of self-hosted AI systems, and explores the implications for future network strategies.
The Current State of AI in Network Planning
The intersection of AI and network planning has garnered attention, but a significant portion of enterprises has yet to fully consider the impact of AI on network traffic. In a survey of nearly 100 network planners, only eight respondents acknowledged having thought about how AI might affect network traffic and plans. This raises the question of whether the majority are underestimating AI’s potential influence on network infrastructure. The reality is that AI’s impact on enterprise networks depends largely on the extent to which organizations plan to self-host AI models.
As of 2023, only 16 out of 91 enterprises surveyed had specific plans for AI hosting, and just eight had implemented self-hosting that year. This number is expected to rise sharply in 2024, with 77 enterprises indicating plans to integrate self-hosted AI solutions. This anticipated growth is driving increased interest from AI and network equipment vendors, including industry giants like Cisco and Juniper, who are actively promoting their AI network capabilities. This shift underscores the growing importance of AI in network planning as enterprises begin to recognize the need to accommodate AI-driven workloads.
AI’s Influence on Enterprise Networks and Traffic
The impact of AI on enterprise networks is not a one-size-fits-all scenario; it varies based on the specific AI workloads and the infrastructure supporting them. The majority of AI models rely on specialized hardware, such as GPUs, which necessitate dedicated servers within data centers. However, the extent to which these specialized AI servers impact network traffic depends on the scale and nature of the AI operations.
Generative AI models, like those behind ChatGPT and the offerings from Google and Microsoft, have dominated public discourse. However, enterprises are increasingly cautious about adopting these models due to concerns over data security, copyright issues, and the environmental impact of energy-intensive AI processing. Instead, many organizations are exploring lightweight large-language-model approaches that apply generative AI techniques to enterprise-specific data. This trend toward enterprise-focused AI solutions suggests that the number of specialized AI servers in data centers may be limited, reducing the potential strain on network infrastructure.
The dominant technology for networking within AI clusters is InfiniBand, a high-speed, low-latency solution favored by companies like NVIDIA for connecting GPUs in large AI data centers. However, for most enterprises, Ethernet remains the preferred technology for data center networking. Ethernet’s ubiquity and reliability make it a suitable choice for AI workloads, particularly when the focus is on enhancing analytics rather than supporting mass-market AI applications. This suggests that while AI will influence network planning, it may not necessitate a shift away from traditional Ethernet-based networks.
Planning for AI-Driven Workloads in Enterprise Networks
As enterprises integrate AI into their operations, network planners must consider how to effectively manage AI-driven workloads without disrupting existing network traffic. An AI cluster can be thought of as a massive virtual user community that requires access to vast amounts of data from the enterprise’s repository for training and inference tasks. Ensuring a high-performance data path to this repository is crucial to prevent AI workflows from congesting other critical network functions.
For enterprises with multiple data centers or complex user environments, the challenge is further compounded by the need to balance AI processing across geographically dispersed locations. To address this, network planners may need to augment data center interconnect (DCI) paths to accommodate the additional traffic generated by AI workloads. This approach helps mitigate the risk of congestion and ensures that AI operations do not interfere with other high-volume data flows, such as traditional analytics and reporting.
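The DCI sizing question above can be reduced to a simple headroom check. The sketch below is illustrative only: the link capacity, baseline load, AI burst rate, and 70% utilization target are all assumptions for the example, not figures from the survey.

```python
# Hypothetical sketch: decide whether an existing DCI link can absorb
# projected AI traffic or needs augmenting. All numbers are illustrative.

def dci_needs_upgrade(link_gbps: float, baseline_gbps: float,
                      ai_peak_gbps: float, max_util: float = 0.7) -> bool:
    """Flag the link if baseline plus AI peak exceeds the target utilization."""
    return baseline_gbps + ai_peak_gbps > link_gbps * max_util

# A 100 Gbps DCI link already carrying 55 Gbps of analytics and reporting
# traffic cannot also absorb a 20 Gbps AI training burst at a 70% target.
print(dci_needs_upgrade(100, 55, 20))  # True: 75 Gbps demand vs 70 Gbps budget
```

Keeping a utilization ceiling below 100% is the usual hedge against the bursty, high-volume flows AI training generates alongside existing data flows.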
According to the eight enterprises that have already implemented self-hosted AI solutions, the primary strategy for managing AI traffic is to prioritize the shortest, fastest connections available. By mapping AI workflows and strategically placing AI clusters near the enterprise’s primary data sources, organizations can minimize the impact of AI on network performance. This approach not only optimizes data flow but also reduces the need for costly network upgrades, making it a practical solution for enterprises looking to integrate AI without overhauling their existing infrastructure.
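The "shortest, fastest connections" strategy amounts to placing the cluster at the site with the lowest latency to the data it reads most. A minimal sketch of that comparison, with entirely hypothetical site names, latencies, and read shares:

```python
# Hypothetical sketch: pick the candidate data center whose links to the
# primary data sources have the lowest read-weighted latency. Sites,
# latency figures, and traffic weights are illustrative assumptions.

# Round-trip latency in ms from each candidate site to each data source.
latency_ms = {
    "dc-east": {"sales_db": 2, "logs": 15, "warehouse": 40},
    "dc-west": {"sales_db": 45, "logs": 12, "warehouse": 5},
}

# Relative share of AI training/inference reads per data source.
read_share = {"sales_db": 0.6, "logs": 0.1, "warehouse": 0.3}

def weighted_latency(site: str) -> float:
    """Average latency the AI cluster would see, weighted by read volume."""
    return sum(read_share[src] * ms for src, ms in latency_ms[site].items())

best_site = min(latency_ms, key=weighted_latency)
print(best_site)  # "dc-east": closest to the dominant sales_db source
```

Weighting by read volume captures the article's point: proximity to the primary data sources matters more than proximity to any single database.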
The Role of AI in Enhancing Enterprise Analytics
AI’s potential to enhance enterprise analytics is a key driver of its adoption in network planning. AI and analytics share a common reliance on enterprise databases, meaning that integrating AI into the network must consider the location of these critical data sources. Placing AI clusters near the enterprise’s primary analytics applications ensures that AI workflows are streamlined and that data access is as efficient as possible.
In practice, this often means deploying AI clusters in proximity to major databases, where analytics applications are already running. By using robust Ethernet connections within the AI cluster and to the database hosts, enterprises can achieve the performance levels needed to support AI-driven analytics without compromising other network functions. However, network planners must remain vigilant in monitoring AI usage and traffic, particularly in environments where AI adoption is rapidly expanding. Unchecked AI usage can lead to network congestion and drive the need for additional capacity, highlighting the importance of proactive network management.
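The vigilance the paragraph calls for can start with something as simple as tracking the AI share of measured traffic and flagging when it crosses a review threshold. The flow labels, byte counts, and 40% trigger below are illustrative assumptions:

```python
# Hypothetical sketch: flag when flows tagged as AI consume more than a set
# share of total traffic, as a trigger for a capacity review.

def ai_traffic_share(flows: dict[str, int]) -> float:
    """Fraction of total bytes attributable to flows tagged with an 'ai-' prefix."""
    total = sum(flows.values())
    ai = sum(b for name, b in flows.items() if name.startswith("ai-"))
    return ai / total if total else 0.0

flows = {"ai-training": 400, "ai-inference": 100, "analytics": 300, "backup": 200}
share = ai_traffic_share(flows)
needs_review = share > 0.4  # review capacity once AI exceeds 40% of traffic
```

In a real deployment the byte counts would come from flow telemetry (e.g. NetFlow/sFlow collectors) rather than a static dictionary, but the threshold logic is the same.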
Future Considerations for AI and Network Planning
As AI continues to evolve, its impact on network planning will become increasingly significant. While AI usage is expected to drive additional network traffic, it is unlikely to require a complete overhaul of existing data center networks. Instead, enterprises should focus on understanding how AI integrates with their current infrastructure and on optimizing network paths to accommodate AI-driven workloads.
The future of AI networking for enterprises hinges on the relationship between AI usage and data center clusters. As more organizations adopt AI, the need for a nuanced approach to network planning will become apparent. This includes carefully considering where AI clusters are placed, how they interact with existing data center resources, and how AI traffic is managed across the network. For vendors like Cisco and Juniper, the ability to provide solutions that address these challenges will be critical to capitalizing on the growing demand for AI-capable network infrastructure.
In conclusion, while AI is poised to transform enterprise networks, its impact will be shaped by how organizations choose to implement and manage AI-driven workloads. By taking a strategic approach to network planning, enterprises can harness the power of AI without compromising network performance, ensuring that their infrastructure is ready to support the next generation of AI innovations.