Cisco shifts focus to AI with new infrastructure options


While most people think of Cisco as a company that links infrastructure components in data centers and the cloud, it isn't the first firm that comes to mind when discussing GenAI. However, at its recent Partner Summit event, the company made several announcements aimed at changing that perception.

Specifically, Cisco debuted several new servers equipped with Nvidia GPUs and AMD CPUs and targeted at AI workloads, a new high-speed network switch optimized for interconnecting multiple AI-focused servers, and several preconfigured PODs of compute and network infrastructure designed for specific applications.

On the server side, Cisco's new UCS C885A M8 Server packs up to eight Nvidia H100 or H200 GPUs and AMD Epyc CPUs into a compact rack server capable of everything from model training to fine-tuning. Configured with both Nvidia Ethernet cards and DPUs, the system can function independently or be networked with other servers into a more powerful system.

The new Nexus 9364E-SG2 switch, based on Cisco's latest G200 custom silicon, offers 800G speeds and large memory buffers to enable high-speed, low-latency connections across multiple servers.

The most interesting new additions come in the form of AI PODs, which are Cisco Validated Designs (CVDs) that combine CPU and GPU compute, storage, and networking along with Nvidia's AI Enterprise platform software. Essentially, they are fully preconfigured infrastructure systems that provide an easier, plug-and-play solution for organizations to launch their AI deployments – something many companies beginning their GenAI efforts need.

Cisco is offering a range of different AI PODs tailored to various industries and applications, helping organizations eliminate some of the guesswork in selecting the infrastructure they need for their specific requirements. Additionally, because the PODs come with Nvidia's software stack, there are several industry-specific applications and software building blocks (e.g., NIMs) that organizations can build from. Initially, the PODs are geared more toward AI inferencing than training, but Cisco plans to offer more powerful PODs capable of AI model training over time.

Another key aspect of the new Cisco offerings is a link to its Intersight management and automation platform, providing companies with better device management capabilities and easier integration into their existing infrastructure environments.

The net result is a new set of tools for Cisco and its sales partners to offer to their long-established enterprise customer base.

Realistically, Cisco's new server and compute offerings are unlikely to appeal to the big cloud customers who were early buyers of this type of infrastructure. (Cisco's switches and routers, on the other hand, are key components for hyperscalers.) Still, it is becoming increasingly clear that enterprises are interested in building their own AI-capable infrastructure as their GenAI journeys progress. While many AI application workloads will likely remain in the cloud, companies are realizing the need to perform some of this work on-premises.

In particular, because effective AI applications must be trained or fine-tuned on a company's most valuable (and likely most sensitive) data, many organizations are hesitant to put that data, and the models based on it, in the cloud.

In that regard, even though Cisco is a bit late in bringing certain elements of its AI-focused infrastructure to market, the timing for its most likely audience could be just right. As Cisco's Jeetu Patel commented during the Day 2 keynote, "Data centers are cool again." That point was further reinforced by the recent TECHnalysis Research survey report, The Intelligent Path Forward: GenAI in the Enterprise, which found that 80% of companies engaged in GenAI work were interested in running some of those applications on-premises.

Ultimately, the projected market growth for on-site data centers opens up intriguing new possibilities for Cisco and other traditional enterprise hardware suppliers.

Whether driven by data gravity, privacy, governance, or other concerns, it now seems clear that while the move to hybrid cloud took nearly a decade, the transition to hybrid AI models that leverage both cloud and on-premises resources (not to mention on-device AI applications for PCs and smartphones) will be significantly faster. How the market responds to that rapid evolution will be very interesting to watch.

Bob O'Donnell is the president and chief analyst of TECHnalysis Research, LLC, a market research firm that provides strategic consulting and market research services to the technology industry and the professional financial community. You can follow Bob on Twitter @bobodtech.


