The Edge Is the New Data Center, According to Cisco

Power scarcity and latency are forcing AI out of hyperscale facilities and into factories, hospitals, and retail floors

Global demand for AI computing is outpacing supply at every level of the stack. 

The race has shifted from building larger models to finding where those models can actually run, where power is available and latency is low. That search is pushing compute outward from the cloud to the edge.

Power defines the new geography

The scale of the build-out is unprecedented. Amin Vahdat, Google Cloud’s vice president for systems infrastructure, said this cycle is “100 times what the internet was.” Cisco’s Jeetu Patel compared it to “the space race and the Manhattan Project all put into one.” Both point to a common constraint: power. “Data centers are being built where the power is available,” Vahdat said in the same interview. The old rule of locating compute near population centers no longer applies.

Cisco’s AI Readiness Index quantified the gap. Only 13 percent of companies have infrastructure capable of scaling AI workloads, and just 21 percent report sufficient access to GPUs. More than half say their networks can’t meet AI bandwidth or latency requirements. Most face what Cisco calls “AI infrastructure debt”: aging data centers, inconsistent governance, and limited visibility across systems. The problem is structural: power and networking costs are rising faster than compute efficiency, while demand for localized inference keeps accelerating.

Cisco expects 75 percent of enterprise data to be created and processed at the edge by 2027. Its response is the Unified Edge platform, launched in November 2025. The system integrates compute, GPUs, networking, and storage in a modular chassis that can be rack- or wall-mounted and supports redundant cooling and power. It connects over integrated 25 Gb networking and runs zero-trust security, telemetry through ThousandEyes and Splunk, and centralized management via Intersight. The goal is to bring real-time inference and agentic AI workloads to remote sites without shipping data back to the cloud.

Examples span industries. In retail, local vision models detect shelf outages in milliseconds instead of seconds. In healthcare, patient data stays within facility boundaries for regulatory compliance. In manufacturing, predictive-maintenance agents monitor machinery in real time. Each case is part of the same shift: moving intelligence closer to where it acts.
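The retail case is a useful mental model for edge inference generally: frames are scored on-site, and only alert events (not raw video) leave the facility. A minimal sketch in Python, with a stubbed detector standing in for an actual on-device vision model; all names and thresholds here are hypothetical, not part of any Cisco product:

```python
import time

LATENCY_BUDGET_MS = 50  # local-inference target; cloud round-trips routinely exceed this


def detect_shelf_outage(frame):
    """Stub for an on-device vision model; flags a frame with empty shelf slots."""
    return frame.get("empty_slots", 0) > 0


def process_frame(frame):
    """Score a frame locally and return a small alert record, never the frame itself."""
    start = time.perf_counter()
    outage = detect_shelf_outage(frame)
    elapsed_ms = (time.perf_counter() - start) * 1000
    return {
        "camera": frame.get("camera"),
        "alert": outage,
        "within_budget": elapsed_ms <= LATENCY_BUDGET_MS,
    }


# Only the compact alert record would be forwarded upstream for aggregation.
result = process_frame({"camera": "aisle-7", "empty_slots": 2})
```

The same loop shape applies to the healthcare and manufacturing cases: the data stays local, and only the decision travels.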

Investors have noticed. UBS upgraded Cisco to Buy after the company reported more than $2 billion in AI-related orders in FY 2025 and forecast a campus-network refresh across Fortune 500 enterprises. The note cited surging demand for “AI-ready” switching, edge compute, and power-dense racks: a hardware cycle not seen since the early internet boom. Cisco’s revenue mix is already tilting toward recurring software and security subscriptions layered on top of these infrastructure deployments, linking physical and digital expansion.

Owning the systems, not the API

What’s emerging is a reversal of the last decade’s cloud logic. Instead of outsourcing compute, enterprises are beginning to own it again, because compliance, latency, and control now matter more than elasticity. Vahdat calls it a return to co-design, where hardware and software evolve together, as Google once did with Borg and BigTable. Patel describes Cisco’s approach as integration “from the physics to the semantics,” arguing that networking will determine the performance ceiling for AI workloads.

Cisco’s partnership with NVIDIA illustrates this alignment. Their Hyperfabric AI and N9100 switches use Cisco Silicon One or NVIDIA Spectrum-X chips to link clusters across sites hundreds of kilometers apart, allowing two facilities to function as one logical data center. Each 51.2 Tbps switch supports both NX-OS and SONiC environments, offering enterprises flexibility in how they operate. The joint architecture cuts deployment time for AI clusters from months to weeks and supports neocloud and sovereign-cloud requirements where regional compliance dictates data locality.

The Secure Enterprise Network Architecture, announced the same week, extends these capabilities to branches and campuses. It introduces Wi-Fi 7 access points, SD-WAN routers, and identity-based access control managed through Intersight. Together, they create an operational framework for distributed inference that remains observable and policy-compliant. Cisco calls this AgenticOps: networks that automate self-healing and provisioning through AI agents working alongside human operators.
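The AgenticOps pattern (agents that observe telemetry, remediate routine faults autonomously, and escalate ambiguous ones to humans) can be illustrated with a simplified control loop. The rules and action names below are hypothetical placeholders for illustration, not Cisco APIs:

```python
def remediation_plan(telemetry):
    """Map per-link telemetry to candidate actions (hypothetical rules).

    Clear-cut faults get an automated action; borderline cases are
    flagged for a human operator, keeping people in the loop.
    """
    actions = []
    for link, stats in telemetry.items():
        if stats["packet_loss"] > 0.01:
            actions.append((link, "reroute"))          # automated self-healing
        elif stats["latency_ms"] > 100:
            actions.append((link, "flag_for_review"))  # human-in-the-loop case
    return actions


# Example telemetry snapshot for two branch WAN links.
telemetry = {
    "branch-wan-1": {"packet_loss": 0.03, "latency_ms": 40},
    "branch-wan-2": {"packet_loss": 0.0, "latency_ms": 150},
}
```

The point of the sketch is the division of labor: deterministic remediation where the signal is unambiguous, escalation where it is not.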

Competitors are positioning differently. HPE’s Ezmeral Edge Platform and Dell’s NativeEdge offer converged compute for similar workloads, while hyperscalers are pushing managed stacks such as AWS Outposts and Google Distributed Cloud. Cisco’s differentiation lies in its integrated networking and security model, which is an increasingly decisive factor for AI performance and governance.

For enterprises, the implication is clear: AI that once lived in centralized data centers is moving outward into factories, stores, and hospitals. Inference workloads now depend on local power, secure connectivity, and consistent governance. Regulatory pressure adds urgency: processing data where it’s generated reduces cross-border risk under the EU AI Act and California’s CPRA.

Ownership also reshapes internal culture. Cisco reports its own engineers using AI for code migration, debugging, and documentation: a “cultural reset,” as Patel called it in the a16z interview. The same pattern will apply to operations teams managing distributed infrastructure: continuous iteration, faster cycles, and integration of human and machine oversight.


Mukundan Sivaraj
Mukundan covers the AI startup ecosystem for AIM Media House. Reach out to him at mukundan.sivaraj@aimmediahouse.com or Signal at mukundan.42.
