When companies talk about generative AI, they usually talk about models. What gets less attention is the plumbing those models need: searchable, indexed, and governed data that can actually feed them.
Dell Technologies just unveiled a major update to its AI Data Platform, positioning it as the backbone for enterprise AI. The rollout introduced a Data Search Engine built with Elastic, a Data Analytics Engine created with Starburst, a new Agentic Layer and MCP Server, and GPU-accelerated vector search through NVIDIA cuVS.
Inside the platform that feeds AI
At its core, the AI Data Platform separates storage from compute and unifies four building blocks: storage engines, data engines, cyber-resilience, and professional services.
The storage layer is powered by Dell’s PowerScale and ObjectScale systems, both now integrated with NVIDIA’s GB200 and GB300 NVL72 hardware: a design built for large-scale GPU environments and AI workloads.
Dell claims the PowerScale F710, which has achieved NVIDIA Cloud Partner certification, requires up to five times less rack space, 88 percent fewer network switches, and 72 percent less power than competing systems from VAST Data or Pure Storage.
The Data Search Engine, developed with Elastic, indexes billions of files across PowerScale and ObjectScale using metadata, enabling natural-language search and real-time discovery. It’s designed for RAG workflows and incremental file ingestion to keep vector databases up to date.
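Dell hasn't published the ingestion mechanics, but "incremental file ingestion" for a vector index typically means fingerprinting files and re-embedding only what changed. A minimal sketch of that idea, assuming an in-memory index and a hypothetical `embed` stand-in for a real embedding model:

```python
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> str:
    """Hash file contents so unchanged files can be skipped."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def embed(text: str) -> list[float]:
    """Hypothetical stand-in for a real embedding model."""
    return [float(len(text)), float(text.count(" "))]

def incremental_ingest(root: Path, seen: dict, index: dict) -> list[str]:
    """Re-embed only files whose content hash changed since the last pass."""
    updated = []
    for path in root.rglob("*.txt"):
        key = str(path)
        fp = fingerprint(path)
        if seen.get(key) == fp:
            continue  # unchanged file: skip re-embedding
        index[key] = embed(path.read_text())
        seen[key] = fp
        updated.append(key)
    return updated
```

Run twice over the same directory and the second pass does nothing, which is what keeps a vector database current without full re-indexing.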
GPU acceleration for this search comes from NVIDIA cuVS, a CUDA-based library that speeds up vector search and clustering. Dell says this delivers hybrid keyword + vector queries with lower latency and higher throughput: another vendor claim yet to be independently tested.
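cuVS itself is a CUDA library, but the hybrid query pattern it accelerates can be illustrated on the CPU: blend a lexical match score with vector cosine similarity and rank by the weighted sum. A stdlib-only sketch (the scoring weights and toy functions are illustrative, not Dell's or NVIDIA's actual implementation):

```python
import math

def cosine(a: list, b: list) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def keyword_score(query: str, doc: str) -> float:
    """Fraction of query terms that appear in the document text."""
    terms = query.lower().split()
    text = doc.lower()
    return sum(t in text for t in terms) / len(terms) if terms else 0.0

def hybrid_rank(query: str, query_vec: list, docs: list, alpha: float = 0.5):
    """Rank docs by a blend of lexical and vector scores.
    docs is a list of (doc_id, text, embedding); alpha weights the keyword side."""
    scored = []
    for doc_id, text, vec in docs:
        score = alpha * keyword_score(query, text) + (1 - alpha) * cosine(query_vec, vec)
        scored.append((score, doc_id))
    return [d for _, d in sorted(scored, reverse=True)]
```

The GPU version changes where the similarity math runs, not the shape of the computation: cuVS batches the vector side across thousands of queries at once.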
On the analytics side, the Data Analytics Engine, built with Starburst, allows distributed SQL queries across multiple data sources, from spreadsheets and databases to lakehouses. Its Agentic Layer uses LLMs to automate documentation and generate insights, while the MCP Server enables multi-agent applications.
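Starburst is built on the Trino engine, which lets one SQL statement join tables living in different systems. The effect can be approximated with SQLite's `ATTACH`, joining two independent database files in a single query; the table names here are hypothetical, and a real federated engine would span warehouses and lakehouses rather than local files:

```python
import sqlite3
import tempfile, os

# Two separate database files stand in for two independent data sources.
tmp = tempfile.mkdtemp()
sales_db = os.path.join(tmp, "sales.db")
crm_db = os.path.join(tmp, "crm.db")

con = sqlite3.connect(sales_db)
con.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, 120.0), (2, 80.0), (1, 40.0)])
con.commit()

crm = sqlite3.connect(crm_db)
crm.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
crm.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex")])
crm.commit()
crm.close()

# ATTACH makes the second source queryable in the same statement,
# loosely analogous to a catalog in a distributed SQL engine.
con.execute("ATTACH DATABASE ? AS crm", (crm_db,))
rows = con.execute("""
    SELECT c.name, SUM(o.amount) AS total
    FROM orders o JOIN crm.customers c ON c.id = o.customer_id
    GROUP BY c.name ORDER BY total DESC
""").fetchall()
```

The query planner, not the user, decides how to move data between sources, which is the part a distributed engine like Starburst actually optimizes.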
Arthur Lewis, President of Dell’s Infrastructure Solutions Group, said:
“The Dell AI Data Platform is purpose-built to simplify data complexity, unify pipelines and deliver AI-ready data at scale.”
Openness as strategy, and proof inside Dell
Rather than building every piece in-house, Dell has chosen to integrate partner technologies. It calls this an “open, modular” strategy (Elastic for search, Starburst for analytics, NVIDIA for compute) and argues that customers want flexibility, not lock-in.
That openness distinguishes Dell from rivals like VAST Data, which markets an all-in-one “AI Operating System,” or Pure Storage, which focuses on DGX-validated flash systems. Dell’s platform is broader: part on-prem, part hybrid, and explicitly partner-built.
The modular design also mirrors Dell’s internal overhaul. Under a secretive program known as Project Maverick, the company is consolidating roughly 4,700 applications, 70,000 servers, and more than 10,000 databases into a unified data backbone, described internally as “critical” to Dell’s modernization and AI strategy.
Dell hasn’t officially linked Maverick to the AI Data Platform, but the timelines line up: Maverick’s first phase goes live in early 2026, the same period when Dell will release the Agentic Layer and MCP Server. The implication is that Dell is stress-testing its own architecture internally before customers do.
Investor response has reinforced that story. Following the October announcement, Dell’s stock rose 2.4 percent and analysts raised price targets, citing rising enterprise AI demand. The company has projected about $20 billion in AI server shipments for fiscal 2026.
Dell also cites IDC’s 2025 AI Infrastructure Tracker, which ranks it as the world’s No. 1 provider of AI infrastructure.
Still, execution risks remain. The platform’s performance depends on high-end NVIDIA GPUs, and those chips are still in short supply as hyperscalers secure most of the inventory (Reuters, Oct 2025). If GPU bottlenecks persist, Dell’s rollout schedule, including ObjectScale S3-over-RDMA tech preview in December 2025 and the Data Analytics Engine in February 2026, could slip.
Competitively, Dell faces pressure from every side. VAST Data and Pure Storage are moving up the stack into data management; HPE is expanding its hybrid GreenLake AI platform; and public clouds like AWS, Google, and Microsoft continue to absorb enterprise workloads with managed AI services.
Dell’s response is to own the hybrid middle ground, offering enterprises an on-premises, sovereign AI foundation that still connects to the cloud when needed. For regulated sectors like finance, defense, and healthcare, that’s an attractive compromise.
Michael Dell has summed up his leadership philosophy this way: “If you don’t have a crisis, make one to get people excited, motivated, and to drive the necessary change.”
That sense of urgency runs through the company’s current transformation, internally through Project Maverick, externally through the AI Data Platform. Both are designed to push Dell to move faster than its legacy might allow.
The company now calls the AI Data Platform “a critical component of the Dell AI Factory.” The ObjectScale S3-over-RDMA feature is due for tech preview in December 2025, while the Data Analytics Engine Agentic Layer and MCP Server are scheduled for February 2026.
If Dell ships on time and performance matches its claims, the company could anchor itself as an integrator that solves enterprise AI’s hardest bottleneck: data readiness.