Julia Talevski
Editor ARN | Reseller News

Dell launches AI Factory updates with managed services

New products aim to speed AI enterprise adoption

Credit: Dell Technologies World 2024

As organisations continue to make strides in deploying AI, Dell Technologies has revealed significant updates to its AI Factory and its core infrastructure portfolio during Dell Tech World 2025.

Since launching the Dell AI Factory a year ago, Dell has notched up more than 200 new releases and over 3,000 customer wins, and counting.

During the global conference, Dell emphasised the importance of an open ecosystem and managed services to simplify enterprise AI adoption.

Managed services for the Dell AI Factory with Nvidia were a key highlight, offering a purpose-built set of services that simplify AI for businesses and help them navigate its complexities.

“We manage the full stack of NVIDIA technologies, ensuring seamless integration across hardware, software and their use cases. This eliminates operational hurdles so organisations can focus on outcomes, rather than the infrastructure, software and data that’s required to power these important AI applications,” Dell Technologies senior director of professional services marketing Matt Toolan said during a media pre-briefing.

“We know that AI complexity and resource gaps are a major challenge for our customers. This is why this solution is designed to address this head-on. For businesses struggling with limited in-house expertise or stretched teams, we provide the skills and bandwidth needed to manage and optimise the customer’s AI operations.

“By handling all the heavy lifting, we free up the internal resources to focus on their key strategic initiatives to ensure reliability.”

The solution provides 24/7 proactive monitoring, reporting, version upgrades and patching, helping teams overcome resource and expertise constraints by providing cost-effective, scalable and proactive IT support.

“Customers can confidently pursue innovation without downtime concerns,” he said.

“Many of our customers are doing PoCs in the cloud, and they need managed services to handle the complexity of their production environments on-prem, and moving from that cloud environment to on-prem has been a major roadblock for them.”

New products aim to speed AI adoption

Dell Technologies senior vice president of product marketing Varun Chhabra said the new updates across AI, PCs, servers, storage and networking solutions, integrations with AI ecosystem partners and services were all designed to do what the Dell AI Factory was created for: “to help speed up the deployment and adoption of AI within enterprise organisations”.

Chhabra said 65 per cent of organisations had successfully transitioned AI proofs of concept to production within the past year, signalling growing maturity and operational readiness in enterprise AI adoption.

Chhabra said nothing was more important to the success of AI deployments and initiatives than the data powering them.

Enhancements were also announced to the Dell AI Data Platform to support large-scale AI deployments, featuring ObjectScale, PowerScale and PowerEdge XE servers, including Project Lightning, which was revealed last year and will be available in the market this year.

“We’ve made significant progress with it. Our internal testing now shows that Project Lightning outpaces parallel file systems by up to two times faster and offers 67 per cent faster data access than its nearest competitor,” he said.

“We believe that when Project Lightning will roll out later this year, it will be the world’s fastest parallel file system, which will really help with organisations that are looking to take advantage of high performance AI workloads like training, inferencing or large scale agentic AI workloads.”

Turning to the specific advances in the Dell AI Platform for Nvidia environments, Chhabra said it delivers a single integrated stack that combines Dell’s infrastructure and storage solutions, including PowerScale and ObjectScale, and data engines with Nvidia technologies such as agentic AI workflows.

“This powerful combination packages all of the hardware and the software needed by customers to run their NVIDIA workloads into one turnkey solution,” he said.

Chhabra explained the platform is built on a disaggregated architecture, which allows businesses to scale compute, storage and networking independently, empowering organisations to adapt and grow the AI data platform as required.

“This solution is really aimed at building and helping customers adopt AI faster than they have ever before, to power their agentic AI and other AI workloads,” he said.

According to Dell, 79 per cent of production workloads are now running outside the public cloud, in data centres, on PCs or at the edge.

“Data, as we have been saying for quite a few years now, is really the foundation of a successful AI deployment,” he said. “89 per cent of organisations believe that data is key to their AI strategy, driving a need for organisations to prioritise the use of data protection as well as integration of data across various systems.”

New professional services are being rolled out to help customers implement a data-as-a-product strategy and streamline data lifecycle management for greater efficiency.

“One of the challenges that customers see with the ecosystem is that AI technology is rapidly evolving, and there are a lot of different vendor solutions to keep track of,” he said.

“We have been taking a lot of effort over the last two years to make sure that we’re making it easier than ever before for customers to be able to deploy the latest ecosystem innovations on top of Dell infrastructure.”

An open ecosystem of ISV providers and open-source innovations in the software layer have also been tested and validated, so customers can deploy the latest innovations from AMD and Dell.

“Our ISV ecosystem is expanding by leaps and bounds as well,” he said.

Dell is collaborating with a host of AI ecosystem players, such as Cohere, to provide autonomous workflows; Google, to run Gemini and Google Distributed Cloud on-premises on PowerEdge XE9680 and XE780 servers; Meta, to build agent-based enterprise AI applications with Llama using its latest Llama Stack and models; Glean, to run scalable AI agents; and Mistral AI, to deploy customisable AI applications and workflow solutions jointly engineered with Dell.

The new Dell AI Platform with Intel helps enterprises deploy a full stack of high performance, scalable AI infrastructure using Intel Gaudi 3 AI accelerators.

New AI PCs

On the product front, Dell introduced the new Pro Max Plus notebook, featuring Qualcomm’s AI 100 PC inference card.

This builds on the three new product families released earlier in the year: the Dell Pro, Dell Pro Max and Dell Pro Max high-performance AI PCs.

The Dell Pro Max Plus, revealed during Dell Tech World, includes the industry’s first enterprise-grade discrete NPU in a mobile form factor, leveraging Qualcomm’s AI 100 PC inference card to “supercharge inferencing at the edge”.

“We’re focused on making AI innovation more portable and certainly more powerful,” he said.

“There are lots of advantages in general, of running large AI models on the AI PC itself. This really enables very fast responses, low latency responses, and with data staying local as well.”

Julia Talevski travelled to Dell Tech World in Vegas as a guest of Dell Technologies.


Julia Talevski

With years of experience covering the latest technology trends and business news across the IT channel, Julia Talevski has been keeping the IT industry connected in Australia and New Zealand. She is currently the editor for ARN and Reseller News, responsible for keeping the community engaged at every touch point through our newsletters, websites and main events such as EDGE, WIICTA and Innovation Awards.
