Dell partners with Nvidia to enter generative AI market

Author: techradar | Time: 2023/05/30 | Read: 5892

Dell Technologies has entered the generative artificial intelligence market.

The computer hardware and software provider launched a joint project with AI provider Nvidia to make generative AI capabilities available to Dell customers on premises to meet their business needs.

Dell revealed the partnership, called Project Helix, on May 23 at the Dell Technologies World 2023 conference in Las Vegas.

Customers will have access to Dell servers such as the PowerEdge XE9680 and PowerEdge R760xa for AI training and inference, as well as Nvidia H100 Tensor Core GPUs and Nvidia networking. The project also includes Nvidia AI Enterprise software, which supports NeMo, Nvidia's large language model framework, and NeMo Guardrails for building responsible generative AI chatbots.

AI in a box

"What Dell is trying to do is basically give you AI in a box," said Jack Gold, an analyst at J. Gold Associates.

Compared with cloud providers such as Google or Microsoft, Dell focuses on enterprises that need their own infrastructure.

"Dell is also investing in hardware and saying, 'We'll sell you the whole box with the software,'" Gold said.

While other vendors, such as SambaNova Systems, also offer complete generative AI packages spanning hardware and software, Dell's differentiating factor is its Apex services portfolio, Gold said.

However, more providers are expected to offer such services soon, he continued. "But if you're a Dell house, buying this from Dell is attractive because now they're giving you capabilities that you didn't have before."

Some disadvantages

For businesses looking for on-premises generative AI infrastructure, the initial expense is a drawback. However, the cost is still likely to be lower than running LLMs in the cloud.

"In the long run, it might be more affordable, depending on what you're doing, because you're not paying the service provider's margin," said Cambrian AI analyst Karl Freund. "So while the upfront cost of doing that is more High, but in any type of large-scale operation, it will be cheaper to operate on-site."

Another downside could be security, Gold said. Businesses will have to trust that what Dell and Nvidia provide is secure and that they are not at risk of losing data in the event of a breach.

Businesses may also have to rely on Dell's IT team to keep the infrastructure running smoothly, Gold added.

However, that is also an advantage for businesses that lack the talent, he said. Businesses without enough IT staff to manage their infrastructure will appreciate Dell's team handling that task for them.

The main challenge for Dell, despite being one of the largest OEMs, will be proving how its infrastructure stands out.

"Every OEM will find a way to do this, but it may not be obvious to IT users," said Daniel Newman, an analyst at Futurum Research. "This will be Dell and others proving their differentiation."

According to the vendor, Project Helix systems will be available through traditional Dell channels, with an Apex flexible consumption option starting in July.

The Dell conference started on May 22 and will be held in person and virtually until May 25.

Esther Ajao is a news writer covering artificial intelligence software and systems.
