Microsoft pushes for dominance in AI race

Author: TechTarget | Published: 2023/05/30

Microsoft's campaign to make it easy for enterprise IT users to integrate ChatGPT's generative AI technology into business, consumer and developer applications moved forward Tuesday, as the cloud giant seeks to dominate the AI arms race.

The vendor rolled out a series of updates to ChatGPT and the Bing search engine, launched a new plugin platform, and secured partnerships with key technology players, including AI hardware and software provider Nvidia.

The product announcements and other developments came at Microsoft Build, the vendor's developer conference, held in person in Seattle and streamed to a virtual audience.

"It's about really creating opportunities for developers to reach all users across all these surface areas," said Microsoft CEO Satya Nadella, referring to the vendor's myriad of developer and business applications.

ChatGPT and Bing

One way the tech giant is helping developers reach a wider audience is by making Bing search part of OpenAI's ChatGPT.

This change not only gives ChatGPT users access to Microsoft's search engine, it also brings ChatGPT up to date: the model was previously trained only on data through 2021. Users of the paid service can immediately ask ChatGPT real-time questions, closing one of the chatbot's main gaps.

Users of the free version will get access to the up-to-date ChatGPT later through a plugin, Microsoft said. The vendor did not give a specific date.

The move to make Bing search part of ChatGPT is logical, especially since a large language model like ChatGPT is more helpful when it can connect with the world in real time, said Forrester Research analyst Rowan Curran.

The closer ties to Bing search -- a longtime competitor to archrival Google's search engine -- also make sense, since Microsoft is bringing a number of new plugins into its ecosystem, Curran added.

Plugins

Microsoft revealed that it has adopted the same open plugin standard used by its generative AI partner OpenAI.

Developers now have access to more than 50 plugins on a new platform that lets users connect to consumer and business applications, including ChatGPT, Bing, Microsoft 365 Copilot and Dynamics 365 Copilot.

Developers can also create, test and deploy their plugins.

These plugins basically allow these models to be... not just brains in a box. They allow them to have hands and feet and other ways of interacting with the world.
Rowan Curran, Forrester Research analyst

"The plugin is 100% necessary to make these large language models useful at deployment time," Curran said. "These plugins basically allow these models...not just a brain in a box. They allow them to have hands and feet and other ways to interact with the world."

Instead of just asking an LLM questions, developers and other users can use plugins to interact with outside systems and perform actual operations in the real world.
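
For readers who want a concrete picture of that open standard, the sketch below builds the kind of JSON manifest OpenAI's plugin format uses: a small file that names the plugin and points the model to an OpenAPI description of the operations it exposes. The plugin name, URLs and email address here are hypothetical placeholders, and the exact field set should be checked against OpenAI's and Microsoft's current plugin documentation.

```python
# Sketch of a plugin manifest in the open plugin standard the article
# describes (OpenAI's ai-plugin.json format). All names, URLs and the
# contact address are hypothetical placeholders, not a real service.
import json

manifest = {
    "schema_version": "v1",
    "name_for_human": "Order Tracker",      # example plugin, not a real one
    "name_for_model": "order_tracker",
    "description_for_human": "Track orders from an example retail system.",
    "description_for_model": "Look up order status by order ID for the user.",
    "auth": {"type": "none"},
    "api": {
        "type": "openapi",
        # The OpenAPI spec tells the model which operations the plugin exposes.
        "url": "https://example.com/openapi.yaml",
    },
    "logo_url": "https://example.com/logo.png",
    "contact_email": "support@example.com",
    "legal_info_url": "https://example.com/legal",
}

# Plugin hosts typically serve this file at /.well-known/ai-plugin.json.
with open("ai-plugin.json", "w") as f:
    json.dump(manifest, f, indent=2)
```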

The plugin strategy is also an effective way for the tech giant to offer AI across its portfolio, especially through its Copilot applications, said Karl Freund, founder and analyst at Cambrian AI.

Microsoft defines Copilots as applications that use AI and LLMs to help users perform cognitive tasks.

By using plugins to connect its different Copilot applications, "Microsoft is becoming the leading cloud provider for AI applications," Freund said.

While Google will continue to be a worthy competitor, it doesn't have the Copilot functionality that Microsoft offers across such a wide range of applications, he added.

In addition to building bridges through plugins for its Copilot apps, Microsoft is forging key new strategic partnerships.

Strategic partnerships

One of these is an alliance with Nvidia, one of the most dynamic independent players in AI.

Microsoft's Azure Machine Learning service will now integrate with Nvidia AI Enterprise software. The integration will help Azure customers build, deploy and manage custom applications using the 100 Nvidia AI frameworks and tools, backed by Nvidia's AI platform.
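
As a rough sketch of what that integration can look like from the developer side, the snippet below submits a training job to Azure Machine Learning that runs inside an Nvidia GPU container environment. The subscription, workspace, compute cluster and container image tag are placeholders; in practice the image would come from Nvidia's enterprise software catalog.

```python
# Hedged sketch: submit an Azure Machine Learning job that runs in an
# Nvidia GPU container. Subscription, workspace, compute and image names
# are placeholders, not values from the article.
from azure.ai.ml import MLClient, command
from azure.ai.ml.entities import Environment
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

# Environment built on an Nvidia container image (placeholder tag).
nvidia_env = Environment(
    name="nvidia-gpu-env",
    image="nvcr.io/nvidia/pytorch:23.04-py3",
)

job = command(
    code="./src",              # local folder containing train.py
    command="python train.py",
    environment=nvidia_env,
    compute="gpu-cluster",     # name of an existing GPU compute target
)

returned_job = ml_client.jobs.create_or_update(job)
print(returned_job.studio_url)
```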

"It's a natural for Microsoft Azure," Freund said.

The partnership gives Microsoft users the benefits of Nvidia AI Enterprise -- a platform for building and customizing AI products, he said.

Enterprises that want to build domain-specific versions of ChatGPT can use NeMo, Nvidia's cloud-native framework for building LLMs, to do so.
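
The article does not describe NeMo's API, but the general workflow it points to -- adapting a base LLM to proprietary, domain-specific text -- looks roughly like the sketch below, shown with Hugging Face Transformers as a generic stand-in rather than NeMo itself. The model name, corpus file and hyperparameters are illustrative placeholders.

```python
# Generic domain-adaptation sketch (a Hugging Face stand-in, not NeMo).
# Model name, corpus path and hyperparameters are illustrative placeholders.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base_model = "gpt2"  # placeholder; a real deployment would use a far larger LLM
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base_model)

# Proprietary, domain-specific text in a local file (hypothetical path).
dataset = load_dataset("text", data_files={"train": "domain_corpus.txt"})["train"]
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
    remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="domain-llm",
                           num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False),
)
trainer.train()
```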

Despite Microsoft's close relationship with OpenAI, the Nvidia AI Enterprise integration fills some hardware and software gaps that OpenAI, a software vendor and research organization, cannot.

"Companies need to find a path to identify the best AI opportunities from their proprietary data and build an AI strategy that handles everything from training to model development to governance," said Futurum Research analyst Daniel Newman.

"Nvidia provides enterprises with a more complete set of tools that can work with ChatGPT and other open source Llms," he continued.

At the same time, "Microsoft needs to use Nvidia chips on their Azure platform, and that's mostly what they have to do to make sure that both their OpenAI and Copilot work successfully," said Jack Gold, an analyst at J. Gold Associates.

While Nvidia works with several cloud providers, including Google, there is no conflict of interest. Nvidia positions itself as an AI "arms provider," building and supplying the "weapons" that cloud providers deploy, Freund said.

Microsoft also revealed a new responsible AI initiative. The cloud provider's new Azure AI Content Safety service aims to help developers create safe online environments, with models that can detect inappropriate content in images and text. Content Safety is now in preview.
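
A rough illustration of how a developer might call such a content safety service from code appears below; it posts a piece of text for analysis over REST. The endpoint path, API version and response shape are assumptions based on the preview-era service and should be checked against current Azure documentation.

```python
# Hedged sketch: screen a piece of user text with the Azure AI content
# safety service over REST. The endpoint path, API version and response
# fields are assumptions from the preview-era API, not confirmed values.
import os
import requests

endpoint = os.environ["CONTENT_SAFETY_ENDPOINT"]  # e.g. https://<resource>.cognitiveservices.azure.com
key = os.environ["CONTENT_SAFETY_KEY"]

response = requests.post(
    f"{endpoint}/contentsafety/text:analyze",
    params={"api-version": "2023-04-30-preview"},  # assumed preview version
    headers={"Ocp-Apim-Subscription-Key": key,
             "Content-Type": "application/json"},
    json={"text": "Example user-generated comment to screen."},
    timeout=30,
)
response.raise_for_status()

# The service returns per-category severity results (e.g. hate, violence,
# self-harm, sexual content); print the raw JSON rather than assuming fields.
print(response.json())
```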

Esther Ajao is a news writer covering artificial intelligence software and systems.


