Nvidia and AMD are bringing Microsoft’s Copilot Plus AI features to gaming laptops

Nvidia and AMD are gearing up to launch gaming laptops that include the Copilot Plus AI features Microsoft just announced for Qualcomm-powered laptops. At Computex today, Nvidia briefly teased that “RTX AI PC” laptops from Asus and MSI are on the way and will eventually include Copilot Plus PC features.

“Newly announced RTX AI PC laptops from ASUS and MSI feature up to GeForce RTX 4070 GPUs and power-efficient systems-on-a-chip with Windows 11 AI PC capabilities,” says Nvidia in a blog post. Nvidia confirmed to The Verge in a briefing that these laptops will come with AMD’s latest Strix CPUs.

AMD hasn’t officially detailed its Strix laptop CPUs yet, but it will undoubtedly announce them during its own keynote later today. Nvidia has also dropped a hint, though, that the first AMD-powered Copilot Plus PCs might not get Microsoft’s new round of AI features at launch.

Nvidia’s RTX AI laptops with Copilot Plus features are coming soon.
Image: Nvidia

“These Windows 11 AI PCs will receive a free update to Copilot+ PC experiences when available,” says Nvidia in a blog post. That suggests that Microsoft might not be ready to launch Recall and the other AI-powered Windows features on AMD chips, or that there could be some period of exclusivity for the Windows on Arm Qualcomm-powered hardware that launches on June 18th. Either way, we’ve reached out to Nvidia to clarify what its brief mention of this free update means.

Nvidia is also in a battle of sorts to remain relevant for AI-powered tasks on laptops. While Microsoft is pushing ahead with offloading AI models to NPUs, Nvidia is gearing up to make its GPUs useful in this AI battleground on PC. Nvidia is leaning hard into its “RTX AI laptops” branding, noting that its GPUs can handle heavier AI workloads than an NPU can.

The RTX AI Toolkit arrives in June.
Image: Nvidia

It’s even launching an RTX AI Toolkit in June that includes tools and SDKs for model customization, optimization, and deployment. These tools can take something like Meta’s Llama 2 model and optimize it to require far less VRAM while delivering better performance.
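Nvidia hasn’t published the RTX AI Toolkit’s APIs here, but the core idea, quantizing a model’s weights so it fits in far less VRAM, can be sketched with widely used open-source tooling. The snippet below is a generic illustration using Hugging Face Transformers and bitsandbytes, not Nvidia’s toolkit; the Llama 2 model ID and the 4-bit settings are assumptions for the example.

```python
# Illustrative only: 4-bit weight quantization to shrink a model's VRAM footprint.
# This uses Hugging Face Transformers + bitsandbytes as a stand-in for the kind of
# optimization Nvidia describes -- it is NOT the RTX AI Toolkit itself.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Llama-2-7b-hf"  # assumed model; any causal LM works here

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # store weights in 4-bit NF4 format
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,  # run the math in fp16 for speed
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",                     # place layers on the available GPU(s)
)

# A 7B model that needs roughly 14 GB of VRAM in fp16 fits in about 4-5 GB at
# 4-bit, which is the kind of reduction being pitched for 8 GB laptop GPUs.
prompt = "Explain what an NPU is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Nvidia’s own pipeline reportedly layers further GPU-specific optimization on top of this kind of quantization, but the VRAM savings shown here are the basic trade-off developers would be working with.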

Nvidia is also collaborating with Microsoft on the underlying AI models that are being built into Windows 11. “The collaboration will provide application developers with easy application programming interface (API) access to GPU-accelerated small language models (SLMs) that enable retrieval-augmented generation (RAG) capabilities that run on-device powered by Windows Copilot Runtime,” says Nvidia.
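Nvidia’s quote describes the capability at a high level, and the Windows Copilot Runtime API itself hasn’t been detailed. As a rough illustration of what retrieval-augmented generation with a small on-device model involves, here is a minimal, framework-agnostic Python sketch; the embedding model name and the documents are placeholders, and the retrieved prompt would then be handed to a local, GPU-accelerated SLM.

```python
# Minimal retrieval-augmented generation (RAG) sketch: embed local documents,
# find the most relevant one for a query, and build a grounded prompt for a
# small language model. Generic illustration only -- not the Windows Copilot
# Runtime API.
from sentence_transformers import SentenceTransformer, util

# Placeholder "on-device" documents the SLM should answer from.
documents = [
    "Copilot Plus PCs require an NPU capable of at least 40 TOPS.",
    "RTX AI PC laptops pair a GeForce RTX GPU with a power-efficient CPU.",
    "Recall lets Windows search a timeline of everything shown on screen.",
]

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed small embedding model
doc_vectors = embedder.encode(documents, convert_to_tensor=True)

def build_rag_prompt(question: str) -> str:
    """Retrieve the most relevant document and fold it into the SLM prompt."""
    query_vector = embedder.encode(question, convert_to_tensor=True)
    scores = util.cos_sim(query_vector, doc_vectors)[0]
    best_doc = documents[int(scores.argmax())]
    return (
        f"Answer using only this context:\n{best_doc}\n\n"
        f"Question: {question}\nAnswer:"
    )

prompt = build_rag_prompt("What NPU performance does a Copilot Plus PC need?")
print(prompt)  # this prompt would then be passed to the on-device SLM
```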

Microsoft announced its Windows Copilot Runtime at Build last month, and Nvidia says its work to accelerate AI models using RTX GPUs will be released in developer preview later this year. Microsoft’s Windows Copilot Runtime is designed to make it easy for developers to add AI-powered features to their apps, all while relying on NPU hardware to accelerate those features, with Nvidia GPUs supported soon.

Since NPUs currently sit at around the 40 TOPS performance mark and Nvidia’s PC GPUs can handle more than 1,000 TOPS of AI acceleration, there are clearly some big performance differences for developers to think about here. NPUs are designed for smaller models and the high power efficiency that matters in laptops, while GPUs can handle larger models at higher performance in desktop PCs, where battery life isn’t a concern.

It’s going to be interesting to watch this AI battle on PC shake out, especially as Microsoft holds the keys to lighting up these experiences natively in Windows for Nvidia, AMD, Intel, Qualcomm, and its many OEM partners.
