Microsoft recently announced plans to bring NPU-optimized versions of the DeepSeek-R1 model to Copilot+ PCs, taking advantage of the dedicated AI hardware in those machines. As noted on the Windows Blog, the release will arrive first on Qualcomm Snapdragon X PCs and later expand to Intel Core Ultra 200V (Lunar Lake) and other processors. The first model to roll out will be DeepSeek-R1-Distill-Qwen-1.5B, a compact distillation that a UC Berkeley AI research group has validated as the smallest model that remains effective, with the larger 7-billion- and 14-billion-parameter variants to follow.
DeepSeek's refinements mean it reportedly needs about 11 times less computing power than its Western counterparts, making it a natural fit for consumer devices. The models also tie into the Windows Copilot Runtime, letting developers call on-device DeepSeek APIs directly from their applications.
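Microsoft hasn't detailed the exact call surface in this announcement, but the developer flow should resemble querying any locally hosted model. The sketch below is purely illustrative: it assumes the on-device runtime exposes an OpenAI-compatible chat endpoint, and the URL, port, and model name are placeholders rather than a documented Windows Copilot Runtime API.

```python
import requests

# Hypothetical request to a locally hosted DeepSeek distill exposed through an
# OpenAI-compatible endpoint; endpoint and model name are placeholders.
response = requests.post(
    "http://127.0.0.1:5272/v1/chat/completions",  # placeholder local endpoint
    json={
        "model": "deepseek-r1-distill-qwen-1.5b",  # placeholder model name
        "messages": [
            {"role": "user", "content": "Summarize these meeting notes in three bullets."}
        ],
        "max_tokens": 256,
    },
    timeout=120,
)
print(response.json()["choices"][0]["message"]["content"])
```

The appeal of on-device APIs like this is that prompts and responses never leave the PC, and inference runs on the NPU rather than a cloud GPU.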
Microsoft also says the NPU-optimized DeepSeek variant will deliver “highly competitive response times and data processing rates, while having minimal impact on battery life and PC resource consumption.” Copilot+ PC owners can therefore expect performance in the same league as major models like Meta’s Llama 3 and OpenAI’s o1 without a significant hit to battery life.
However, the deployment of DeepSeek on Copilot+ PCs appears to be more targeted towards developers and programmers rather than everyday users. This strategy might be Microsoft’s way of prompting more developers to create applications that exploit the capabilities of AI-powered PCs, especially since many consumers remain skeptical about the necessity of such technology and often choose these devices by default due to limited alternatives.
Microsoft's decision to prioritize Qualcomm Snapdragon X PCs for this release is intriguing. Copilot+ debuted on those chips in mid-2024, but the latest mainstream laptop processors from Intel and AMD now ship with built-in NPUs as well. AMD has even published guidance on running the model on its Ryzen AI CPUs and Radeon GPUs, claiming that its Radeon RX 7900 XTX handles DeepSeek better than Nvidia's RTX 4090.
Regardless of those specifics, the possibilities DeepSeek opens up for AI applications remain exciting. Because it is open source, virtually anyone can download the model, run it locally, and build on the efficiency gains the base model established.
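For instance, the 1.5B distillation is small enough to experiment with on an ordinary machine. The sketch below loads it with the Hugging Face transformers library; the model ID and prompt are illustrative, and the NPU-optimized Copilot+ builds are distributed separately through Microsoft's tooling rather than this path.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the publicly released 1.5B distill from Hugging Face (several GB download).
model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

# Format a single-turn chat prompt using the model's built-in chat template.
messages = [{"role": "user", "content": "Explain what an NPU does in one paragraph."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

# Generate a reply and strip the prompt tokens before decoding.
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

This runs on plain CPU or GPU; getting the same weights onto a Copilot+ NPU relies on the optimized builds Microsoft is shipping rather than anything shown here.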