Today’s most impressive AI tools, like OpenAI’s ChatGPT, Microsoft’s Bing, Google’s Bard and the generative AI behind Adobe’s Photoshop features, run in data centers full of powerful, expensive servers. But Intel on Monday revealed details about its Meteor Lake PC processor, which could help your laptop play a bigger role in the artificial intelligence revolution.
Meteor Lake, scheduled to ship in PCs later this year, includes circuitry that speeds up some AI tasks that would otherwise drain your battery. For example, it can improve the AI that recognizes you so the background can be blurred or replaced more effectively during video conferences, said John Rayfield, who leads Intel’s client AI work.
AI models use methods inspired by the human brain to recognize patterns in complex real-world data. Running AI on a laptop or phone processor instead of in the cloud brings benefits such as better privacy and security, as well as faster response times, since there are no network delays.
What’s not clear is how much AI work will actually move from the cloud to computers. Some software, such as Adobe Photoshop and Lightroom, already makes extensive use of AI to find people, skies and other objects in photos, among many other image-editing tasks. Apps can recognize your voice and transcribe it into text. Microsoft is building an AI chatbot called Windows Copilot directly into its operating system. But most computing work today exercises the more traditional parts of the processor: its central processing unit (CPU) and graphics processing unit (GPU) cores.
Intel sees a build-it-and-they-will-come opportunity. Adding AI acceleration directly to the chip, as has already happened with smartphone processors and Apple’s M-series Mac processors, could encourage developers to write more software that uses AI capabilities.
However, GPUs are already pretty good at speeding up AI, and developers don’t have to wait for millions of us to upgrade our Windows PCs to take advantage of them. The GPU offers the highest AI performance in a PC, but the new AI-specific accelerator is better suited to low-power use, Rayfield said, and the two can be used simultaneously for maximum performance.
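To give a sense of what that choice looks like in practice, here is a minimal sketch using Intel’s OpenVINO toolkit, which can dispatch a model to the CPU, the GPU or the AI accelerator. The model file, input shape and the “NPU” device name are illustrative assumptions; the label the accelerator actually reports depends on the driver and toolkit version.

```python
# A minimal sketch of picking an inference device with Intel's OpenVINO toolkit.
# Assumptions: the model file is hypothetical, and the accelerator shows up as "NPU";
# check core.available_devices to see what your machine actually exposes.
import numpy as np
from openvino.runtime import Core

core = Core()
print("Devices on this PC:", core.available_devices)  # e.g. ['CPU', 'GPU', 'NPU']

model = core.read_model("background_segmentation.xml")  # hypothetical segmentation model

# Compile for the GPU (highest throughput) or the AI accelerator (lowest power draw).
compiled = core.compile_model(model, "GPU")   # swap in "NPU" for battery-friendly inference

frame = np.random.rand(1, 3, 256, 256).astype(np.float32)  # stand-in for a camera frame
request = compiled.create_infer_request()
outputs = request.infer({0: frame})           # e.g. a person mask used to blur the background
```

The toolkit also offers an “AUTO” device that picks hardware on the user’s behalf, which is closer to how most consumer apps would likely ship.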
Meteor Lake is a key chip for Intel
Meteor Lake is important for other reasons as well. It’s designed for lower-power operation, addressing what is perhaps Intel’s biggest competitive weakness compared with Apple’s M-series processors. And it has improved graphics acceleration, which is critical for gaming and also important for some AI tasks.
The processor is also key to Intel’s long-running turnaround effort. It’s the first major chip built with Intel 4, a new manufacturing process that is essential to catching up with chip leaders Taiwan Semiconductor Manufacturing Co. (TSMC) and Samsung. And it uses an advanced packaging technology called Foveros that lets Intel stack multiple “chiplets” into one more powerful processor more flexibly and economically.
Chipmakers are racing to get in on the AI revolution, and few have been as successful as Nvidia, which in May reported surging demand for its top-of-the-line AI chips. Intel also sells AI chips for data centers, but it focuses more on affordability than on raw performance.
In its computer processors, Intel calls its AI accelerator the vision processing unit, or VPU, a product family and name that stem from its 2016 acquisition of AI chipmaker Movidius.
These days, a variation called generative AI can create lifelike images and human-sounding text. Meteor Lake can run one such image generator, Stable Diffusion, but large AI language models like the ones behind ChatGPT simply don’t fit on a laptop.
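Some rough arithmetic shows why. The figures below are assumptions (16-bit weights, approximate parameter counts) and ignore the extra working memory inference needs, but the orders of magnitude are the point:

```python
# Back-of-the-envelope estimate of the memory a model's weights alone require.
# Assumptions: 2 bytes (16 bits) per parameter, approximate parameter counts,
# and no allowance for activations or other runtime overhead.
def weight_memory_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    return params_billions * 1e9 * bytes_per_param / 1e9

for name, params in [("Stable Diffusion (~1B parameters)", 1),
                     ("7B-parameter language model", 7),
                     ("175B-parameter language model", 175)]:
    print(f"{name}: ~{weight_memory_gb(params):.0f} GB of weights")
# ~2 GB fits comfortably in a laptop's memory; ~350 GB does not.
```

Shrinking weights to 8 or even 4 bits per parameter is one common way to pull those numbers down, which is part of how the scaled-down models mentioned below can reach PCs and phones.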
However, a lot of work is under way to change that. Facebook’s LLaMA and Google’s PaLM 2 are large language models designed to scale down to smaller “client” devices like PCs and even phones, which have much less memory.
“AI in the cloud … has challenges with latency, privacy, security, and it’s fundamentally expensive,” Rayfield said. “Over time, as we can improve computational efficiency, more of this migrates to the client.”