There's no avoiding AI and LLMs this year. The technology is being stuffed into everything, from office software to phone apps. Nvidia, Qualcomm, and others are happy to push the notion that this machine learning must be performed on an accelerator, be it a GPU or an NPU. Arm, meanwhile, this week made the case that its CPU cores, found in smartphones and countless other devices worldwide, are ideal for running AI software. For this week's Kettle, our journalists discuss the merits of running AI workloads on CPUs, NPUs, and GPUs; the power and infrastructure needed to do so, from personal devices to massive datacenters; and how this artificial intelligence is being used – what with Palantir's AI targeting system being injected into the entire US military.
With US Editor Iain Thomson, Editor in Chief Chris Williams, Systems Editor Tobias Mann, and Reporter Brandon Vigliarolo