Whilst there’s loss in the power supplies of GPUs, most buck regulator circuits are 95%+ efficient. So on a 400W GPU, maybe you’re losing 20W to regulation. You also lose some to resistive voltage drop between the regulators and the GPU, but that will be a fraction of the regulator loss.
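Rough back-of-the-envelope, if you want to check the arithmetic (the 400W and 95% figures are the ones above; everything else is just illustrative):

```python
# Sketch of regulator loss for a given output load, assuming ~95%
# buck-converter efficiency. Not a measured figure for any real GPU.
def regulator_loss(load_w: float, efficiency: float = 0.95) -> float:
    """Watts dissipated in the regulator to deliver `load_w` at the output."""
    input_w = load_w / efficiency
    return input_w - load_w

print(round(regulator_loss(400), 1))  # ~21.1 W lost in regulation
```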
Better regulation closer to the core may save some watts, but it’s such a small slice of the power pie. The main part is the GPU itself and its memory.
The savings the headline is talking about are all about dynamic voltage and frequency scaling (DVFS), which GPUs (and CPUs) already do, and which only gives savings when the chip is below maximum load.
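To see why DVFS dwarfs regulator losses, here’s a toy model using the usual dynamic-power relation P ∝ V²·f. The scaling factors are illustrative assumptions, not measured GPU numbers:

```python
# Toy DVFS model: dynamic power scales roughly as V^2 * f, so dropping
# clock (and voltage with it) at partial load saves far more than
# shaving a few watts off regulation.
def scaled_dynamic_power(base_power_w: float, v_scale: float, f_scale: float) -> float:
    """Dynamic power relative to a baseline, using P proportional to V^2 * f."""
    return base_power_w * (v_scale ** 2) * f_scale

# e.g. running at 80% clock with a matching ~90% voltage:
print(round(scaled_dynamic_power(400, v_scale=0.9, f_scale=0.8)))  # ~259 W
```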