Startup PowerLattice's voltage regulator chiplets deliver power more efficiently, boosting AI data center performance.
No.
Did you tell them that? They're gonna waste a lot of energy because they think it will.
We need analog and probabilistic chips to make this efficient.
The comments never disappoint.
Whilst there’s loss in the power supplies of GPUs, most buck regulator circuits are 95+% efficient. So on a 400W GPU, maybe you’re losing 20W to regulation. You also lose some to resistive voltage drop between the regulators and the GPU, but that will be a fraction of the regulator drop.
Better regulation closer to the core may save some watts, but it’s such a small slice of the power pie. The main part is the GPU itself and its memory.
The savings the headline is talking about are all about dynamic voltage and frequency scaling, which GPUs (and CPUs) already do, and which only yields savings when the chip is below maximum load.
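For anyone who wants to check the arithmetic, here's a minimal sketch (the 400 W load and 95% efficiency are the assumptions from this comment, not measured figures):

```python
# Rough sanity check of the regulator-loss math above.
# Assumptions: 400 W GPU load, 95% buck converter efficiency,
# both taken from the comment, not from measured hardware.

def regulator_loss(load_w: float, efficiency: float) -> float:
    """Watts dissipated in the regulator while delivering load_w."""
    return load_w / efficiency - load_w

gpu_load_w = 400.0
eff = 0.95

loss = regulator_loss(gpu_load_w, eff)
print(f"Regulator loss: {loss:.1f} W ({loss / gpu_load_w:.1%} of GPU load)")
# -> Regulator loss: 21.1 W (5.3% of GPU load)
```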
And any reductions in power consumption would be eliminated by adding more compute capacity.
How about we just get rid of LLMs entirely? Surely that would save even more power.
Yeah, but where’s the profit in that?
I mean … there appears to be no profit in LLMs either but they are still cranking along …
I assumed I didn’t need a /s, but I guess I did
Employing the chiplets can cut an operator's power needs by 50 percent, effectively doubling performance, the company claims.
Reducing power consumption is good, but those efficiency gains don't directly translate to better performance. That claim is a stretch at best.
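To be fair, there is one reading where the claim pencils out: a power-capped facility, where halving per-GPU draw lets the operator rack twice as many GPUs within the same power budget. A minimal sketch with entirely made-up numbers:

```python
# Sketch of the one scenario where "50% less power = 2x performance"
# works out: a power-capped facility. All numbers are illustrative
# assumptions, not figures from the company.

rack_power_cap_w = 40_000   # hypothetical rack power budget
gpu_power_w = 1_000         # hypothetical per-GPU draw today

gpus_before = rack_power_cap_w // gpu_power_w
gpus_after = rack_power_cap_w // (gpu_power_w * 0.5)  # claimed 50% cut

print(f"GPUs per rack before: {gpus_before}, after: {gpus_after:.0f}")
# Per-GPU performance is unchanged; only the count per power budget doubles.
```

In any other scenario, the power savings just show up on the electricity bill, not as throughput.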
The regulator is designed to be installed in very close proximity to the chip it powers. I wonder what kind of impact it would have on the chip's ability to dissipate heat, since the regulator itself would be generating heat as well.
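A quick sketch of how much heat a near-die regulator would add, for a few assumed efficiencies (none of these figures come from PowerLattice):

```python
# Extra heat an on-package regulator dumps next to the die,
# for a few assumed efficiencies. Illustrative numbers only.

gpu_load_w = 400.0

for eff in (0.90, 0.95, 0.99):
    loss_w = gpu_load_w / eff - gpu_load_w
    print(f"{eff:.0%} efficient -> {loss_w:5.1f} W of regulator heat at the package")
# 90% -> 44.4 W, 95% -> 21.1 W, 99% -> 4.0 W
```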
No it won't; they still have to peddle AI as a product to buyers. Right now it's only C-suites and CEOs obsessing over it.