
It still feels insane that a CPU can draw that much power. I mean that's like over 8 amps at 12 volts for doing... well nothing really. It's 100 watts of pointless resistance losses.

Something that could've powered like half of an electric scooter at top speed. And GPUs are like 3-4 times worse.



With its 8 amps (more like 80 A at typical core voltages), it retires potentially hundreds of billions of instructions per second. That is a long way from nothing in my book.


Well sure, but would you count that as actual "work" in the physics sense? I think not, or at least not to any notable degree.

There's no practical work being done in a CPU, it's just throwing away charge so it can redirect it around and do some calculations in the process, like a river flowing downhill with some mechanical logic gates in the stream.


Maybe we'll look back on this era once we have a 100x improvement in energy use.

I'm not sure we should be surprised: state changes cost energy, and lots of transistors toggle on lots of clock pulses at GHz frequencies.
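The usual first-order model for that is dynamic switching power, P = alpha * C * V^2 * f. A quick sketch (every number below is an illustrative guess, not a measurement of any real CPU):

```python
# Back-of-the-envelope dynamic (switching) power estimate for a CMOS chip,
# using the standard first-order formula P = alpha * C * V^2 * f.
# All parameter values are made-up illustrative numbers.

def dynamic_power(c_switched_farads, v_core, freq_hz, activity=0.1):
    """Average power spent charging/discharging gate capacitance.
    activity = fraction of nodes that toggle each clock cycle."""
    return activity * c_switched_farads * v_core**2 * freq_hz

# Example: ~200 nF total switched capacitance, 1.2 V core, 4 GHz clock,
# 10% of nodes toggling per cycle.
p = dynamic_power(200e-9, 1.2, 4e9, activity=0.1)
print(f"{p:.0f} W")  # about 115 W
```

Note the V^2 term: that's why dropping the core voltage a little buys a lot, and why mobile chips chase low voltages before low clocks.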


The main problem, I guess, is that x86/x64 isn't designed for power efficiency so much as pure speed. RISC does a lot better, and biological computers are a few orders of magnitude more efficient still. So there's definitely a lot of room for improvement.


Incandescent light bulbs ...


It's worse than you think.

During switching, both the n and p transistors in a CMOS circuit conduct simultaneously for a brief moment. During that overlap you are essentially shorting your power supply to ground!
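That shoot-through spike can be estimated the same back-of-the-envelope way, treating the current pulse during the overlap as a triangle. Again, every value here is an illustrative guess, not data for a real process:

```python
# Rough short-circuit ("shoot-through") power estimate: during each output
# transition, both transistors conduct for a short overlap time, drawing a
# current spike straight from VDD to ground.
# All parameter values are illustrative guesses.

def short_circuit_power(i_peak_amps, t_overlap_s, v_dd, freq_hz, n_gates,
                        activity=0.1):
    """Average power from the brief VDD-to-ground path on each transition.
    Approximates the current spike as a triangle: E = 0.5 * I_peak * t * V."""
    energy_per_transition = 0.5 * i_peak_amps * t_overlap_s * v_dd
    return energy_per_transition * freq_hz * n_gates * activity

# 50 uA peak spike, 10 ps overlap, 1.2 V, 4 GHz clock,
# 100 million gates, 10% toggling per cycle.
p = short_circuit_power(50e-6, 10e-12, 1.2, 4e9, 100e6, activity=0.1)
print(f"{p:.1f} W")  # about 12 W
```

With these made-up numbers it comes out to a meaningful slice of the chip's budget, which matches the usual rule of thumb that shoot-through is a minority (but not negligible) share of total dynamic power.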


"Dear god."

"There's more."

"NO."



