Old 2007-10-30, 20:41   #7
sdbardwick
Aug 2002
North San Diego County

23×5×17 Posts

Originally Posted by Anonymous View Post
I can understand how power usage could rise for a laptop when it's doing DC work, because laptops usually step down the CPU clock when not under heavy use to save power, but how would a desktop use more power doing distributed computing? Do some desktops clock down the CPU when it's not being used, too?
More transistors in the CPU switching more of the time means more dynamic (switching) power, plus more current leakage = more power used.[/oversimplification]
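To put rough numbers on that one-liner: CPU power is usually modeled as dynamic switching power (activity factor × effective capacitance × V² × frequency) plus a static leakage term. The sketch below uses made-up but plausible 2007-era desktop values; every constant is an illustrative assumption, not a measurement of any real chip.

```python
# Back-of-envelope CMOS power model. All numbers are illustrative
# assumptions, not data from any specific processor.
#   P_dyn  = a * C_eff * V^2 * f   (a = fraction of transistors switching)
#   P_leak = V * I_leak            (leakage flows whether busy or idle)

def cpu_power(activity, c_eff=30e-9, volts=1.3, freq=2.4e9, i_leak=5.0):
    """Rough CPU power draw in watts for a given activity factor (0..1)."""
    p_dynamic = activity * c_eff * volts**2 * freq
    p_leakage = volts * i_leak
    return p_dynamic + p_leakage

idle = cpu_power(0.05)   # mostly-idle desktop: few transistors switching
busy = cpu_power(0.90)   # crunching a DC work unit at full tilt
```

Even at a fixed clock (no laptop-style frequency scaling), the busy case draws far more power than the idle case, because the dynamic term scales with how much of the chip is actually toggling each cycle.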

Last fiddled with by sdbardwick on 2007-10-30 at 20:42