Originally Posted by Cathar
The older silicon processes were less energy hungry because they suffered less from thermal leakage, and what leakage they did have was confined to a relatively small number of transistors (roughly 1/100th the count of today's CPUs).
As CPU manufacturing processes have shrunk and more transistors get packed into the same area, it's my understanding that the CPU manufacturers are doing all they can just to keep a lid on CPU thermal output. Transistor density is the main problem. Each transistor in use leaks some amount of energy as heat, and (speaking extremely simplistically) when you stick 100x as many transistors in the same area, what was a 1W problem before becomes a 100W problem.
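The 1W-into-100W arithmetic above can be sketched in a few lines. This is just a back-of-envelope illustration; the per-transistor leakage figure and transistor counts here are made-up round numbers, not real process data.

```python
# Back-of-envelope sketch of how per-transistor leakage adds up.
# All numbers are illustrative, not real process figures.

def total_leakage_watts(transistor_count, leakage_per_transistor_w):
    """Total static (leakage) power, assuming every transistor leaks equally."""
    return transistor_count * leakage_per_transistor_w

# Older CPU: fewer transistors, so the same per-transistor leak is a small problem.
old = total_leakage_watts(1_000_000, 1e-6)      # -> 1.0 W
# Modern CPU: 100x the transistors in the same area -> 100x the leakage.
new = total_leakage_watts(100_000_000, 1e-6)    # -> 100.0 W

print(old, new)
```

The point being: even if each individual transistor leaks no more than before, packing 100x as many into the same die area multiplies the heat you have to pull out of that area by 100x.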
Again, hugely simplified, but it does highlight the nature of the problem CPU manufacturers face. As it stands, thermal issues appear to be the main reason CPUs can't be made any faster. Sure, there are breakthroughs here and there, and some CPUs run fairly cool, but the moment you try to ramp the speed up to what the process should be capable of supporting, heat skyrockets. So instead we get CPUs that run fairly cool only because they run somewhat slowly compared to what they could do. For example, Conroe runs at 2.4GHz stock, but given sufficient cooling will do 4-5GHz.
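The "heat skyrockets" part follows from the standard CMOS dynamic-power relation, P ≈ C·V²·f: doubling the clock alone doubles switching power, and the voltage bump you usually need to hold the overclock stable multiplies it again, since voltage enters squared. A quick sketch (the capacitance and voltage figures are invented for illustration, not Conroe's real numbers):

```python
# Why heat skyrockets when you ramp clocks: dynamic (switching) power
# scales roughly as P = C * V^2 * f.
# C (effective switched capacitance) and voltages are illustrative values only.

def dynamic_power(c_eff_farads, voltage, freq_hz):
    """Approximate CMOS dynamic power in watts."""
    return c_eff_farads * voltage**2 * freq_hz

stock = dynamic_power(1e-8, 1.2, 2.4e9)   # hypothetical stock clock/voltage
oc    = dynamic_power(1e-8, 1.4, 4.8e9)   # 2x clock, plus extra volts for stability

# 2x frequency doubles power; the 1.2V -> 1.4V bump adds (1.4/1.2)^2 on top,
# so the overclocked chip dissipates about 2.7x the stock power.
print(oc / stock)
```

And that's before counting leakage, which itself gets worse as the die heats up, so real-world scaling is even less forgiving than this sketch suggests.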