Not so long ago, ENIAC required about 170 kW on average because it relied on very power-hungry devices, vacuum tubes, for its Boolean logic. Vacuum tubes were replaced by semiconductors, and a new era of computing began. Since the eighties, microprocessors have been getting faster, but they often also consume more power, creating a sort of deceptive trade-off.
The first Pentium III, clocked at 450 MHz, drew a little under 35 watts, and if you compare it with the last one, at 1.4 GHz, the result is astonishing: they consume roughly the same power, with the higher-clocked part even drawing a couple of watts less.
Right now, though, these figures are no longer being respected, which is why power supplies for personal computers are quickly raising their power ratings.
Some Pentium 4 HT models climb to 115 watts, nearly a multiple of their little brother from the previous generation. For example: is a Pentium III at 1.4 GHz and 31.2 W more or less the same clock-per-watt ratio as a Pentium 4 at 3.6 GHz and 115 W? Run the numbers a little more carefully and it isn't: those Pentium III chips, stacked, would deliver more aggregate clock speed at a lower power requirement.
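A quick back-of-the-envelope check of that ratio, as a sketch in Python. It uses only the clock and TDP figures quoted above, and treats clock speed per watt as a crude efficiency proxy (a simplification, since hertz are not directly comparable across architectures, as discussed below):

```python
# Clock speed per watt as a crude efficiency proxy,
# using the figures quoted in the text (GHz, watts).
def clock_per_watt(ghz, watts):
    return ghz / watts

p3 = clock_per_watt(1.4, 31.2)    # Pentium III, 1.4 GHz, 31.2 W
p4 = clock_per_watt(3.6, 115.0)   # Pentium 4 HT, 3.6 GHz, 115 W

print(f"Pentium III: {p3:.4f} GHz/W")   # ~0.0449
print(f"Pentium 4:   {p4:.4f} GHz/W")   # ~0.0313
# The Pentium III delivers about 43% more clock per watt,
# so the two ratios are not the same after all.
print(f"Pentium III advantage: {p3 / p4:.2f}x")
```

By this rough metric the older chip comes out ahead, which is the point the comparison is making.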
Perhaps the shift in marketing policy and hardware infrastructure is worth analyzing through the change from the Intel Core i7-970 Gulftown at 3.2 GHz, consuming 130 W, to the six-core Core i7-980X Extreme Edition Gulftown at 3.333 GHz, also 130 W. It makes obvious that multicore processors cannot give each core the full power of a standalone processor; if they did, they would consume more than 500 W! Even here, with a smaller transistor process (32 nm), the old 1.4 GHz Pentium III posts a better figure. A larger processor cache should speed up many tasks, but multicore technology can also be a nightmare if the software doesn't support it and forces the user to run a task on a single core.
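To make that 500 W remark concrete, here is a minimal sketch using only the figures quoted above: if each of the i7-980X's six cores drew what a standalone 3.6 GHz Pentium 4 draws, the naive total would blow far past the chip's actual package budget.

```python
# Naive power scaling: six cores, each hypothetically drawing
# the TDP of a whole 3.6 GHz Pentium 4 HT (115 W, quoted above).
cores = 6
per_core_naive = 115.0                  # W, if a core cost as much as a full Pentium 4
naive_total = cores * per_core_naive    # 690 W, well over the 500 W mark

actual_tdp = 130.0                      # W, Core i7-980X package TDP

print(f"Naive total: {naive_total:.0f} W")
print(f"Actual TDP:  {actual_tdp:.0f} W")
# In reality the cores share one power budget:
print(f"Roughly {actual_tdp / cores:.1f} W per core")
```

The gap between 690 W and 130 W is exactly why the cores must share a power budget rather than each running like a full standalone processor.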
Either way, what is happening with these numbers? Is the market heading back toward the ENIAC mark? Supercomputers are now built by stacking CPUs, as if everyone knew that doubling the power also doubles the processing capability.
Clock speed can confuse the consumer as well, with hertz meaning different things from one chip to another. Must it? Not with lists like http://www.cpubenchmark.net/, where users can check a processor's benchmark score and judge for themselves whether the electricity bill should get more expensive and we should feel guiltier about an irresponsible use of resources. Playing with global energy consumption to maintain a virtual sense of hardware evolution and sales seems a lamentable idea. Electricity now means more global pollution in the form of CO2 emissions, when not nuclear waste or extreme risk at the plants, so let's remember it.