Moore's Law isn't what it used to be. Processor speeds have been virtually standing still for the past two years. In October 2003, Intel's fastest 32-bit processor ran at 3.2 GHz. Today, two years on, it is 3.8 GHz, an increase of about 19%. That's a far cry from the doubling predicted by Moore's law.
Don't think that Moore's law is inevitable, and that some new discovery on the horizon will see processor speeds resume the trend as if nothing happened. It shouldn't be surprising that performance increases are becoming more difficult as the technology approaches physical limits. I wouldn't be surprised if the speeds of our present processors remained quite acceptable for several years to come.
-- Tim Starling
I think this whole discussion of Moore's law is misplaced. Moore's law (even as [ab]used colloquially) doesn't imply that current processors are going to become less acceptable in the future. A machine which can handle X hits per second today should still be able to handle X hits per second three years from today. For hardware to "become obsolete before it breaks down" would require quite a tremendous advance in technology (or really shitty system scalability).
I guess in a colo facility, where you pay for every cubic inch of space, that's somewhat less true, though. If you can replace two older servers with a single faster one, you save on hosting costs, and in theory that savings could make up for the cost of the new hardware over its expected lifetime. My intuition is that new technologies aren't coming out that fast, but measuring that would require looking at the cost spread out over the full useful lifetime of the server.
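To make that tradeoff concrete, here's a back-of-the-envelope sketch. All of the numbers are hypothetical (I'm just making up a per-slot colo cost, a server price, and a lifetime to illustrate the comparison, none of these figures come from the thread):

```python
# Hypothetical consolidation math: replace two old servers with one
# faster new one, freeing a colo slot for the server's lifetime.
monthly_cost_per_slot = 100.0   # $/month per rack slot (assumed)
new_server_price = 3000.0       # $ up front (assumed)
lifetime_months = 36            # expected useful lifetime (assumed)

# Freeing one slot saves its monthly cost for the whole lifetime.
hosting_savings = monthly_cost_per_slot * lifetime_months
net = hosting_savings - new_server_price

print(f"hosting saved over lifetime: ${hosting_savings:.0f}")
print(f"net after hardware cost: ${net:.0f}")
```

With these made-up figures the freed slot saves $3600 over three years, so the $3000 server pays for itself; with a cheaper colo or a shorter lifetime it wouldn't, which is exactly why the answer depends on spreading the cost over the full useful lifetime.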
Of course this all presumes that processor speed is a limiting factor in the first place. I have no idea if that's true or not.
All that being said, Daniel's model seems to factor most of this in. In fact, it might very well factor all of it in. I have a tough time sorting through all the various documents on the wiki to really understand what's going on :).