Gaming Perf: Are we hitting a plateau?

Are generational performance gains not coming like they used to?

Historically, being a PC gamer meant investing in your hardware, whether to keep up with the kid next door or just to experience in full what the original Unreal had to offer, graphically. You'd spend thousands every couple of years refreshing your gaming rig with the best graphics card/processor combo any consumer would ever need, so you could flaunt your epeen by playing whatevertheheck game at ever-increasing resolutions with more shadows, more NPCs, and nifty little graphical quirks. I remember when playing Doom at 30 FPS was a feat, and that was at 320×200 back in 1994, or when Descent suddenly became a whole new game after I popped in a card built on 3dfx's newest Voodoo2 chipset (does anyone remember the loop-back cables you used to need?). It had a whopping 8 MB of RAM; 'Who would ever need that much,' I thought. Mind you, this was on a 90 MHz Pentium with a massive 64 MB of RAM, my second computer, which was a beast at a time when everyone around me had a 386SX at best (this was in Korea).

Sometime in the last decade, the paradigm changed. I say this while typing on my home desktop, an 8-year-old behemoth (Core 2 Quad at 2.66 GHz, 8 GB of RAM, Radeon 7850) that nonetheless happily chugs along with the latest games I'm interested in playing. A decade ago, I'd have been well into my second upgrade cycle, but I find little reason to upgrade given its current performance, gaming or otherwise.

I’m not alone here.

This is also evidenced by the current "next gen" systems, the Xbox One and PlayStation 4, both of which tend to fall behind even mid-range PCs of today. Given that these systems are expected to have functional lifespans of 10+ years, it seems that Microsoft and Sony are betting that pure number-crunching power isn't what will keep them on top.

Just some thoughts.