A gamer and a graphic artist at the costume retailer HalloweenCostumes.com teamed up to create a mashup of the animated character Lara Croft across nineteen years of the Tomb Raider video game series. They realized it also illustrates the unfolding of Moore's Law in console and PC video hardware more convincingly than the usual logarithmic-scale bar and line charts.
When the series began in the mid-1990s, the small number of polygons and the simple shading models used to render the character were painfully obvious. Contrast that with the nearly lifelike renderings of Lara in the 2013 and 2014 editions, which take advantage of orders-of-magnitude more capable hardware to employ sophisticated modeling and rendering techniques, not to mention richer gameplay.
More detailed histories of Tomb Raider can be found here (2008), here (2011), and here (2013).
(Score: 0) by Anonymous Coward on Sunday February 22 2015, @03:59AM
Don't mention gameplay. Just don't. We all know every elite gamer is a graphics bigot.
(Score: 1, Insightful) by Anonymous Coward on Sunday February 22 2015, @04:34AM
All power to them. More money thrown at NVIDIA and AMD GPUs has resulted in beefier GPUs making their way into supercomputers (along with Intel Xeon Phi) and has promoted more parallel computing. Now NVIDIA is looking to put GPUs into self-driving cars [theregister.co.uk]. The "elite gamer" buys the expensive 4K screen one year; that screen is $600 the next, soon to be $200. Your phone will have the power of a Titan eventually.
As for gameplay, there are plenty of games with both great visuals and great gameplay, plenty of pixel-art titles to choose from if you're into that, and plenty of Kickstarter-funded indie titles challenging the dominance of EA/Activision/whatever.