Probably the worst-coded game in existence. Nvidia and AMD are somehow at fault or dragging their heels because one (or sometimes two*) poorly coded game in three years brought GPUs to their knees? The fact that you can count the number of games using CryEngine 2 on the fingers of a Mickey Mouse hand - and have fingers to spare for future releases - should tell you how unoptimised the game's default settings are. And if that doesn't, then the massive jump in framerate once the game's IQ settings are sensibly tuned should make it apparent.
By your logic, Intel and AMD are also stagnating on CPU vectorization because of GTA IV's voracious appetite for core speed and memory bandwidth?
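(For anyone unsure what vectorization means here, a minimal, purely illustrative C sketch - the function name and parameters are my own invention - of the kind of loop a compiler auto-vectorizes into SIMD instructions, progress that doesn't hinge on any single game's engine:)

    #include <stddef.h>

    /* Illustrative only: a simple loop a compiler can auto-vectorize
       into SIMD (SSE/AVX) instructions when built with optimisation
       flags such as -O2/-O3. */
    void scale_add(float *dst, const float *a, const float *b,
                   float k, size_t n)
    {
        for (size_t i = 0; i < n; ++i)
            dst[i] = a[i] * k + b[i];  /* one multiply-add per element */
    }

Point being: SIMD width has kept growing (SSE through AVX) regardless of how any one console port behaves.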
(* The other game being Metro 2033, once its tacked-on DX11 IQ settings are enabled.)
True. It's not as if AMD have been using energy efficiency and its attendant lower heat output as a marketing point for the last year or so... oh.
BTW: Repeating your post might inflate your post count, but it doesn't add a whole lot to the debate.