I think you'll find that 16x, 32x AA, etc. are generally forced through the driver, which is probably a little outside the scope of a general review (no offense, TS). I think the general idea was to test the cards at a level playable for all, using in-game menu settings, and I think that was achieved.

My own personal queries regarding game selection aside, I don't think the settings in general could have been cranked up much higher without severely limiting the playability of the games. For instance, the next step up in IQ for Metro 2033 is basically 4xMSAA/16xAF: ~20 fps for the GTX 480, ~14 fps for the GTX 470, and ~3 fps for the HD 5850/5870. Maybe a big % win for the GTXs, but not playable by any means; a textbook example of a hollow victory (see the rough sketch at the end of this post). Crysis Warhead at the tested 4xAA/0xAF is what I would consider barely playable for the lower two cards, and going up a step to 4xAA/16xAF makes that a certainty.

As for 16x and beyond, I don't know if nVidia and ATI can be directly compared, as you probably know. While the new antialiasing from nVidia looks pretty good having seen it first hand, how do you compare an nVidia setting in, say, Modern Warfare 2 at 16x (or 32x) S+TMSAA/16xAF (hybrid AA) with anything that's available to an ATI card?
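To put some numbers on the "hollow victory" point, here's a minimal sketch using the rough Metro 2033 figures quoted above. The 25 fps playability cutoff is my own assumption, not something from the review, and the fps values are the approximate ones I cited, not measured data:

```python
# Rough illustration: a huge % lead can still be a hollow victory
# if neither side clears a playable frame rate.

PLAYABLE_FPS = 25  # assumed minimum for playability (my assumption)

# Approximate Metro 2033 4xMSAA/16xAF figures quoted above
fps = {
    "GTX 480": 20,
    "GTX 470": 14,
    "HD 5870": 3,
    "HD 5850": 3,
}

baseline = fps["HD 5870"]
for card, rate in fps.items():
    lead = (rate / baseline - 1) * 100  # % advantage over the HD 5870
    verdict = "playable" if rate >= PLAYABLE_FPS else "not playable"
    print(f"{card}: {rate} fps ({lead:+.0f}% vs HD 5870, {verdict})")
```

Run that and the GTX 480 shows a lead of several hundred percent over the HD 5870, yet every card still prints "not playable", which is exactly why that setting tells you little about real-world gaming.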