"Great article, thanks! Nice to see some benchmarks at 1600p rather than the inferior, but more prevalent, 1440p."
Inferior my arse, nothing like a snob. 1440p is really good, and of course there are better resolutions; for me that would be 4K over the inferior 1600p.
"I think this article is lacking, as VERY little attention was paid to the pricing of the cards. Probably not so simple to do, but that one sentence and one row in a table really doesn't cut it. I find this strange, as reviewing the evolution of Nvidia cards should also take into account the price increases for the high-end cards..."
Should the article also take into account the price differences worldwide to accommodate your wish? There has to be a stopping point somewhere, so why not stop where they did?
"Inferior my arse, nothing like a snob. 1440p is really good, and of course there are better resolutions; for me that would be 4K over the inferior 1600p."
1440p, or 16:9, actually displays more of the scene than 1600p, or 16:10, in games; every game I have come across is coded this way.
Would it be possible to do this testing going back to the earliest cards, like the GeForce 256 and GeForce 2/3? I know it's impossible to test with the same set of games, but by using multiple generations of games that overlap across a few GPU generations, it should be possible to reach a reasonable conclusion about the relative power of the GPUs.
I know it would be a lot of work, but it would be just so interesting. Pretty please?
To what end? I am not sure digging up an old AGP system to show how badly these 16-year-old GPUs perform in any game made in the last decade would be a good use of my time.
To what end? There's already a reason stated in the article. "Whether you are looking to upgrade from an older GPU or simply appreciate the statistical significance of the data presented, ..."
I haven't seen it done anywhere. But wouldn't it help appreciate the advances GPUs have made over all these years? Besides, it's simply interesting. Why else was this done in the first place?
Those line graphs tell a big story. The steep upturn for the 1080 is very apparent. I wonder if GPUs can keep that momentum in the next generation of cards, or will they flatten out into minor evolutionary upgrades as the 780 through 980 did? Only time will tell, I suppose. But it's good to see the chips are in a much better place.
"Great article, thanks! Nice to see some benchmarks at 1600p rather than the inferior, but more prevalent, 1440p."
1600p and 1440p have the same pixel density; they are equal in that respect. The difference is that 1600p is 16:10 and 1440p is 16:9, so the former is closer to square and the latter is more widescreen.
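For what it's worth, a quick back-of-the-envelope comparison (assuming the usual 2560-wide variants of each): 2560 × 1600 = 4,096,000 pixels versus 2560 × 1440 = 3,686,400 pixels, so 1600p pushes roughly 11% more pixels per frame at the same width.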
"Why not test these cards using games like Dirt Rally, Grid, and Game Stock Car Extreme, to name a few? They are very demanding on frame rates, just like the other games. Last year racing gamers spent 265 million on games, never mind how much they spent on computer gear, so they are worthy of being tested."
Gee, how many other people will make the same request with THEIR favorite games? Do you expect Techspot to test 200 games? If the games you mentioned are as demanding as Crysis 3 or Metro Redux, use yer noggin and figure that the frame rates would be similar. It's not that complicated...
"Would it be possible to do this testing going back to the earliest cards, like the GeForce 256 and GeForce 2/3?"
What's the point? That's like asking Car and Driver to test a '74 Dodge Dart. It's meaningless in 2016.
"If you are rocking a 16-year-old GPU you probably aren't interested in gaming, or modern computing for that matter. You are talking about a DX7 part here that uses an interface I haven't seen in a decade. How would we even go about making a meaningful comparison?"
Wow, do you think Techspot is your butler, fulfilling your every whim no matter how stupid or unpopular? Use your imagination: they sucked then and they really suck now. Or better yet, start your own website and do these utterly meaningless (and NON-interesting) tests yourself. Your arrogance is shocking.
Outside of people building "classic" computers, nobody is. But that would not be the point; there's value in charting progress in itself. Or, as you put it, to "appreciate the statistical significance of the data presented."
You can't compare directly via fps, but you can do it relatively. Take the lowest couple of cards, like the 480 and 580 (or even the 680), and retest them along with older cards like the GTX 280, 9800 GTX, etc., or whatever makes sense in terms of Direct3D or interface support, with a different set of games from 2007-08 (Company of Heroes, BioShock, etc.), going down until performance drops off. Then pick the few lowest generations of that set and retest them again with another, still older set of games. Those couple of overlapping cards serve as a bridge to stitch a performance graph together along the X axis and compare all the cards all the way back to the GeForce 256, like the chart on page 6 does now. Except that on the Y axis it would not be possible to show unbroken fps; it would need a separate fps scale for each set of games.
It's basically the GeForce 256, 2, 3, 4, 5, 6, 7, 8, 9 and 200 series, plus whatever equivalent releases sit within each series. It shouldn't take more than a few different game sets, unless I'm horrifically wrong about AGP/Direct3D/etc. support in the cards and games. I'm sure it's a pain to source all those old cards, motherboards, CPUs and drivers, but if successful, the comparison would be just so epic.
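If it helps make the "bridge" idea concrete, here's a rough Python sketch with made-up fps numbers and an arbitrary choice of overlapping cards (not real benchmark data): the cards tested in both game sets give a scale factor that re-expresses the older-only cards on the newer set's axis.

# Hypothetical average fps per card for two game sets; names and numbers are illustrative only.
modern_set = {"GTX 480": 40.0, "GTX 580": 50.0, "GTX 680": 68.0, "GTX 980": 120.0}
older_set  = {"9800 GTX": 30.0, "GTX 280": 42.0, "GTX 480": 75.0, "GTX 580": 95.0}

# Cards tested in both sets act as the bridge between the two fps scales.
bridge = set(modern_set) & set(older_set)
scale = sum(modern_set[c] / older_set[c] for c in bridge) / len(bridge)

# Re-express the older-only cards on the modern set's axis via the bridge scale.
relative = dict(modern_set)
for card, fps in older_set.items():
    relative.setdefault(card, fps * scale)

for card, score in sorted(relative.items(), key=lambda kv: kv[1]):
    print(f"{card}: {score:.1f} (relative units, not literal fps)")

Repeating the same trick with each still-older game set would let the chain run all the way back, at the cost of the Y axis meaning "relative performance" rather than real frame rates.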
"Nvidia cripples old graphics card performance with new drivers. So if the latest driver version was used with every card, including the Keplers, Nvidia's development looks way too rosy."
FALSE. That's a myth, and you bought it hook, line and sinker, like the typical AMD fanboy. Why haven't my four-year-old SLI 670s slowed down then? They haven't, and unlike you, I own Nvidia cards and have been monitoring their frame rates since day one, and I revisit games like Crysis now and then.
"Wonder why they skipped over the 800 range?"
Nvidia never made an 800 series for desktop. They made an 8000 series, but that was before the oldest card in this lineup.
"Techspot, please, this isn't a real comparison. Nobody uses a high-end card with no AA settings; a real comparison needs AA settings, because higher settings show smaller differences in performance and low settings are just a marketing tool. Have a great day."
I hardly ever use AA; it's usually not necessary on a high-res monitor. So yeah, it's VERY real. Take care now.
"Where's the GTX 1080 Ti?"
This article was published before the Ti came out (I think before it was even confirmed?).
OTOH, the GTX 400/500 series are still relevant because of their DX11 and, more recently, DX12 support.
So how about treating all hardware that is still relevant (i.e. usable with the latest software) that way in every feature article, like the list of the worst GPUs?
"Wonder why they skipped over the 800 range?"
The 800 range only really existed in the form of mobile parts, though there probably were some OEM-only parts that were just rebranded 700 cards.