How Does the GTX 1080 Ti Stack Up in 2019? 39 Game Benchmark

Julio Franco

The GTX 1080 Ti held onto its value very strongly; in that sense, if you bought one over two years ago you are probably still pretty pleased with it. It's not like you really have anywhere to trade up to except a very expensive 2080 Ti. I wouldn't be at all surprised to see people hang onto these cards for another year.

We're onto three-year life cycles for many GPUs these days before true replacements arrive. A far cry from the olden days, when your card was replaced by something much faster in typically under two years. Usually to play games like Far Cry!
 
I wouldn't be at all surprised to see people hang onto these cards for another year.
Absolutely my plan. I'll keep it until a proper replacement comes out that doesn't require me to take out a mortgage to afford one.

Slightly off topic, but if Nvidia's next cards (Ampere) are destined for 7nm, it's worth waiting that year anyway, as there's potential for quite a change in performance compared to going to the 2080 Ti.
 
This summer I played through Crysis, Crysis Warhead and Crysis 2 on a Core i7 laptop with a GTX 1060 and 16GB of DDR4 RAM.

I was able to play at the highest settings at 1080p, and the games looked absolutely marvelous in every conceivable way.

Unless you are going for ultra-high 4K settings, the 1050 Ti and 1060 are all you really need for gaming nowadays. But the advertisers out there want to convince you that unless you're playing at the highest settings and watching your FPS to make sure it doesn't drop below 60, your computer is "inferior" and needs to be upgraded.

I bought a 2080 Ti FTW3 for my computer mostly for future-proofing purposes, but I've realised that most developers are targeting low-end systems with low-end CPUs and GPUs. Steam shows that most users have a 1060 or 1050 Ti (mostly due to the lower price of entry for off-the-shelf gaming PCs during the past three years, when cryptocurrency inflated the price of hardware). I could have gotten away with just a 2080, or just about anything less, to game at 1440p on my 34" curved Alienware gaming monitor.

The 1080 Ti is still the powerhouse of the last generation. Even the 1080 in my newer laptop was powerful enough to run most games at full quality.

What really annoys me, however, is that I feel the lowest-end RTX card should have been more powerful than the 1080 Ti, especially when you want to justify these prices.
 
Hehe, sorry, but I cannot resist: why are we comparing last year's flagship to this year's mid-range again? 8-D If anything, one would expect a flagship-to-flagship comparison, unless you guys are assuming a flagship owner won't be looking to upgrade to this year's flagship for some reason. If he splashed out last year, then more likely than not he will be looking to splash out again ;-) but what do I know ;-)
 
Why are we comparing last year's flagship to this year's mid-range again?
It's not last year's flagship; it came out in March 2017. It's last year's last year :laughing:

Edit: That's crazy now I think about it. It's now closer to three years old than two, and only two GPUs in the world can be considered able to "beat it" performance-wise, and neither is worth the money!
 
I have to disagree with the author about Control, though; the game looks amazing when you turn on RTX, that is. I haven't been this impressed with graphical detail since Far Cry or the introduction of HDR in Half-Life 2, and the destructibility of environments is awesome. With most recent games, the only way to increase graphical fidelity has been to increase the polygon count, and it's getting boring fast: static, non-interactive objects everywhere. In Control, after a gunfight you look back at all the destruction you have caused and it looks so real. Maybe Tim will need to take a look at Control's RTX and DLSS implementation in the near future.
 
Note to authors of GPU reviews: please include details of OC status, or even clock speeds, for the various cards. For example, in this article we have the PowerColor RX 5700 XT Red Devil, an OC part; the MSI RTX 2070 Super X Trio, a stock non-OC part; and the MSI GTX 1080 Ti Gaming X Trio, an optionally OC'd part (who knows whether it was tested in an OC configuration, gaming config or something else?).
 
This article shows us that the HUGE cash influx from crypto, the thing that destroyed the video game market, did nothing to improve our industry. It was simply soaked into their pocketbooks. We are still where we were years ago, and video card companies are still gouging us. It is a sad time to be a gamer.
 
Hehe, sorry, but I cannot resist: why are we comparing last year's flagship to this year's mid-range again? 8-D If anything, one would expect a flagship-to-flagship comparison, unless you guys are assuming a flagship owner won't be looking to upgrade to this year's flagship for some reason. If he splashed out last year, then more likely than not he will be looking to splash out again ;-) but what do I know ;-)
It's probably to show how much GPUs have progressed. Today's upper-mid-range options (2070, 5700 XT) at ~$400-$450 MSRP are equal to last generation's top-tier option (1080 Ti) at $700-$750 MSRP.

It's also a decent price-for-price comparison of getting an older used top-tier GPU vs. getting a new current-gen upper-mid-tier GPU. Since they are all going for around $400-$500, this is useful for people with $500 comparing used vs. new GPUs.
 
Should you sell your 1080 Ti for $350? lol, I would not sell my 1080 Ti FTW3 that low. I'd rather hold onto it than do charity work!

$350-$400 is the fair market value nowadays for a used card without a warranty at the 1080 Ti's level of performance, considering new Turing and Pascal cards with warranties that cost ~$400-$450 perform roughly comparably to the 1080 Ti. 1080 Ti owners don't have to sell, but people in the market to buy a GPU really should not pay more than $350-$400 for a used card with that level of performance.
 
This article shows us that the HUGE cash influx from crypto, the thing that destroyed the video game market, did nothing to improve our industry. It was simply soaked into their pocketbooks. We are still where we were years ago, and video card companies are still gouging us. It is a sad time to be a gamer.
AMD's cards are greatly improved. They jumped from the RX400 series Polaris with performance efficiency roughly comparable to the GTX900 Maxwell to the RX5000 series RDNA with performance efficiency comparable to RTX2000 Turing (basically skipping over the Pascal generation). That's a pretty good jump in power efficiency.
 
$350-$400 is the fair market value nowadays for a used card without a warranty at the 1080 Ti's level of performance, considering new Turing and Pascal cards with warranties that cost ~$400-$450 perform roughly comparably to the 1080 Ti. 1080 Ti owners don't have to sell, but people in the market to buy a GPU really should not pay more than $350-$400 for a used card with that level of performance.
Why would you assume last year's flagship doesn't have a warranty anymore? Most brands gave two, three, even five years of warranty on those.. ;-)
 
Why would you assume last year's flagship doesn't have a warranty anymore? Most brands gave two, three, even five years of warranty on those.. ;-)

The warranty you're referring to usually only applies to the original purchaser and/or after the OP registers the product - it's not for the buyer/second hand purchaser buying the used GPU. Few companies have transferable warranties, so that original warranty for the original purchaser won't do the second hand buyer any good most of the time.

IIRC, even for the companies that do/did have transferable warranties (e.g. XFX), the process is tedious and often not followed, as the original user needs to register the product with a proper invoice when he/she first bought it, and then transfer the warranty to the new user on the website. For EVGA, I think they may have a warranty that follows the product (for GPUs?) that allows transferring warranties, but I think the original owner still needs to register the original product with the original invoice so the countdown timer starts. If the original user doesn't register with the original invoice, then second-hand users still might not get the benefit of the warranty, from what I understand.

Also, I believe the 1080Ti came out over 2 years ago, so it's not last year's flagship. The Turing 2000 came out last year, so last year's flagship is the RTX2080.
 
Why would you assume last year's flagship doesn't have a warranty anymore? Most brands gave two, three, even five years of warranty on those.. ;-)

The warranty you're referring to usually only applies to the original purchaser and/or after the OP registers the product - it's not for the buyer/second hand purchaser buying the used GPU. Few companies have transferable warranties, so that original warranty for the original purchaser won't do the second hand buyer any good most of the time.

IIRC, even for the companies that do/did have transferable warranties (e.g. XFX), the process is tedious and often not followed, as the original user needs to register the product with a proper invoice when he/she first bought it, and then transfer the warranty to the new user on the website. For EVGA, I think they may have a warranty that follows the product (for GPUs?) that allows transferring warranties, but I think the original owner still needs to register the original product with the original invoice so the countdown timer starts. If the original user doesn't register with the original invoice, then second-hand users still might not get the benefit of the warranty, from what I understand.

Also, I believe the 1080Ti came out over 2 years ago, so it's not last year's flagship. The Turing 2000 came out last year, so last year's flagship is the RTX2080.

Exactly. And why would anyone want to deal with all that hassle with an older, used card anyway? People would have to be dense to spend more than $350 on a used card. I wouldn't even pay that much. My 1070 is still plenty fast enough to play at 1440p, and Moore's Law is over, so until something truly groundbreaking releases (and developers actually code for it), I see no reason to upgrade.
 
Review does little to shed any light on anything.

Based on today's results, we wouldn't pay more than $400 for a used GTX 1080 Ti, in fact they’d only start to tempt us around the $350 mark.

Based on the results, they should tempt you at $400.
It's faster by 10% and most likely always will be.
The 5700 XT is about maxed out in stock form; a GTX 1080 Ti will overclock 10-15%, just like the GTX 1080 did in your review.
https://www.techspot.com/review/1174-nvidia-geforce-gtx-1080/page8.html

And yes, AMD's software is still inferior. You still, in 2019, hear from lots of folks about performance and stability issues with AMD drivers, and not just in games; video and audio issues are still quite prevalent. You're also getting fewer features/less refinement, and with AMD you still need to add an Nvidia GPU to run PhysX.


If someone said you could have either GPU for $400 today, it's a no-brainer: take the GTX 1080 Ti. It's faster, has more features and has more overclocking room.
But you can only get them for $500 on average, so the 5700 XT is the better buy. That being said, with all of AMD's shortcomings, I would only pay $300-$350 for a 5700 XT.
Not trying to rain on AMD, the 5700 XT or Navi; its performance is impressive, but that is a $409 2019 GPU.
 
This article shows us that the HUGE cash influx from crypto, the thing that destroyed the video game market, did nothing to improve our industry. It was simply soaked into their pocketbooks. We are still where we were years ago, and video card companies are still gouging us. It is a sad time to be a gamer.
AMD's cards are greatly improved. They jumped from the RX400 series Polaris with performance efficiency roughly comparable to the GTX900 Maxwell to the RX5000 series RDNA with performance efficiency comparable to RTX2000 Turing (basically skipping over the Pascal generation). That's a pretty good jump in power efficiency.
RDNA is nice; I just wish AMD would stop dragging their feet on a higher-end chip. I'm not paying $400 for an RX 5700 when it's only 25-30% faster than my Vega 64. Now, if they had, say, a $550 or $600 RX 5800 XT with ~4,000 cores and 12GB of VRAM, now we're talking.

As it is I wish I had spent the $700 on a 1080ti when they were new, given we are 3 years in and they still kick arse.
 
...with AMD you still need to add a GPU to run PhsyX.

I thought PhysX these days runs just fine on modern CPUs if you don't have an Nvidia GPU. Nvidia just enabled PhysX to run on its GPUs in addition to the CPU, which was a big deal back in the day, but these days not so much.

Is there a more recent video (that one was from 2012) showing a comparison of a PhysX-enabled game running on an AMD GPU and CPU vs. an Nvidia GPU and an Intel/AMD CPU? It seems like reviews turn PhysX off completely when testing, but it would be interesting to see the real impact.
 
This article shows us that the HUGE cash influx from crypto, the thing that destroyed the video game market, did nothing to improve our industry. It was simply soaked into their pocketbooks. We are still where we were years ago, and video card companies are still gouging us. It is a sad time to be a gamer.
AMD's cards are greatly improved. They jumped from the RX400 series Polaris with performance efficiency roughly comparable to the GTX900 Maxwell to the RX5000 series RDNA with performance efficiency comparable to RTX2000 Turing (basically skipping over the Pascal generation). That's a pretty good jump in power efficiency.

Honestly that was the biggest surprise of the year. I did not expect them to get this level of performance and still be power efficient.

The 5700 XT is the best value card of the three tested here.

Why would you assume last year's flagship doesn't have a warranty anymore? Most brands gave two, three, even five years of warranty on those.. ;-)

The warranty you're referring to usually only applies to the original purchaser and/or after the OP registers the product - it's not for the buyer/second hand purchaser buying the used GPU. Few companies have transferable warranties, so that original warranty for the original purchaser won't do the second hand buyer any good most of the time.

IIRC, even for the companies that do/did have transferable warranties (e.g. XFX), the process is tedious and often not followed, as the original user needs to register the product with a proper invoice when he/she first bought it, and then transfer the warranty to the new user on the website. For EVGA, I think they may have a warranty that follows the product (for GPUs?) that allows transferring warranties, but I think the original owner still needs to register the original product with the original invoice so the countdown timer starts. If the original user doesn't register with the original invoice, then second-hand users still might not get the benefit of the warranty, from what I understand.

Also, I believe the 1080Ti came out over 2 years ago, so it's not last year's flagship. The Turing 2000 came out last year, so last year's flagship is the RTX2080.

Exactly. And why would anyone want to deal with all that hassle with an older, used card anyway? People would have to be dense to spend more than $350 on a used card. I wouldn't even pay that much. My 1070 is still plenty fast enough to play at 1440p, and Moore's Law is over, so until something truly groundbreaking releases (and developers actually code for it), I see no reason to upgrade.

1. The 10xx series is more reliable. 20xx series had space invader issues and Nvidia didn't exactly come out and say they fixed the issue. I still see it popping up on reddit and forums.

2. I guess I must be dense then because my last 3 graphics cards were 780 Ti, 980 Ti, 1080 Ti. All used. Not a single one of them had an issue. Not that it matters, they've all been covered under EVGA's transferable warranty anyways.

Review does little to shed any light on anything.

Based on today's results, we wouldn't pay more than $400 for a used GTX 1080 Ti, in fact they’d only start to tempt us around the $350 mark.

Based on the results, they should tempt you at $400.
It's faster by 10% and most likely always will be.
The 5700 XT is about maxed out in stock form; a GTX 1080 Ti will overclock 10-15%, just like the GTX 1080 did in your review.
https://www.techspot.com/review/1174-nvidia-geforce-gtx-1080/page8.html

And yes, AMD's software is still inferior. You still, in 2019, hear from lots of folks about performance and stability issues with AMD drivers, and not just in games; video and audio issues are still quite prevalent. You're also getting fewer features/less refinement, and with AMD you still need to add an Nvidia GPU to run PhysX.


If someone said you could have either GPU for $400 today, it's a no-brainer: take the GTX 1080 Ti. It's faster, has more features and has more overclocking room.
But you can only get them for $500 on average, so the 5700 XT is the better buy. That being said, with all of AMD's shortcomings, I would only pay $300-$350 for a 5700 XT.
Not trying to rain on AMD, the 5700 XT or Navi; its performance is impressive, but that is a $409 2019 GPU.

The 5700 XT most certainly is not tapped out. A quick google search will tell you this.

https://www.google.com/search?client=firefox-b-1-d&q=5700+xt+overclocking

Most cards get to 2.1-2.2 GHz. That's 16-20%. TechSpot got 2.1 GHz without even changing the PowerPlay tables.

"You're also getting fewer features/less refinement, and with AMD you still need to add an Nvidia GPU to run PhysX."

What a bunch of hooplah. AMD has MORE features packed into its drivers by far. It's only because AMD open-sources many of its features that Nvidia is even able to add a few of them, like Freestyle for example. And really? PhysX? That complete tire fire?

This article shows us that the HUGE cash influx from crypto, the thing that destroyed the video game market, did nothing to improve our industry. It was simply soaked into their pocketbooks. We are still where we were years ago, and video card companies are still gouging us. It is a sad time to be a gamer.
AMD's cards are greatly improved. They jumped from the RX400 series Polaris with performance efficiency roughly comparable to the GTX900 Maxwell to the RX5000 series RDNA with performance efficiency comparable to RTX2000 Turing (basically skipping over the Pascal generation). That's a pretty good jump in power efficiency.
RDNA is nice; I just wish AMD would stop dragging their feet on a higher-end chip. I'm not paying $400 for an RX 5700 when it's only 25-30% faster than my Vega 64. Now, if they had, say, a $550 or $600 RX 5800 XT with ~4,000 cores and 12GB of VRAM, now we're talking.

As it is I wish I had spent the $700 on a 1080ti when they were new, given we are 3 years in and they still kick arse.

AMD not releasing higher-end chips likely has more to do with 7nm not being mature enough yet. Small chips are one thing; bigger chips require a more mature node.
 
AMD's cards are greatly improved. They jumped from the RX400 series Polaris with performance efficiency roughly comparable to the GTX900 Maxwell to the RX5000 series RDNA with performance efficiency comparable to RTX2000 Turing (basically skipping over the Pascal generation). That's a pretty good jump in power efficiency.

Fair enough: efficiency. Is that it, or has anything else improved comparably to what the companies gained? Price/performance is what I was aiming at with my point.
 
This article shows us that the HUGE cash influx from crypto, the thing that destroyed the video game market, did nothing to improve our industry. It was simply soaked into their pocketbooks. We are still where we were years ago, and video card companies are still gouging us. It is a sad time to be a gamer.
AMD's cards are greatly improved. They jumped from the RX400 series Polaris with performance efficiency roughly comparable to the GTX900 Maxwell to the RX5000 series RDNA with performance efficiency comparable to RTX2000 Turing (basically skipping over the Pascal generation). That's a pretty good jump in power efficiency.
RDNA is nice; I just wish AMD would stop dragging their feet on a higher-end chip. I'm not paying $400 for an RX 5700 when it's only 25-30% faster than my Vega 64. Now, if they had, say, a $550 or $600 RX 5800 XT with ~4,000 cores and 12GB of VRAM, now we're talking.

As it is I wish I had spent the $700 on a 1080ti when they were new, given we are 3 years in and they still kick arse.

AMD not releasing higher-end chips likely has more to do with 7nm not being mature enough yet. Small chips are one thing; bigger chips require a more mature node.
AMD jumped the gun again. The node isn't mature enough to do a ~350 mm² GPU, the drivers still have major bugs two months in, and AIBs took a month and a half to release cards. AMD really should have waited until the end of September to launch these things.

It's not like a 5800 XT would be massive, given how small the 5700 XT is. Even a 3,072-core Navi to give the 2080 Super a run for its money would be nice.
 
The GTX 1080 Ti held onto its value very strongly; in that sense, if you bought one over two years ago you are probably still pretty pleased with it. It's not like you really have anywhere to trade up to except a very expensive 2080 Ti. I wouldn't be at all surprised to see people hang onto these cards for another year.

We're onto three-year life cycles for many GPUs these days before true replacements arrive. A far cry from the olden days, when your card was replaced by something much faster in typically under two years. Usually to play games like Far Cry!

GPUs have pretty much always had a 3-5 year update cycle; usually you upgraded every second or third card series, despite a few examples of huge gains in a single generation.

Keep in mind the cheaper the SKU, the more often you would buy to stay on the curve; the more expensive models always afforded people the luxury of time. It's pretty much the way it has always been. The 1080 Ti was $700-$800 and, to be honest, will probably best be upgraded at the 3000 series. As Nvidia is going with a release every two years, that would put your upgrade at the four-year point, but three years isn't bad either: it would still work out to $200-$250 of depreciation per year, which isn't bad at all for a computer part.

CPUs, mobos and RAM were a bit different. I mean, technically you could still be using an X58/Z68 and just be feeling the pinch of when you should upgrade.
 
GPUs have pretty much always had a 3-5 year update cycle; usually you upgraded every second or third card series, despite a few examples of huge gains in a single generation.

Keep in mind the cheaper the SKU, the more often you would buy to stay on the curve; the more expensive models always afforded people the luxury of time. It's pretty much the way it has always been.

We have seen a noticeable slowdown in lithography advances, particularly over the past five years.

Gone are the days when you could go from a GeForce 6800 Ultra to a much faster ATI X1800 XT 18 months later, to a much faster 8800 GTX 13 months after that. In the space of just two and a half years, you had about three times the performance.

Move that window to now and you're looking at the gap between a GTX 1080 Ti (March 2017) and a 2080 Ti. Not even in the same league of advancement. 30 percent faster?

Smaller advances, further apart. That's just how it is going to be for the foreseeable future.
 