How Does the GTX 1080 Ti Stack Up in 2019? 39 Game Benchmark

What a bunch of hoopla.
Here's what is not hoopla.

G-Sync is superior to FreeSync.
PhysX (albeit different tech is out now...my point is, this was just another feature Nvidia had, and AMD did not)
AMD doesn't offer ray tracing support (yet; again, another tech brought to life by Nvidia for AMD to copy, just like G-Sync)
AMD has more bugs and stability issues across the board, from old games to new.
AMD has more issues with sound and video. A recent poll on overclock.net had overwhelming results for AMD users reporting various little hiccups, latency and stability issues among other things, compared to folks on Nvidia cards.
There is a reason Nvidia GPUs, from low to mid to high range, absolutely wipe the floor with AMD in the Steam survey results. They are superior in every way, and it's more than just about raw FPS.


AMD has fewer features that matter, and the comparable features are always months or years behind, or not as polished.
https://www.techradar.com/news/comp...idia-who-makes-the-best-graphics-cards-699480

Still, GeForce Experience boasts the game optimization features we’re all crazy for. So when you don’t know what settings are best for your computer in The Witcher 3, Nvidia takes care of the heavy lifting for you.

AMD users can download and install Raptr's Gaming Evolved tool to optimize their gaming experience. However, the add-on is less than ideal considering its biggest rival's audience can accomplish nearly everything from within GeForce Experience. That includes using Nvidia Ansel to take way cool in-game photos at resolutions exceeding 63K (16 times what a 4K monitor can display).

Nvidia also has a leg up when it comes to streaming games, whether to another gaming PC with at least a Maxwell-based GPU or to the company's self-made tablets and set-top box. Not to mention, Nvidia also has a cloud-based gaming service called GeForce Now available to Windows 10 and macOS users.

And, of course, you can't talk about Nvidia in 2019 without mentioning ray tracing. When Team Green announced its Turing line of graphics cards, it made huge claims about revolutionizing gaming with real-time ray traced lighting, shadows and reflections. Games with these features have been out for a while now, and while they certainly look great, these effects drain performance, even from cards designed for them.

AMD has MORE features packed into its drivers by far.
Yeeeessss they do.:joy:

Because it delivers driver updates and optimizes games in addition to letting you broadcast gameplay and capture screenshots as well as videos directly from its easy-to-use interface, Nvidia GeForce Experience is posited as the one PC gaming application to rule them all.

Meanwhile, AMD's newly announced Radeon Software Adrenalin 2019 Edition aims to overtake Nvidia's solution. The latest update is stacked with features, including automatic overclocking (that doesn't need tensor cores) and the ability to stream games to your mobile device.
Ohh look AMD is copying Nvidia's Geforce Experience.
Shocker.
They must be copying/replicating/reacting to Nvidia's move once again because their drivers are packed with MORE features by far.:p
Hey, AMD has come a long way, I will admit.
But when it comes to the overall gaming experience, if you're gaming on AMD, you're playing second fiddle. It is what it is.
 
Fair enough. Efficiency. Is that it, or is there anything else improved comparable to what the companies gained? Price/performance is the point I was aiming at.
So far they've only released their mid/upper mid range stuff. The process is probably not mature yet for them to release the upper range and lower/lower-mid range stuff (and they might have to get rid of the excess Polaris stuff too). IIRC, Lisa Su said more RDNA RX5000 cards are coming, but we have to wait.

So far, the regular RX 5700 is a pretty good competitor at the $350 price tag, as it provides performance between the regular RTX 2060 and the RTX 2060 Super at regular RTX 2060 price tags. I assume their sub-$300 models will have bigger bang for the buck, as that is where most of the price-conscious consumers are.
 
During this summer, I played through Crysis, Crysis Warhead and Crysis 2 on a Core i7 laptop with a GTX 1060 and 16GB of DDR4 RAM.

I was able to play in the highest settings on 1080p and the game looked absolutely marvelous in every conceivable way.

Unless you are going for ultra-high 4K settings, the 1050 Ti and 1060 are all you really need for gaming nowadays, but the advertisers out there want to convince you that unless you're playing at the highest settings and watching your FPS to make sure it doesn't drop below 60, your computer is "inferior" and needs to be upgraded.

I bought a 2080 Ti FTW3 for my computer mostly for future-proofing purposes, but I've realised that most developers are targeting low-end systems with low-end CPUs and GPUs. Steam shows that most users have a 1060 or 1050 Ti (which is mostly due to the lower price of entry of off-the-shelf gaming PCs during the past 3 years, when cryptocurrency inflated the price of hardware). I could have gotten away with just a 2080 or just about anything less to game in 1440p on my 34" curved Alienware gaming monitor.

The 1080 Ti is still the powerhouse of the last generation. Even the 1080 in my newer laptop was powerful enough to run most games at full quality.

What really annoys me, however, is that I feel the lowest-end RTX card should have been more powerful than the 1080 Ti, especially when you want to justify these prices.

The GTX 1660 is about the same price as a 1060 6GB with better performance.
 
This is not an accurate representation of what's available in 2019. RTX comparison? Sure. But the 2070 Super is falling pretty dang short of a 2080 Super, let alone the 2080 Ti. Among the 2080 Ti variants, there are huge gaps in performance, especially in terms of FPS when overclocking and using other methods to get the most out of these cards. The difference in FPS between Founders 1080 Ti and 2080 Ti cards at stock settings at 4K is 20 FPS. That may not seem like a lot at first, but considering that overclocking my 2080 Ti gained an additional 25 FPS at 4K, that's a total of 45 FPS without errors/artifacting, which is definitely enough to become a deal breaker for most. Great job on the comparison itself between these cards and showing the real-world data. My only issue is that it's not a good representation of what's available in 2019, especially when considering all aspects. Thanks for your hard work on this comparison.
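The FPS arithmetic in the post above can be sketched in a couple of lines. To be clear, all of the figures are the poster's own claims, not measured benchmark data:

```python
# Rough arithmetic from the post above (figures are the poster's claims):
# the stock 4K gap between Founders Edition 1080 Ti and 2080 Ti cards,
# plus the extra headroom claimed from overclocking the 2080 Ti.
stock_gap_fps = 20   # FPS gap at stock settings at 4K
oc_gain_fps = 25     # additional FPS gained by overclocking the 2080 Ti

total_gap_fps = stock_gap_fps + oc_gain_fps
print(total_gap_fps)  # 45
```

So the "45 FPS" in the post is just the stock gap plus the claimed overclocking gain, assuming the 1080 Ti stays at stock.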
 
Why would you assume last year's flagship doesn't have a warranty anymore? Most brands gave 2, 3, even 5 years of warranty on those.. ;-)

The warranty you're referring to usually only applies to the original purchaser and/or after the OP registers the product - it's not for the buyer/second hand purchaser buying the used GPU. Few companies have transferable warranties, so that original warranty for the original purchaser won't do the second hand buyer any good most of the time.

IIRC, even for the companies that do/did have transferable warranties (eg. XFX), the process is tedious and often not followed, as the original user needs to register the product with a proper invoice when he/she first bought it, and then transfer the warranty to the new user on the website. For EVGA, I think they may have a warranty that follows the product (for GPUs?) that allows transferring warranties, but I think the original owner still needs to register the original product when they bought it with the original invoice so the countdown timer starts. If the original user doesn't register with the original invoice, then second hand users still might not get the benefit of the warranty from what I understand.

Also, I believe the 1080Ti came out over 2 years ago, so it's not last year's flagship. The Turing 2000 came out last year, so last year's flagship is the RTX2080.

Exactly. And why would anyone want to deal with all that hassle with an older, used card anyways? People would have to be dense to spend more than $350 on a used card. I wouldn't even pay that much. My 1070 is still plenty fast enough to play at 1440p, and 'Moore's Law' is over, so until something truly groundbreaking releases (and developers actually code for it), I see no reason to upgrade.


I disagree. I had a GTX 1070, and recently sold it to a friend in order to buy a used GTX 1080 ti. Both ASUS ROG Strix. I think it was well worth the upgrade, but prices for video cards are a little different in Canada. We tend to see higher prices than just the conversion rate, and if we order from American sites we run the risk of customs fees making it not worth it. Anyway, I think I bought the 1070 2.5 years ago for about $640 CAD after tax. Sold it to my buddy for $350 just the other day and bought a used but still under warranty 1.5 year old 1080 ti for $600 CAD. So basically 2.5 years later I upgraded for $250. My buddy is happy as it's a nice upgrade for him and I get a 50-60% performance boost. I consider this well worth it if you want to play at 4k or VR, and in the past year I've picked up both VR and a 4k HDR tv.

For reference, in Canada a Strix 2070 super is $864 after tax and Strix 2080 ti is $1920 after tax. While I'd probably take the 2070 super for $700 total over the used 1080 ti at $600, I'm not paying $264 more for what's currently an inferior GPU to the 1080 ti. The 2080 ti is obviously out of the question, stupidly expensive.
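The net cost of the upgrade described above works out as follows; all figures are CAD after tax, taken from the post itself:

```python
# Back-of-the-envelope math for the upgrade described above
# (CAD, after tax; all figures come from the post itself).
sale_1070 = 350           # what the GTX 1070 sold for
price_1080ti_used = 600   # used, still-under-warranty 1080 Ti

net_cost = price_1080ti_used - sale_1070
print(net_cost)  # 250
```

That $250 out of pocket is what buys the claimed 50-60% performance boost, which is why the poster considers it worth it.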
 
Here's what is not hoopla.

G-Sync is superior to FreeSync.
PhysX (albeit different tech is out now...my point is, this was just another feature Nvidia had, and AMD did not)
AMD doesn't offer ray tracing support (yet; again, another tech brought to life by Nvidia for AMD to copy, just like G-Sync)
AMD has more bugs and stability issues across the board, from old games to new.
AMD has more issues with sound and video. A recent poll on overclock.net had overwhelming results for AMD users reporting various little hiccups, latency and stability issues among other things, compared to folks on Nvidia cards.
There is a reason Nvidia GPUs, from low to mid to high range, absolutely wipe the floor with AMD in the Steam survey results. They are superior in every way, and it's more than just about raw FPS.

1. By what metric? G-Sync and FreeSync are identical. Go and look at top monitor reviews here on TechSpot that have a variant of both. Oftentimes they are exactly the same in implementation. The only difference is the Nvidia one is $200+ more.

2. PhysX is a worse than pointless feature, oftentimes causing more headaches than good. I remember trying to play BL2 with PhysX and the massive stuttering it introduced. Or how about Sacred 2 and the slideshow it was with PhysX on: it went from 120 FPS to 52 FPS. Saying PhysX is an important feature now is like leaving a dump in your house and telling the people you are selling it to that it's a "feature". No, it's crap that doesn't help.

3. I didn't realize Nvidia invented ray tracing. Oh wait, it didn't, lol. Another feature to copy? Like how Nvidia copied AMD's adaptive sharpening source code for its Freestyle sharpening? Hmm... Oh, or how about Eyefinity: it took Nvidia two years to get a decent competitor to multi-monitor gaming. Both companies take ideas from each other, so don't act like Nvidia is some saint.

4. You base this on what evidence? Nothing, which isn't a surprise.

5. Once again, no link and no evidence. The confirmation bias is strong with this one. Do you regularly peruse the forums looking specifically for things that are anti-AMD? It's a scientific fact that those who identify with a side, in this case you with Nvidia, are more likely to remember and search for negatives to justify decisions made for their own "side". You will find examples of only AMD doing bad things because you want to find just that, if your hyperbolic word choice wasn't already evidence enough. You don't want the same for Nvidia. Psychology 101.

6. You are going to base your "Nvidia has more features" comment on an article that considers GeForce Now a feature? Here's the little nugget:

"Nvidia also has a leg up when it comes to streaming games, whether to another gaming PC with at least a Maxwell-based GPU or to the company's self-made tablets and set-top box. Not to mention, Nvidia also has a cloud-based gaming service called GeForce Now available to Windows 10 and macOS users."

I think both you and the author of that article have a fundamental misunderstanding of what is a feature of a video card and what is a feature of other products. Nvidia's tablet and set-top box are not part of the graphics card, nor are they included. Neither is GeForce Experience. I don't know who I'm more disappointed in: you, the enthusiast who should know better, or the author of that article for publishing such a piece.

Ohh look AMD is copying Nvidia's Geforce Experience.
Shocker.
They must be copying/replicating/reacting to Nvidia's move once again because their drivers are packed with MORE features by far.:p
Hey, AMD has come a long way, I will admit.
But when it comes to the overall gaming experience, if you're gaming on AMD, you're playing second fiddle. It is what it is.

You clearly haven't tried AMD's driver package if you believe they are copying GeForce Experience; the two are nothing alike and take a completely different approach. Just at the most basic level, GeForce Experience is a piece of software that runs alongside the driver, while AMD's drivers are just that: drivers, not a separate piece of software. If you don't even know this, you haven't even tried AMD's drivers and have no standing to compare one to the other. Like your other comments, you are merely picking details out of context to fit your narrative, with no relation to reality.
 
That's your reply?
Saying Nvidia copied from other places, FreeSync is just as good as G-Sync (it's not), and that drivers are just drivers?
All you had to say was 'amstech, you're right'.
The rest is just whimpering that counters nothing I stated.
 
Subtitle - 2.5 year old Nvidia card dominates latest AMD offering, doesn't come in crappy blower design

9% isn't dominating. Neither is leaving out that it's a 2.5-year-old flagship vs a new midrange card. Likewise, it performs equally well against Nvidia's midrange offerings. AMD doesn't perform particularly poorly here, nor is this kind of progression for video cards anything out of the ordinary. You more or less took a mundane article and stretched it as far as hyperbole would carry you.
 
Subtitle - 2.5 year old Nvidia card dominates latest AMD offering, doesn't come in crappy blower design

9% isn't dominating. Neither is leaving out that it's a 2.5-year-old flagship vs a new midrange card. Likewise, it performs equally well against Nvidia's midrange offerings. AMD doesn't perform particularly poorly here, nor is this kind of progression for video cards anything out of the ordinary. You more or less took a mundane article and stretched it as far as hyperbole would carry you.

It's the best card AMD makes, so it's AMD's flagship. And you can buy a gently used 1080 Ti for what a 5700 costs. Or you could until this article came out. eBay sellers must be high-fiving right now.

Worse, the 5700 and 5700 XT are pretty much all AMD has, and AIBs are slow-rolling their versions because AMD kneecapped them on profit. The 570/580/590 are 2 1/2 years old and are getting killed by the 1650/1660/1660 Ti because people don't want a space heater in their box. Vega 56 and 64 are EOLed because they were disasters. Heck, people are paying more for a 1050 Ti than an RX 580.

It shows that AMD is 2 1/2 years behind Nvidia, and that is why Intel is moving into the GPU market. Nvidia doesn't even have to try hard right now. Hopefully Intel will push things forward.
 
It's the best card AMD makes, so it's AMD's flagship. And you can buy a gently used 1080 Ti for what a 5700 costs. Or you could until this article came out. eBay sellers must be high-fiving right now.

Worse, the 5700 and 5700 XT are pretty much all AMD has, and AIBs are slow-rolling their versions because AMD kneecapped them on profit. The 570/580/590 are 2 1/2 years old and are getting killed by the 1650/1660/1660 Ti because people don't want a space heater in their box. Vega 56 and 64 are EOLed because they were disasters. Heck, people are paying more for a 1050 Ti than an RX 580.

It shows that AMD is 2 1/2 years behind Nvidia, and that is why Intel is moving into the GPU market. Nvidia doesn't even have to try hard right now. Hopefully Intel will push things forward.

I might be inclined to agree if it did not bear a midrange product model number. Ultimately, the one who decides if a card is a flagship or not is AMD; after all, they are the ones who designate their own top product, and they definitely know whether or not it is in fact the flagship. AMD have made no mention of the 5700 XT being their flagship. An end user couldn't possibly make such a determination.

AIB models are slow to release because AMD didn't give them months in advance to prepare their designs. This isn't the first time this has happened and it likely won't be the last. This is nothing unusual, even Nvidia launched a few generations when it wasn't a comfortable market leader like this. Waiting for AIB models is only something the company in the lead can do.

The article shows in no uncertain terms that AMD have completely closed the gap with Navi. In fact the 5700 XT is HALF the die size of the 2070 super, HALF. It's only 9% slower. That certainly doesn't seem like they are behind to me.
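The die-size point above can be made concrete with some quick arithmetic. The ~9% performance gap is from the article; the die areas are approximate public figures (Navi 10 vs TU104), not numbers stated in this thread:

```python
# Sketch of the "half the die size, only 9% slower" point above.
# Performance figures come from the article; die areas are approximate
# public figures (Navi 10 ~251 mm^2, TU104 ~545 mm^2), an assumption
# not stated in the thread.
perf_2070s = 1.00    # normalize the 2070 Super's performance to 1.0
perf_5700xt = 0.91   # ~9% slower

area_2070s = 545.0   # approximate die area in mm^2
area_5700xt = 251.0  # approximate die area in mm^2

# Performance delivered per mm^2 of silicon
eff_2070s = perf_2070s / area_2070s
eff_5700xt = perf_5700xt / area_5700xt

ratio = eff_5700xt / eff_2070s
print(round(ratio, 2))  # 1.98
```

On those numbers, Navi delivers roughly twice the performance per mm² of die area, which is the sense in which the gap has closed even though the absolute performance crown hasn't changed hands.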
 
We have seen a noticeable slowdown in lithography advances, particularly over the past 5 years.

Gone are the days when you could go from a GeForce 6800 Ultra to a much faster ATi X1800 XT 18 months later, then to a much faster 8800 GTX 13 months after that. In the space of just two and a half years, you had about three times the performance.

Move that window to now and you're looking at the gap between a GTX1080Ti (March 2017) and a 2080Ti. Not even in the same league of advance. 30 percent faster?

Smaller advances, further apart. That's just how it is going to be for the foreseeable future.

Yes, and be prepared for it to pick up a bit. EUV lithography advancements are the holdup; as Intel learned, you can only multi-pattern so many times before it's useless.
 
During this summer, I played through Crysis, Crysis Warhead and Crysis 2 on a Core i7 laptop with a GTX 1060 and 16GB of DDR4 RAM.

I was able to play in the highest settings on 1080p and the game looked absolutely marvelous in every conceivable way.

Unless you are going for ultra-high 4K settings, the 1050 Ti and 1060 are all you really need for gaming nowadays, but the advertisers out there want to convince you that unless you're playing at the highest settings and watching your FPS to make sure it doesn't drop below 60, your computer is "inferior" and needs to be upgraded.

I bought a 2080 Ti FTW3 for my computer mostly for future-proofing purposes, but I've realised that most developers are targeting low-end systems with low-end CPUs and GPUs. Steam shows that most users have a 1060 or 1050 Ti (which is mostly due to the lower price of entry of off-the-shelf gaming PCs during the past 3 years, when cryptocurrency inflated the price of hardware). I could have gotten away with just a 2080 or just about anything less to game in 1440p on my 34" curved Alienware gaming monitor.

The 1080 Ti is still the powerhouse of the last generation. Even the 1080 in my newer laptop was powerful enough to run most games at full quality.

What really annoys me, however, is that I feel the lowest-end RTX card should have been more powerful than the 1080 Ti, especially when you want to justify these prices.

IIRC the performance improvement of 2080Ti vs 1080Ti was significantly less than 1080Ti vs 980Ti and 980Ti vs 780Ti. Not to mention the absurd pricing relative to those great cards.
 
It's clear the only way Nvidia will sell RTX cards is to not support Pascal with new drivers for new games. This article makes that very clear.

Nvidia knows that they will get caught diminishing performance on old titles, but anything new is fair game.
 
Review does little to shed any light on anything.

And yes, AMD's software is still inferior. You still, in 2019, hear from lots of folks about performance and stability issues with AMD drivers....and not just with games; video and audio issues are still quite prevalent. You're also getting fewer features and less refinement, and with AMD you still need to add a GPU to run PhysX.

Hardware PhysX hasn't been a thing since 2015, when we got Batman: Arkham Knight and debatably Fallout 4.

With the kind of destruction we have in games today, with Quantum Break and Control being highlights, plus general hardware-accelerated particles, hardware acceleration for PhysX hasn't seemed like a thing for a few years now.
 
I've got the 1080ti. It's not even overclocked. The card is brilliant.
The 1080 Ti has one major advantage: it's got 11GB of VRAM. There are many games at 4K ultra that need over 8GB of VRAM, including Gears 5. Only two cards can run these games at this level: the 1080 Ti and the 2080 Ti.
I am running Gears 5 at insane settings at 4K with the ultra texture pack at a solid 60fps.
All other cards apart from the 2080 Ti are physically unable to run Gears 5 at these settings due to severe VRAM limitations.
The 2080 is unable to even run modded GTA 5 at ultra settings at 4K because more than 8GB of VRAM is needed.
So for the present, if you game at 4K ultra and want access to all games, then only the 2080 Ti and 1080 Ti will do. Everything else is inferior.
The 2080 is a great card but it's not got enough VRAM for me.
If 4K is the future then a two-year-old card is better than anything new, apart from the 2080 Ti.
 
It's the best card AMD makes, so it's AMD's flagship. And you can buy a gently used 1080 Ti for what a 5700 costs. Or you could until this article came out. eBay sellers must be high-fiving right now.

Worse, the 5700 and 5700 XT are pretty much all AMD has, and AIBs are slow-rolling their versions because AMD kneecapped them on profit. The 570/580/590 are 2 1/2 years old and are getting killed by the 1650/1660/1660 Ti because people don't want a space heater in their box. Vega 56 and 64 are EOLed because they were disasters. Heck, people are paying more for a 1050 Ti than an RX 580.

It shows that AMD is 2 1/2 years behind Nvidia, and that is why Intel is moving into the GPU market. Nvidia doesn't even have to try hard right now. Hopefully Intel will push things forward.

I might be inclined to agree if it did not bear a midrange product model number. Ultimately, the one who decides if a card is a flagship or not is AMD; after all, they are the ones who designate their own top product, and they definitely know whether or not it is in fact the flagship. AMD have made no mention of the 5700 XT being their flagship. An end user couldn't possibly make such a determination.

AIB models are slow to release because AMD didn't give them months in advance to prepare their designs. This isn't the first time this has happened and it likely won't be the last. This is nothing unusual, even Nvidia launched a few generations when it wasn't a comfortable market leader like this. Waiting for AIB models is only something the company in the lead can do.

The article shows in no uncertain terms that AMD have completely closed the gap with Navi. In fact the 5700 XT is HALF the die size of the 2070 super, HALF. It's only 9% slower. That certainly doesn't seem like they are behind to me.

“Half” so right where Moore’s Law would predict. Oh wait, Moore’s Law would put half at the same performance, not slower.

There is nothing complex about the 5700 series that requires tons of engineering. Let me qualify that: the 5700 does take a ton of extra cooling, and considering AMD's history with hot cards, that is saying something. There won't be any small-form-factor 5700 cards. Or medium ones. AIBs are slow-rolling because margins are too thin.

By your logic, AMD hasn't had a "flagship" card in quite a while. In the real world, the best of your product line is, by default, your flagship. Saying "it's not their best card because they named it middle" is like saying "I have a girlfriend but she lives in Canada, you don't know her".
 
It's clear the only way Nvidia will sell RTX cards is to not support Pascal with new drivers for new games. This article makes that very clear.

Nvidia knows that they will get caught diminishing performance on old titles, but anything new is fair game.
Faaaaaacts bruv.
And it's sad tbh, I've owned both AMD and Nvidia cards, mobile and desktop. AMD always loses with its drivers. I would know, I modded my AMD drivers and got a HELL of a kick in performance back in the day... from a 40fps average to around 100fps. Sad...
And on Nvidia's older cards, same story. Modded my bro's Alienware 17 R2 GPU vBIOS on the 980m... super sad. Fortnite on the latest drivers couldn't bear to run 40fps. And I was like, what? It was running better than this before the 10 series and 20 series dropped. So, after the modded vBIOS: a 150fps average on Fortnite.

WTF KIND OF BS IS THAT? I love Nvidia and respect AMD, but all these companies do is lowball us in the end to get us to upgrade. I'm sure the 1080 Ti is underperforming in some aspects compared to these because it is being forced into limitation. No way should a 2070 Super be on par or close to it.
I'd say next gen should definitely beat it or match it at low to mid end. Crazy.
 
Why would you assume last year's flagship doesn't have a warranty anymore? Most brands gave 2, 3, even 5 years of warranty on those.. ;-)

The warranty you're referring to usually only applies to the original purchaser and/or after the OP registers the product - it's not for the buyer/second hand purchaser buying the used GPU. Few companies have transferable warranties, so that original warranty for the original purchaser won't do the second hand buyer any good most of the time.

IIRC, even for the companies that do/did have transferable warranties (eg. XFX), the process is tedious and often not followed, as the original user needs to register the product with a proper invoice when he/she first bought it, and then transfer the warranty to the new user on the website. For EVGA, I think they may have a warranty that follows the product (for GPUs?) that allows transferring warranties, but I think the original owner still needs to register the original product when they bought it with the original invoice so the countdown timer starts. If the original user doesn't register with the original invoice, then second hand users still might not get the benefit of the warranty from what I understand.

Also, I believe the 1080Ti came out over 2 years ago, so it's not last year's flagship. The Turing 2000 came out last year, so last year's flagship is the RTX2080.

Exactly. And why would anyone want to deal with all that hassle with an older, used card anyways? People would have to be dense to spend more than $350 on a used card. I wouldn't even pay that much. My 1070 is still plenty fast enough to play at 1440p, and 'Moore's Law' is over, so until something truly groundbreaking releases (and developers actually code for it), I see no reason to upgrade.


I disagree. I had a GTX 1070, and recently sold it to a friend in order to buy a used GTX 1080 ti. Both ASUS ROG Strix. I think it was well worth the upgrade, but prices for video cards are a little different in Canada. We tend to see higher prices than just the conversion rate, and if we order from American sites we run the risk of customs fees making it not worth it. Anyway, I think I bought the 1070 2.5 years ago for about $640 CAD after tax. Sold it to my buddy for $350 just the other day and bought a used but still under warranty 1.5 year old 1080 ti for $600 CAD. So basically 2.5 years later I upgraded for $250. My buddy is happy as it's a nice upgrade for him and I get a 50-60% performance boost. I consider this well worth it if you want to play at 4k or VR, and in the past year I've picked up both VR and a 4k HDR tv.

For reference, in Canada a Strix 2070 super is $864 after tax and Strix 2080 ti is $1920 after tax. While I'd probably take the 2070 super for $700 total over the used 1080 ti at $600, I'm not paying $264 more for what's currently an inferior GPU to the 1080 ti. The 2080 ti is obviously out of the question, stupidly expensive.

I feel your 'price pain'.

In the UK, my dual 1080 Tis cost me 2,100 GBP (900 each for the GPUs, and ~150 each for the EK waterblocks & aftermarket pads). So imagine my head-shaking when I hear people talking like they are worth $400ish USD. I can't imagine changing these for many years to come. And like a perfectly functional old Audi that some claim is worth no money, I'd rather hang on to something useful that meets my needs than dump it for chump change.

My RTX 2070 can't hold a candle to them either. If I start gaming on the RTX 2070 I feel an 'uugh', then pull the cable from the monitor and immediately switch PCs. Then it is smooth as butter again.
 
If you're looking for a 2-slot 5700 XT, the ASRock Challenger does the trick. It's 2-slot height, less than 305mm length, unsure of width, but it does fit my Ghost S1 and will fit my Sentry 2.0.

Currently in an NZXT H200.
Less heat than my monster ASUS ROG laptop, though fan noise is just about equal to it. Since I'm either listening to Spotify or in Discord (prob both) on speakers or headset, the noise is a non-factor for me.

If anyone is wondering about an ITX build
3700x cpu
16gb 3600 ram
5700xt ASRock Challenger
x570 I Gigabyte Aorus Mobo
1tb M.2 NVME
cost me about 1200 USD but I got lucky w/CPU and GPU (microcenter <3)
I think MSRP on these parts is closer to 1400
Performance metrics are flawless for my uses, mostly playing games at ultra 1440p with Discord + Audible + Spotify + YouTube + Chrome + AIDA64 + Rainmeter + Wallpaper Engine + Steam running in the background.

Also, I agree that $350 for a 1080 Ti is the best value; barring that, a 5700 XT for $400.
If the Nvidia "features" are worth it to whoever reads this, that's your cost analysis to do, but nothing in this tech game is future-proof. To me the outstanding, long-lasting performance of the 1080 Ti is an outlier. But what do I know? I'm just a shiny penny looking for a shoe to get stuck to.
 
I disagree. I had a GTX 1070, and recently sold it to a friend in order to buy a used GTX 1080 ti. Both ASUS ROG Strix. I think it was well worth the upgrade, but prices for video cards are a little different in Canada. We tend to see higher prices than just the conversion rate, and if we order from American sites we run the risk of customs fees making it not worth it. Anyway, I think I bought the 1070 2.5 years ago for about $640 CAD after tax. Sold it to my buddy for $350 just the other day and bought a used but still under warranty 1.5 year old 1080 ti for $600 CAD. So basically 2.5 years later I upgraded for $250. My buddy is happy as it's a nice upgrade for him and I get a 50-60% performance boost. I consider this well worth it if you want to play at 4k or VR, and in the past year I've picked up both VR and a 4k HDR tv.

For reference, in Canada a Strix 2070 super is $864 after tax and Strix 2080 ti is $1920 after tax. While I'd probably take the 2070 super for $700 total over the used 1080 ti at $600, I'm not paying $264 more for what's currently an inferior GPU to the 1080 ti. The 2080 ti is obviously out of the question, stupidly expensive.

Well, firstly, why would you upgrade at all unless you bought a 4K/144Hz monitor (expensive) and/or a Valve Index (requires at least a 1070)? And before anyone says 'because it's better', please show me sub-60fps benchmarks with the 1070 on anything other than 4K. I got my brand new EVGA 1070 (P4 5173-KR version) last October from Amazon.com (I'm in Canada too) for $288 USD...worked out to about $417 CDN after all the exchange/fees/tax. Gave my 1060 3GB to my son to replace the 950 I had in his (double the speed upgrade for him).

Next, why would you buy a card now? Black Friday deals will be starting right away (October-ish) and there are great deals to be had (see above). Also, your buddy is just plain dumb, sorry. I can get a GTX 1660 Ti brand new off Newegg.ca right now for $350 CDN (I see they jacked the price up $30 recently...they'll come down), and it is slightly faster than my 1070, not to mention more power-efficient and with a full warranty. That card will likely be at least $50 cheaper right away too.

I guess that's what surprises me the most...that there are suckers out there buying old cards for so much cash when there are better replacements for the same $. Now if I already had a 1080Ti, of course I wouldn't upgrade...more suckers with more disposable $ than brains (the ones that upgrade for no other reason than nerd bragging rights).
 
https://youtu.be/kJiZpNElTT0

Most 1080 Tis can be OCed 15-20% with the XOC BIOS (removed power limit), while the 5700 XT can be OCed 7% with watercooling and the PowerPlay mod. Sure, OC might not be everyone's cup of tea, but AMD fans have been banging on about how overclockable Vega 56 was. Now that the 5700 XT barely has any OC headroom left, it's all about price to performance or performance per mm² of die space, hahaha.
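The overclocking-headroom comparison above works out as follows. The percentages are the post's claims (XOC BIOS on the 1080 Ti, watercooling plus the PowerPlay mod on the 5700 XT), not measured results, and the sketch assumes both cards start from the same stock baseline:

```python
# Rough sketch of the claimed OC headroom above (poster's figures, not
# measured data): 15-20% on the 1080 Ti vs 7% on the 5700 XT.
oc_1080ti_low, oc_1080ti_high = 0.15, 0.20  # claimed 1080 Ti OC gain range
oc_5700xt = 0.07                            # claimed 5700 XT OC gain

# Relative to a shared stock baseline, the 1080 Ti would pull a further
# 8-13 percentage points ahead once both cards are overclocked.
extra_low = (oc_1080ti_low - oc_5700xt) * 100
extra_high = (oc_1080ti_high - oc_5700xt) * 100
print(f"{extra_low:.0f}-{extra_high:.0f} points")  # 8-13 points
```

That gap is the whole argument: if the claimed headroom holds, overclocking widens the 1080 Ti's lead rather than letting the 5700 XT close it.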
 