Nvidia GeForce RTX 3080 Review: Ampere Arrives!

Thanks for the review.

Looks like most of the performance gain moving from the 2080 to the 3080 was achieved by raising power consumption. TDP is up 49%, which is pretty much the same as the average performance gain.

I'm a bit underwhelmed, though it's certainly a MUCH better value prop than the 2080 Ti was two years ago. And it makes 4K gaming very playable.
 
Rename your site to AMD Unboxed. Intentionally using the 3950X, which is a massive bottleneck at any resolution under 4K, and even at 4K depending on the game.
You clearly can't read: there is a 0% difference between the 3950X and the 10900K at 4K. Look at the benchmarks above! Trolls.

TechSpot, one of the most popular games in the world is COD Warzone and you guys don't include benchmarks for it. It's literally the only game a lot of us play and you ignore it completely. I also don't understand the lack of 1080p testing. You reviewed a 360 Hz 1080p monitor yesterday. I have a 280 Hz 1080p monitor and desperately want to see benches! I get that some of your editors don't care about high-refresh gaming, but a LOT of us do!
 

1- The average performance gain is 68% at 4K; 1440p is clearly CPU limited. Also, the power consumption test was done at 4K, so 1440p is irrelevant here.

2- The RTX 3080 consumes 27.8% more power than the RTX 2080:
https://static.techspot.com/articles-info/2099/bench/Power_PCAT.png

while performing 68% faster. In other words, the RTX 3080 is much more efficient than the RTX 2080.

The gap is much smaller if you compare it to the RTX 2080 Ti, but the RTX 3080 replaces the RTX 2080, not the RTX 2080 Ti.
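The efficiency claim is easy to sanity-check with quick arithmetic, using the ~68% performance gain and ~27.8% power increase quoted above (both figures from the thread, not remeasured):

```python
# Perf-per-watt sketch using the figures quoted in this thread:
# the RTX 3080 is ~68% faster at 4K while drawing ~27.8% more power.
perf_ratio = 1.68    # 3080 performance relative to the 2080
power_ratio = 1.278  # 3080 power draw relative to the 2080

efficiency_gain = perf_ratio / power_ratio - 1
print(f"~{efficiency_gain:.0%} more frames per watt")  # ~31% more frames per watt
```

So even with the higher board power, frames per watt improve by roughly a third generation-on-generation.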
 
Well, you could argue that the 3080 replaces the 2080 Ti, since both use the 102-class die. The 2080 (and Super) used the 104 die, just like the 3070.

But yes, price-wise it is a 2080 replacement, and the performance-per-dollar improvement over Turing is impressive. This is definitely the upgrade 1080 Ti users have been waiting for.

The 3080 FE seems like a very nice card for the money. Am curious how well AIB cards hold up in comparison.
 
Thanks very much for this article.
OK, I definitely can't ignore that power draw in the Total System Power consumption graph. I seriously need to reassess my current power supply. Gonna need a bigger boat.
 
Using this calculator, https://www.calculatorsoup.com/calculators/algebra/percent-difference-calculator.php, all of your % differences are wrong.

Death Stranding @ 1440p
3080 = 157
2080 = 111
% difference = 34% (not the 41% you claim, but the 3080 is 41% better than the 1080 Ti)

Average @ 1440p
3080 = 171
2080 = 115
% difference = 39% (and not 49% faster)

What in the actual f**k is going on here?

157 is 41% greater than 111.

You're calculating the percentage difference, not the percentage increase.
 

They are using the following calculation (percentage increase, not percentage difference):
Death Stranding @ 1440p
3080 = 157
2080 = 111
% gain = (157 - 111) / 111 * 100 ≈ 41%

Average @ 1440p
3080 = 171
2080 = 115
% gain = (171 - 115) / 115 * 100 ≈ 49%

The same site you linked has another calculator that does it the way TechSpot does: link
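For anyone still confused, here are the two formulas side by side, using the Death Stranding numbers from this thread. Both results match the figures quoted above:

```python
new, old = 157, 111  # RTX 3080 vs RTX 2080, Death Stranding @ 1440p

# Percentage increase: gain relative to the old value. This is what
# review sites (including TechSpot) report as "X% faster".
pct_increase = (new - old) / old * 100

# Percentage difference: gap relative to the average of the two values.
# This is what the linked calculator computes, hence the 34% figure.
pct_difference = abs(new - old) / ((new + old) / 2) * 100

print(round(pct_increase), round(pct_difference))  # 41 34
```

Same two numbers, two different (and both valid) formulas; the disagreement in the thread is purely about which one is being used.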
 
I am definitely not paying $700 for a graphics card. I might get the RTX 3070, but I think I'm going to wait and see what AMD has to offer first. If the tensor cores really don't show that much of an improvement in ray tracing efficiency, and the gains only come from the faster GPU, then the RTX 3070 will probably land right at the RTX 2080 Ti level across the board, except with better power efficiency.

My guess is that AMD will not have anything competitive with the RTX 3080, similar to the situation now, but will likely have a card that is competitive with, and perhaps less expensive than, the RTX 3070. The flagship RX 5700 XT is only a 9.75 TF card, so the 3070 is 2x that card, while the 3080 is 3x, at least on paper.

The efficiency of AMD's ray tracing solution will be interesting. Console developers are seemingly stating it carries a relatively low performance hit. If AMD truly has a solution where ray tracing doesn't cost as much performance-wise as it does on RTX cards, AMD might be able to compete even with less powerful GPUs, since the RTX performance hit is so high. It's doubtful AMD has such magic up their sleeve, but we shall see.
 
Great review as always! Thank you!
Any guess if the underwhelming improvement in proportional RT performance is due to drivers, or do you think it's inherent to the architecture?
 
The steep power draw has to be down to Samsung's 8nm process. I bet these chips were originally destined for TSMC's 7nm, probably a slightly better node for high-performance GPUs, before AMD crowded it out.

Samsung came in and offered Nvidia a cheap manufacturing deal they couldn't refuse.
I can envisage a half-node/silicon revision in another year, with a mild performance gain and a moderate power consumption drop from a more refined stepping. Kind of like the Tesla-era GTX 285.

Nevertheless, the RTX 3080 has plenty of muscle at 4K and is definitely ready to brush aside the new consoles. It will probably be on my shopping list since I've been waiting for an upgrade, though only after I've seen what AMD can offer in another month. That gives time for stock and prices on Nvidia's cards to settle anyway.
 
In the cost/frame graph, why is the 2070 Super so high? It has higher performance than regular 2070 and both are listed at $500. For example, in 1440p it gets 106 fps. At $500, that should be $500/106=4.72 $/frame but the listed value is $6.60. Apologies if I missed something, great review!
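The naive cost-per-frame math the commenter is doing looks like this (the $500 price and 106 fps are the commenter's own figures; the review's table may use a different street price or a weighted average, which could explain the gap):

```python
price = 500      # assumed RTX 2070 Super price in USD (commenter's figure)
avg_fps = 106    # 1440p average fps from the chart (commenter's figure)

cost_per_frame = price / avg_fps
print(f"${cost_per_frame:.2f} per frame")  # $4.72 per frame
```

So at those inputs the value is $4.72/frame, not $6.60; if the chart shows $6.60, it is most likely dividing by a different price or a different fps average.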
 
Few things:
1. This is worth upgrading my 1080 Ti for, though I am not sure if I will; I might be able to wait for the 4x series. Cyberpunk will probably determine whether I make the jump now or in 12-18 months when the 4x-series cards come out. My PS5 will probably tide me over well until then.
2. I was thinking we would start to see 4K GPU-bound limitations begin to lift in CPU benchmarks, but not yet.
 
I'm impressed, but holding out for the 3090.

It would probably be prudent to re-run these benchmarks when updated drivers are released.
I am impressed but I am waiting for a 3060 review because money!

Since there is no delta between Intel using PCIe 3.0 and AMD using PCIe 4.0, are you assuming there is no advantage to PCIe 4.0? Or could there be a minor boost that allows AMD to be on par at 4K?
 
I’m here at Microcenter now checking for shipments for tomorrow morning.

I will be back at 10AM.

The KING has returned.
 