GeForce RTX 3080 vs. Radeon RX 6800 XT: 50 Game Benchmark

Also, plenty of games look better with DLSS than they do at native because of TAA. Saying you would never use DLSS is akin to saying you don't care about image quality. Well, okay I guess.

As I said, to each his own. I'm not one who thinks upscaling looks better than native. You do, since you use DLSS, so I wouldn't expect otherwise.
 
I can upload three screenshots for you, and I'd bet a paycheck you'll pick the DLSS Quality shot as the native one in games with TAA, like, say, CoD Cold War. TAA sucks, that's just a fact.
 
I don't need to see it, as I don't care that much about it. As I said, to each his own; you like DLSS and there's no issue with that.
 
I couldn't imagine buying a flagship GPU for over $1,000 and not getting good ray tracing support and DLSS. In particular, DLSS is awesome; I use it all the time. In some games it even makes things look better, and in most games it makes motion look much clearer; there are no trails like you get from TAA. The Radeon is odd: it seems best at 1080p, but I couldn't imagine spending this much money on a 1080p card. It would cost like five times as much as the monitor itself!

Also, after years of being infuriated with AMD's driver support, I don't want to go Radeon again for at least a little while. GeForce all the way for me, until they inevitably let me down. But it's been 3 years back on Nvidia, and so far it's been vastly superior in terms of driver support to the previous 8 years I spent on Radeon cards.
Thanks for that. I have been with Nvidia for about six years now because I really hated AMD's driver support, and wondered if it had improved over the years. Nvidia is just so easy; I've never had a failure.
 
Rather than listening to that obvious fanboy who rages in every thread or article that dares to present anything AMD in anything but a bad light, watch someone who has had recent experience with both:

 

I am confused: per the always-correct Hardware Unboxed, a 6900 XT cannot touch the new 3080 12GB model, yet this other reviewer has a 6800 XT matching or beating it.

Now, good points on RT, which match my impressions of it. As for DLSS, personally I refuse any Nvidia tech that locks me to their hardware, so that's an automatic nope from me.
 
And that TDP on the 3080 is for the FE model at that.
The EVGA FTW3 Ultra variant easily hits 380 W under load and will push up to 400 W with a simple OC, with the option to install a 450 W LN2 OC BIOS.

/quite happy with it



Again, for FE models.
Typical in-game clocks for me are anywhere between 1835 and 2040 MHz.

I've managed a 2100 MHz OC on mine (on air, at 70°C die / 82°C hot spot), but only for a minute or two before it downclocks to ~1980 MHz and fluctuates somewhere in between, if not back up to 2100.

Not sure I really have a point with this post; I guess it works in showing that there's quite a variance between the manufacturer's default spec numbers and what most users will actually see.
It really is an impressive little piece of hardware, considering just how much overhead it has compared to default specs. Shame prices are still so jacked.

Edit: it also shows that yes, AMD is more efficient if you're comparing TDP to clock speed, but as already mentioned, given the differences between the two architectures that's not an entirely accurate comparison.
I feel that if we are really looking at architecture efficiency, it will be very difficult to measure. Assuming a game where the RTX 3080 and RX 6800 XT perform quite closely in a pure rasterisation scenario, we can generally observe that the RX 6800 XT tends to draw less power (out of the box) than the RTX 3080. But here, we are looking at efficiency from the standpoint of the full package.
However, if we consider that AMD had a fairly significant advantage by being on TSMC's 7 nm versus Nvidia on Samsung's 8 nm (which is really a derivative of Samsung's 10 nm node), the Nvidia architecture seems more efficient, which allowed them to make up for the fab advantage that AMD enjoyed. Also, when RT comes into play, the power consumption relative to performance swings fully in Nvidia's favour.
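For what it's worth, "performance per watt" is just average FPS divided by board power. A quick sketch with made-up illustrative numbers (not measured results) shows how the framing works:

```python
# Toy performance-per-watt comparison. All figures below are
# hypothetical placeholders; real numbers vary per game, per card,
# and per power-measurement method.

cards = {
    # name: (average_fps, board_power_watts) -- illustrative only
    "RTX 3080 (raster)":   (140, 320),
    "RX 6800 XT (raster)": (142, 290),
    "RTX 3080 (RT on)":    (80, 320),
    "RX 6800 XT (RT on)":  (55, 285),
}

for name, (fps, watts) in cards.items():
    print(f"{name:22s} {fps / watts:.3f} fps/W")
```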
 
I am confused: per the always-correct Hardware Unboxed, a 6900 XT cannot touch the new 3080 12GB model, yet this other reviewer has a 6800 XT matching or beating it.

Now, good points on RT, which match my impressions of it. As for DLSS, personally I refuse any Nvidia tech that locks me to their hardware, so that's an automatic nope from me.
I think Hardware Unboxed tests with Resizable BAR / Smart Access Memory disabled, don't they? Maybe that's a difference.
 
You are right.

And at the same time, since both GPUs support it, maybe they should test it that way.
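If you want to sanity-check your own setup, here's a rough way to see whether Resizable BAR is actually active on Linux. It assumes the `lspci` utility is installed and that your GPU exposes the PCIe Resizable BAR capability; run it as root to get the full capability listing:

```python
# Grep `lspci -vv` output for the Resizable BAR capability and the
# current BAR size. A BAR sized to (most of) the card's VRAM, rather
# than the legacy 256 MB window, indicates Resizable BAR is in effect.

import subprocess

out = subprocess.run(
    ["lspci", "-vv"], capture_output=True, text=True, check=True
).stdout

for line in out.splitlines():
    if "Resizable BAR" in line or "BAR 0: current size" in line:
        print(line.strip())
```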
 
Not a common thing to do, but if you just undervolt AMD cards by -50 mV they tend to shine, and this isn't something most reviewers touch on. Higher clocks, lower temps, lower power draw; pretty much a win-win, and it blows Nvidia out of the water.
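For the curious, here's roughly what that looks like on Linux with the amdgpu driver. This is a sketch, not a guaranteed recipe: it assumes an RDNA 2 card on a recent kernel with overdrive enabled via the `amdgpu.ppfeaturemask` boot parameter, that `card0` is the dGPU, and root privileges; the exact command syntax for `pp_od_clk_voltage` varies by GPU generation, so check your kernel's amdgpu documentation. On Windows you'd use the Radeon software or MorePowerTool instead.

```python
# Rough sketch: apply a -50 mV GPU voltage offset on an RDNA 2 card
# via the amdgpu overdrive sysfs interface. Paths, syntax, and the
# offset value are assumptions; verify against your kernel docs first.

PP_FILE = "/sys/class/drm/card0/device/pp_od_clk_voltage"

def write_od(cmd: str) -> None:
    # Each command is written in its own open/close, mirroring
    # `echo "<cmd>" > pp_od_clk_voltage` from a root shell.
    with open(PP_FILE, "w") as f:
        f.write(cmd + "\n")

write_od("vo -50")  # request a -50 mV voltage offset
write_od("c")       # commit the change

# Read the table back to confirm the offset took effect.
print(open(PP_FILE).read())
```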

 
Yeah, because you can't undervolt Nvidia cards. Oh wait...
 
And what is Nvidia's performance with undervolting vs. AMD? Still piss-poor in terms of efficiency.

I can hit a sustained 2600 MHz with an undervolt to 1150 mV and under 300 W power draw with my 6900 XT Red Devil Ultimate on air.
 
Have you actually tried, or are you talking out of your ***?

What exactly makes you think that AMD is better at undervolting? Wishful thinking?
 
You asserted something, I asked for some evidence, and you come back with this? Gotcha: wishful thinking.
 
Not a common thing to do, but if you just undervolt AMD cards by -50 mV they tend to shine, and this isn't something most reviewers touch on. Higher clocks, lower temps, lower power draw; pretty much a win-win, and it blows Nvidia out of the water.
A review's task is to review the product "as is", i.e. the out-of-the-box experience. Reviews covering OC and UV add value, but don't guarantee that every card can achieve the same results. Reference cards directly from Nvidia and AMD are the best gauge, and out of the box the RX 6800 XT is definitely less power hungry than the RTX 3080. But I see no reason to get so hung up on power consumption, since Nvidia cards perform well in some titles, which gives them the edge from a performance-per-watt perspective there; likewise for AMD's cards elsewhere. Furthermore, if you are into RT, there is no doubt that Nvidia cards are better in RT games from a performance-per-watt perspective.
 
Not only that, but undervolting depends on the workload you're running (game/settings/resolution). I can run 2100 MHz (which is about the limit for a 3090) at 320 W running AC Valhalla at 1440p, but if I try to run Metro Exodus at 320 W the clocks drop to 1600 MHz.
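As a back-of-the-envelope illustration of why that happens: dynamic power scales roughly with switching activity × frequency × voltage², and voltage climbs with frequency along the DVFS curve, so power grows roughly with the cube of the clock. A heavier workload keeps more of the chip busy (higher activity), so the same power cap buys fewer MHz. A toy model with made-up constants, not a simulation of any real GPU:

```python
# Toy model: sustained clock under a fixed board-power cap.
# Dynamic power ~ activity * k * f^3 (V assumed to rise roughly
# linearly with f). The constants below are invented so the outputs
# echo the ~2100 MHz vs ~1600 MHz observation above.

POWER_CAP_W = 320.0

def sustained_clock_mhz(activity: float, k: float = 7.8e-8) -> float:
    # Solve activity * k * f**3 = POWER_CAP_W for f (f in MHz).
    return (POWER_CAP_W / (activity * k)) ** (1 / 3)

# A lighter renderer vs. a compute-heavy one keeping more units busy.
print(f"light workload: {sustained_clock_mhz(activity=0.45):.0f} MHz")
print(f"heavy workload: {sustained_clock_mhz(activity=1.00):.0f} MHz")
```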

The guy basically made up that AMD is better at undervolting, and when I asked for clarification he started personally attacking me... absurd.
 