AMD is showing off more of its own Radeon RX 6000 benchmarks, edging out the GeForce competition

You're counting games that are not even out yet. Read the article again.
Yes, 34 in total, 25 coming, only 9 are out, but that's not a handful either. Besides... Cyberpunk 2077, CoD: Cold War, Dying Light 2; please tell me some big releases that will not support RT.
 
Yes, 34 in total, 25 coming, only 9 are out, but that's not a handful either. Besides... Cyberpunk 2077, CoD: Cold War, Dying Light 2; please tell me some big releases that will not support RT.
Despite its flaws, at the current stage anyway, only you seem to be excited over a tech that drags performance down (for now). Reminds me of all the hoopla over Nvidia's "Hairworks" back in the day.

RT still cuts fps by around 36% on a 3080 @ 4K.

Let's wait and see if it can actually take off. It might, or it might not. For now, people are looking for a fluid gaming experience at 4K. For you RT might be the ultimate option, but for most, it's not.

It'll take a few more generations of Nvidia cards for RT to settle in, if it's the thing at all, for both the high-end and budget cards. If RT is important, it will seep down to the budget cards sooner rather than later.

For now, it's just a tech demo.

And no, I wouldn't make a purchasing decision based on Cyberpunk, a title that I'm not waiting for anyway.
 
I wonder if AMD has prepared some designs to counter Nvidia's possible Super/Ti comeback? God knows what we are going to get from AMD, Nvidia and Intel this winter. These are really exciting times for gamers.
 
Despite its flaws, at the current stage anyway, only you seem to be excited over a tech that drags performance down (for now). Reminds me of all the hoopla over Nvidia's "Hairworks" back in the day.

RT still cuts fps by around 36% on a 3080 @ 4K.

Let's wait and see if it can actually take off. It might, or it might not. For now, people are looking for a fluid gaming experience at 4K. For you RT might be the ultimate option, but for most, it's not.

It'll take a few more generations of Nvidia cards for RT to settle in, if it's the thing at all, for both the high-end and budget cards. If RT is important, it will seep down to the budget cards sooner rather than later.

For now, it's just a tech demo.

And no, I wouldn't make a purchasing decision based on Cyberpunk, a title that I'm not waiting for anyway.
Hairworks was a proprietary gimmick; RT is supported by both AMD and Nvidia and is an important feature of DirectX 12. RT does drop framerates by 30-40%, but it is playable in all games that support it on a 3080 @ 4K, albeit some drop to 30 fps. At lower resolutions it's much better, and most people don't even play at 4K anyway. Calling it a tech demo is lame, but sure, improvements need to be made. An RT benchmark will be a good indicator of futureproofing, as it stresses the card a lot, and that was, actually, my point.
 
An overclocked result is not comparable. AMD is fooling you guys, although I acknowledge that their GPUs are really good this time.
 
Great! The competition is there now. If only we didn't have this pandemic messing with prices and stock, it would be even better.


As for RT, it is still a gimmick. It's ultimately not worth it, as it doesn't improve graphics significantly enough to justify the abysmal performance hit.
 
This is only necessarily true in a pre-Smart Access Memory world. With SAM, PCIe bandwidth could have a VERY notable impact on the performance gain it can provide (because it provides a faster connection between the CPU and the GPU's VRAM).
That's a very good point, and definitely a factor to look for when the first benchmarks start rolling in.

RT does drop framerates by 30-40%, but it is playable in all games that support it on a 3080 @ 4K
I'm not the gamer some of the rest of you here are, but doesn't essentially every option that increases image quality and/or realism lower frame rates? If frame rate were always the overarching goal, why aren't we all playing at 480p with solid-fill textures?

I'm with you. Calling raytracing no more than a gimmick is exceedingly myopic.
 
That's a very good point, and definitely a factor to look for when the first benchmarks start rolling in.

I'm not the gamer some of the rest of you here are, but doesn't essentially every option that increases image quality and/or realism lower frame rates? If frame rate were always the overarching goal, why aren't we all playing at 480p with solid-fill textures?

I'm with you. Calling raytracing no more than a gimmick is exceedingly myopic.

Putting aside any debate about gimmicks, I don't think it adds enough to the gaming experience to be worth so much focus. In anything with more action than a walking simulator, does anyone actually notice the difference? I mean, there's a reason that things like Radeon Boost and variable rate shading exist. People don't tend to notice small detail changes when things are moving.

Making the argument that if we prioritised frame rate we would all be playing at 480p is ludicrous. It totally misses the point of how close current technology is to photorealism without ray tracing, and how incrementally small the difference with ray tracing is. Context matters.
 
Having been one of the lucky few to get an RTX 3070 FE, and coming from an RX 5700, I can say there is a difference in image quality that goes beyond just framerate. The Nvidia card just has better lighting effects. I noticed this in Jedi: Fallen Order, Rise of the Tomb Raider, and even in the Spyro trilogy while my kids were playing it. I don't know if this will change now that AMD has RT cores, but these are not RT games, so my guess is that lighting effects just run better on Nvidia's architecture.

I think you'll see that RX 6800 XT and RTX 3080 performance will be very game dependent. Same for the RX 6800 and RTX 3070, although I think you'll see the RX 6800 outperform the RTX 3070 more often than the two flagships.

The way that Nvidia is using its CUDA cores means that games which use more integer math will perform better on the AMD cards, while games that use less integer math will perform better on Nvidia cards. The RTX 3070 has 2944 full-time FP32 cores and 2944 potential FP32 cores, whereas the RX 6800 has 3840 full-time FP32 cores. This works quite well for Nvidia when you consider that the RTX 3070 can sometimes outperform the 2080 Ti, which has 4352. Interestingly, that's about 75% of the full 5888-CUDA-core potential of the RTX 3070, and given Nvidia's estimate that games average about 25% integer math, it makes sense that these cards perform close to one another. But sometimes that figure will be lower; sometimes the RTX 3070 will only be using maybe 60% of its cores as FP32 CUDA cores while 40% of the cores are doing integer math. This means that the RTX 30 series will likely see huge performance differences depending on the game, while the RX 6000 series should be a good bit more predictable in its performance.
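To make that core math concrete, here's a rough back-of-the-envelope sketch (plain Python, not from the post; the 25% and 40% integer-math shares are just the illustrative figures mentioned above) of how Ampere's shared FP32/INT32 cores stack up against RDNA 2's fixed FP32 count:

```python
# Back-of-the-envelope model from the post above: on Ampere, half of the
# RTX 3070's 5888 CUDA cores are FP32-only and the other half can run either
# FP32 or INT32, so integer work eats into FP32 throughput (the model assumes
# the integer share of the workload stays at or below 50%). RDNA 2's count is fixed.

def ampere_effective_fp32(total_cores: int, int_fraction: float) -> float:
    """Effective FP32 core count when `int_fraction` of the workload is integer math."""
    return total_cores * (1.0 - int_fraction)

RTX_3070_CORES = 5888   # 2944 dedicated FP32 + 2944 FP32-or-INT32
RX_6800_FP32 = 3840     # full-time FP32 stream processors

# 25% is the average integer share Nvidia cites; 40% is the heavy-INT case from the post.
for int_mix in (0.0, 0.25, 0.40):
    eff = ampere_effective_fp32(RTX_3070_CORES, int_mix)
    print(f"INT share {int_mix:.0%}: RTX 3070 ~{eff:.0f} effective FP32 cores "
          f"vs the RX 6800's {RX_6800_FP32} (clock speeds ignored)")
```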

For the 3080, with 4352 full-time CUDA cores and 4352 potential cores paired against 4608 from the AMD card, I could see the 3080 outperforming the RX 6800 XT more often than not, even with AMD's core clock advantage. On the other hand, the RX 6800 has nearly 900 more full-time FP32 cores, which will be more difficult for the 3070 to overcome in most games.
I think the result will be:
3090 > 6900XT > 3080 > 6800XT > 6800 > 3070, but as far as value is concerned, I think the Nvidia cards will still hold an edge with RTX and DLSS performance.
 
I also agree that RT tech is still kind of immature due to the big hit on performance (compared to what it offers). It's getting better though. And calm down, man, nobody is calling it an obsolete tech; it just isn't yet practical in many of today's conditions. No one will want to activate RT in their competitive shooter, for example. You'd activate it only if your card can produce high enough FPS, and only in single-player. Even then, most people would prefer higher FPS for fluid gameplay.

And no, gaming will advance further beyond 4k and even 8k.

Sure I will enable RT in a shooter. If it holds 120-144 fps on my 2K monitor, then absolutely. Normal humans (i.e., most of us) are not going to get a measurable performance difference at 240-360 Hz, and those who do are still doing silly things like playing at lower settings so brush and grass don't render as far and they can maintain 240+ fps @ 1080p. Most of us would much prefer 2K @ 120+ given the option.

As for 4K, 8K, etc., I've seen some of that equipment. People don't buy PC monitors big enough to make rendering a game at 8K worthwhile. A 32" 4K panel is about 138 dpi and a 32" 8K panel is about 275 dpi. A young human with perfect vision (20/20 and no other eye issues) can resolve roughly 300 dpi at 8-10 inches at most. So an 8K display is likely the practical maximum for PC gaming simply due to eye resolution. The more distant the display, the fewer pixels are needed. Sure, you could sit 10 inches from a 60-inch display, see the pixels, and so "need" a higher resolution, but who would want to play 10 inches from a display that large? So the larger it gets, the further back you move and the lower the dpi you need to maintain the illusion of not seeing pixels.

But all of that ignores the performance impact. Just above, you said no one will turn on RT in a shooter because they want FPS. The exact same calculation applies to resolution, but to a much greater degree. The performance cost of 4K over 1080p is immense, and we are only now reaching a point where it makes sense (and even then most of us prefer 2K). The jump to 8K will be another quadrupling of the performance cost, but how much will it improve the visuals? Not a lot. Will it eventually happen? Maybe, though probably via technologies like DLSS. The previous post is absolutely right that RT and similar tech is the way to go now to improve visual fidelity, not raw pixels (except in the case of VR, where a visual field of 8-10K is likely going to be needed to truly achieve the zero-pixel effect, though the Valve Index gets really close).
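For anyone who wants to sanity-check those dpi numbers, here's a quick sketch (plain Python, not from the post; the 1-arcminute visual-acuity figure is the usual rule-of-thumb assumption) that computes a panel's PPI and the maximum PPI a viewer can resolve at a given distance:

```python
import math

def panel_ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch of a panel, from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def max_resolvable_ppi(viewing_distance_in: float, acuity_arcmin: float = 1.0) -> float:
    """Highest PPI a viewer can distinguish, assuming ~1 arcminute of visual acuity."""
    feature_size_in = viewing_distance_in * math.tan(math.radians(acuity_arcmin / 60.0))
    return 1.0 / feature_size_in

print(f'32-inch 4K: {panel_ppi(3840, 2160, 32):.0f} ppi')          # ~138 ppi
print(f'32-inch 8K: {panel_ppi(7680, 4320, 32):.0f} ppi')          # ~275 ppi
print(f'Resolvable at 10 in: {max_resolvable_ppi(10):.0f} ppi')    # ~344 ppi, near the ~300 dpi rule of thumb
print(f'Resolvable at 24 in: {max_resolvable_ppi(24):.0f} ppi')    # ~143 ppi at a typical desktop distance
```

By this rough math, a 32" 4K panel is already close to the resolvable limit at typical desktop distances, which is the point about 8K above.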
 
As for 4K, 8K, etc., I've seen some of that equipment. People don't buy PC monitors big enough to make rendering a game at 8K worthwhile. A 32" 4K panel is about 138 dpi and a 32" 8K panel is about 275 dpi. A young human with perfect vision (20/20 and no other eye issues) can resolve roughly 300 dpi at 8-10 inches at most. So an 8K display is likely the practical maximum for PC gaming simply due to eye resolution. The more distant the display, the fewer pixels are needed. Sure, you could sit 10 inches from a 60-inch display, see the pixels, and so "need" a higher resolution, but who would want to play 10 inches from a display that large? So the larger it gets, the further back you move and the lower the dpi you need to maintain the illusion of not seeing pixels.

I'm glad you mentioned dpi vs size. It's all about optimal viewing distance and viewable area without having to pan.

A 40-inch 4K monitor 18 inches away is not a very pleasurable viewing experience.
 
I'm glad you mentioned dpi vs size. It's all about optimal viewing distance and viewable area without having to pan.

A 40-inch 4K monitor 18 inches away is not a very pleasurable viewing experience.

I bought a 32" 2K because my eyes are not the best and it seems about right. A 34" or 38" wide screen might be ok too but would probably mean id have to sit a bit further back. At 4K id need either a bigger panel or scale the UI for comfortable use. Unfortunately windows still isn't super hot at UI scaling. That said I think for most people (especially younger folks) 32-34" is the sweet spot for 4K at desktop distances.

Microsoft needs to drastically improve UI scaling across every Windows feature to allow comfortable use of a 27-32" 8K panel, IMO. And any older programs that don't play well with scaling would be microscopic!
 
On an AMD X370 chipset motherboard with PCIe 3.0 slots, would it be pointless to install a PCIe 4.0 generation video card?

TechSpot recently ran a PCIe 4.0 vs. PCIe 3.0 GPU benchmark: https://www.techspot.com/review/2104-pcie4-vs-pcie3-gpu-performance/

Their conclusion was "In a nutshell, right now PCIe 4.0 does little to improve performance with the RTX 3080. It’s possible that could change with future games, but for now, it’s a non-issue."

"PCIe 4.0 generation video card" will work just fine.
 
If RT sucks on AMD cards, that will leave a big question mark despite their (at this point, alleged) top performance. Being quiet about it is not a good sign. We'll see.

Ray Tracing is such a niche feature that it doesn't matter to me. Those that own 3080 cards have stated that they are turning off Ray Tracing to get better frame rates. So Ray Tracing is still a work in progress. I think we are two generations away from having graphics cards that can run games with that feature turned on full time.
 
I am super excited to get my 5900X with a 6800XT on an Asus ROG X570 Crosshair VIII Hero (Wi-Fi) ATX motherboard and 64GB of RAM! :)
 
I am super excited to get my 5900X with a 6800XT on an Asus ROG X570 Crosshair VIII Hero (Wi-Fi) ATX motherboard and 64GB of RAM! :)
Planning to build something similar with Zen 3 and a 6800XT too. I guess this is the only thing that seems to brighten up this year.
 
And you can still get one for over $1200!

Yeah, that part is mind-blowing. eBay is outright broken about 75% of the time in my experience. I remember back when it first started, it was a great place to get used items at a deal or find something really bizarre or niche that you needed. It was also good for selling niche items or, in some cases, used items. But the idea of buying something at retail and flipping it on eBay would have been silly and stupid back then.

Now it seems to be a haven for people who are not capable of basic money management or who don't understand the term "used". eBay has single-handedly warped the used market into something really odd.

I sold my 1080 Ti for $450 the day of the 30-series announcements. I listed it as Buy It Now for a few percent below the average, and it got snapped up within hours. So I upgraded from a four-and-a-half-year-old GPU to a brand new one that's twice as fast for $300!
 
Ray Tracing is such a niche feature that it doesn't matter to me. Those that own 3080 cards have stated that they are turning off Ray Tracing to get better frame rates. So Ray Tracing is still a work in progress. I think we are two generations away from having graphics cards that can run games with that feature turned on full time.

I disagree. I'm easily getting 100-120 fps in Wolfenstein: Youngblood @ 2K, ultra, RTX on, DLSS off. With a G-Sync/FreeSync monitor, that's plenty of FPS for most games. Some players play with $800+ video cards on 1080p monitors and turn graphics down to medium or low for the "competitive edge", but they're far from the majority.

Now that it's on consoles and AMD hardware, I expect to see it explode in use once the major game engines have solid support for it. It's just too easy to use from a developer's point of view. I also expect to see it used more in less demanding RPG, strategy, RTS, side-scroller/roguelike and turn-based titles soon.
 