AMD Radeon RX 6600 XT Review: Diminishing Returns

Agreed on that too. In the monthly Q&As they even addressed these things, specifically citing the scheduler and CPU overhead as being an AMD talking point, so I'd have also expected them to touch on what AMD has that is a potential talking point or advantage. Especially given the lower end of the scale/price point this card occupies, people are less likely to have a super high-end system powering it compared to high-end cards.
Yup. I am curious to see how a 6600 XT or 6700 XT would perform vs. a 3060 (Ti) as upgrade options for my 2700X + B450 combo. AFAIK the board does support SAM with the latest BIOS.

The 3060 Ti and 6700 XT are my target class as I want to move to 1440p; however, I do not know what the actual performance benefits would be.

I guess the same goes for the 3060 / 6600 XT target group who are probably on lower end hardware. It would suck to upgrade to one of these just to discover that you end up with the same performance as a Polaris card due to limiting factors.

While this is mentioned as a caveat for the 6600 XT (PCIe 3.0 bandwidth), I do not recall the software scheduler being mentioned in the same way in the 3060 review, but of course I could have missed that.
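On the SAM side of things, if you're on Linux you can at least sanity-check whether Resizable BAR is actually in effect on a given card before deciding whether an upgrade is worth it. Here's a rough sketch under the assumptions noted in the comments; it only tells you whether the large BAR is exposed, not how much it helps:

```python
# Rough Linux-only sketch: guess whether Resizable BAR / SAM is in effect by
# checking the size of BAR 0 on AMD display devices (vendor 0x1002).
# Assumption: the first line of the sysfs "resource" file is BAR 0 (true for
# typical GPUs). With SAM off, BAR 0 is usually 256 MB; with SAM on it is close
# to the full VRAM size. This only shows the large BAR is exposed, not that it
# actually improves performance.
from pathlib import Path

for dev in Path("/sys/bus/pci/devices").iterdir():
    try:
        vendor = (dev / "vendor").read_text().strip()
        pci_class = (dev / "class").read_text().strip()
        bar0 = (dev / "resource").read_text().splitlines()[0].split()
    except OSError:
        continue
    if vendor != "0x1002" or not pci_class.startswith("0x03"):
        continue  # skip everything that is not an AMD display device
    start, end = int(bar0[0], 16), int(bar0[1], 16)
    size_mb = (end - start + 1) // (1024 * 1024) if end > start else 0
    verdict = "likely SAM/ReBAR active" if size_mb > 256 else "standard small BAR"
    print(f"{dev.name}: BAR0 ~ {size_mb} MB -> {verdict}")
```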
 
Definitely agree, but the interesting part is that AMD extras like a Resizable BAR implementation that actually gives a performance benefit (SAM) and the hardware scheduler (vs. Nvidia's software scheduler), which should make Radeon cards perform better with older/weaker CPUs, are neither mentioned nor reviewed.

It could be that, e.g., SAM gives no benefit on the 6600 XT, or that this particular model does not perform better with older CPUs, but since this was neither addressed nor tested, I have no idea.

But either way, these are AMD-specific features/advantages, just like DLSS and RT are Nvidia ones.
Yep. It reminds me of the RTX 2060 reviews, which ALWAYS mentioned ray tracing, even though their own benchmarks showed that the 2060 wasn't capable of it in any useful measure. I include ALL of the tech press in this. We are seeing the same nonsense with BS MSRP figures.
 
Not at all a disaster. The 2060 was and still is a fine card, and it set a decent entry level for RT. Two and a half years on, it can still pretty much run everything out there if you're realistic with your settings.
Did you just say that the RTX 2060 was NOT a disaster for Ray-Tracing?

*NVIDIA FANBOY ALERT!*

You just made an excuse for terrible performance from a GeForce product while trashing a Radeon product for the same thing.

Sorry but I can no longer take your posts seriously.
 
Not a disaster for Ray-Tracing?

*NVIDIA FANBOY ALERT!*
What's with the hysterics?

I reiterate: The 2060 will run every game out there that supports ray tracing, provided you are reasonable with your settings. In fact, despite being released over 2 and a half years ago, and costing less at launch, the 2060 is actually stronger in RT than the 6600 XT. What disaster?
 
Meanwhile, in the real here and now, NOT fairy-tale MSRP figures:
RTX 3060 Ti = £585
RTX 3060 (12GB) = £590
RX 6600 XT = £390
Prices current and available for sale at Ebuyer UK.
Given that the RX 6600 XT is roughly 30% cheaper than the 'mighty' 3060 Ti and within 10% of its performance at 1080p, I would say it's a bit of a bargain. Steve might want to revise his review.
I had to go and see this for myself... but, sure enough, it's 100% accurate:
Asrock Radeon RX 6600 XT Challenger D 8GB OC Graphics Card: £389.99
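Running those numbers quickly, as a back-of-the-envelope sketch (the "within 10% at 1080p" figure comes from the post above, not from any new testing):

```python
# Back-of-the-envelope check of the price/performance claim, using the listed
# Ebuyer prices. The "within 10% at 1080p" figure is taken from the post above,
# not measured here.
price_3060ti = 585.0   # GBP, RTX 3060 Ti
price_6600xt = 390.0   # GBP, RX 6600 XT

discount = (price_3060ti - price_6600xt) / price_3060ti
print(f"RX 6600 XT is {discount:.0%} cheaper")            # ~33% cheaper

rel_perf = 0.90        # assume the 6600 XT delivers ~90% of 3060 Ti 1080p performance
gbp_per_perf_3060ti = price_3060ti / 1.0
gbp_per_perf_6600xt = price_6600xt / rel_perf
print(f"GBP per unit of 1080p performance: 3060 Ti {gbp_per_perf_3060ti:.0f}, "
      f"6600 XT {gbp_per_perf_6600xt:.0f}")                # ~585 vs ~433
```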
 
Yup. I am curious to see how a 6600 XT or 6700 XT would perform vs. a 3060 (Ti) as upgrade options for my 2700X + B450 combo. ...
I have a crazy idea for you, assuming that your display is a standard 60Hz one: try using a big-screen 4K TV. The reason I say this is that televisions seem to have some kind of built-in, hardware-based upscaling tech that works incredibly well. When I bought my 4K TV, I ended up getting a (less than $500 CAD) Haier 55" from Costco because I did not want a "Smart" TV from Samsung or LG (or anyone else for that matter), both because of all the stories of them spying on their owners and because I was going to have my PC hooked up to it anyway, rendering the "Smart" features moot. My TV has no Ethernet port or WiFi (or any other way to access the web), so it literally can't spy on me.

What I discovered is that, no matter what resolution I set the game (in this case, Far Cry 5) to, the picture I got was 2160p in all its glory. I didn't realise it at first because I had actually tried setting games to 2160p to see how my R9 Fury could handle it. It looked amazing (as I expected) but (as I also expected) some games were more or less unplayable at 4K, depending on where in the game map I was. I tried turning it down from 2160p to 1440p and noticed that it still looked amazing, with much better frame rates. I was a bit mystified, however, because it seemed to look exactly the same as 2160p. I went back and forth a couple of times between 1440p and 2160p to see if I was imagining things. As it turns out, I wasn't.

I confirmed this when I turned the res down to 1080p and it looked the same again. I thought to myself "This is impossible because people are always raving about how good 4K looks!" and decided to make a (somewhat) more scientific test. What I did was run the Far Cry 5 in-game benchmark and put my face less than 1m away from the panel so I could focus on specific graphics-related things like tree/foliage textures, vehicles and water.

I ran the benchmark at 1080p, 1440p and 2160p and used FRAPS to confirm that there was at least a frame-rate difference between the three scenes. By doing the test this way, I'd know for certain that the PC side of things was doing EXACTLY what it was supposed to. Well, I can say that the (three, which became six, which became eight, which became ten) benchmark runs all looked 100% identical. Since it was just the benchmark and not actual gameplay, the scene was rendered (seemingly) smoothly at all resolutions, but FRAPS told a different story.

At 1080p, the benchmark averaged 75fps while at 1440p, it lost 20fps on average. At 2160p though, it averaged only 30fps although it didn't really stutter at all. I had the graphics settings turned up so that I could see the textures in the foliage and the water and even with my face up-close, I couldn't tell one from the other. I was puzzled because I couldn't understand why a 1080p rendering blown up to fit a 55" panel looked just as good as a 2160p rendering. I dropped 1440p from my test resolutions thinking that maybe it was playing tricks on my eyes by lowering the rendering gradually but back and forth (twice) between 1080p and 2160p was completely indistinguishable despite the fact that I was trying hard to spot anything that wasn't 100% the same.
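(For anyone who wants to repeat this a bit more rigorously, here is a rough sketch of how you could pull the average FPS out of a FRAPS frametimes log instead of eyeballing the overlay; the filename and the exact column layout are assumptions on my part, so check what your FRAPS version actually writes.)

```python
# Rough sketch: compute the average FPS of a run from a FRAPS "frametimes" log.
# Assumptions: the log is a CSV with a header row, and the last column of each
# row is the frame's timestamp in milliseconds since the benchmark started.
import csv

def average_fps(path: str) -> float:
    times_ms = []
    with open(path, newline="") as log:
        rows = csv.reader(log)
        next(rows)                      # skip the header row
        for row in rows:
            if row:                     # ignore blank lines
                times_ms.append(float(row[-1]))
    frames = len(times_ms) - 1          # number of frame intervals logged
    seconds = (times_ms[-1] - times_ms[0]) / 1000.0
    return frames / seconds

# Hypothetical log filename, purely for illustration:
print(f"{average_fps('FarCry5 frametimes.csv'):.1f} fps average")
```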

That's when I realised, "OH YEAH, IT'S A TV!" because TVs have to take an HD (or lower-res) signal and turn it into a resolution that better fits the panel without looking awful. It would seem that my TV does the same thing with what my PC sends it. So with all of these people going on and on about DLSS, I say: just use a goddamn TV instead of a monitor and you'll have it already! There's a limit to this of course, because TVs will never refresh as fast as gaming monitors (because there's no need for them to do so).

While there are 240Hz TVs out there, I can't comment on whether or not they can upscale like that at 240Hz because my TV is only 60Hz and I've never used a 240Hz TV for gaming. Having said that, I don't know of any media that is encoded at 240fps because that would just make overly huge video files for no reason (unless it was a master copy that doesn't get distributed).

Sure, a high FPS rate can (and does) give better results in high-speed gaming like e-Sports but for watching encoded video media, it offers no benefit at all. So it's possible that a 240Hz 4K TV may not upscale well at 240Hz because there was no foreseeable reason to do it since I'm sure that they didn't expect someone (like me) to game on a 55" 4K TV.

I have no proof that a 240Hz TV won't upscale well at 240Hz but I figure that if I were making TVs and wasn't expecting them to be used for gaming, I'd only be concerned with how well it upscales at 60Hz since spending extra money for more tech to upscale at 240Hz would be a waste in my eyes. Having said that, I've never been able to understand "corporate logic" so I could be 100% wrong. Regardless, if you already own a 4K TV, it's definitely worth a shot, if only for curiosity's sake.
4K upscaling: everything you need to know about how TVs turn HD into 4K
 
I have a crazy idea for you, assuming that your display is a standard 60Hz one: try using a big-screen 4K TV. ... Regardless, if you already own a 4K TV, it's definitely worth a shot, if only for curiosity's sake.
I was wondering if I just have lousy vision (I do), but I've played several games on my 65-inch Sony 4K TV and couldn't tell the difference between the 1080p, 1440p and 4K settings either... nice to see that it's not my bad eyes, but the TV simply upscaling everything to 4K!
 
A couple of takes from my point of view.

Over and over, the 6600 XT is called garbage because it's only 5% faster than a card that was released at a higher tier (the 5700 XT). Yes, the point about the price difference between the two is valid, the 6600 XT should be cheaper, but these days that means nothing.

AMD says this is a 1080p card, period. So anything the GPU does "better" at anything higher than 1080p is a freebie.
I don't know if I'm willing to give AMD the benefit of the doubt on their 1080p claims there, because they sure as hell weren't pushing the 5700 XT as a "1080p card", and this 6600 XT performs like what I'd expect if ATi had made a Radeon RX 5700 XTX. A manufacturer cannot claim that a card of one generation is ideal for 1440p and then release a more powerful card in the next generation and claim that it's for 1080p.

I'm afraid that AMD has become the victim of their own stupid marketing division because I know for a fact that AMD was pushing the 5700 XT as a 1440p card. That means they can't honestly push a more potent card as being for 1080p.
 
I was wondering if I just have lousy vision (I do), but I've played several games on my 65-inch Sony 4K TV and couldn't tell the difference between the 1080p, 1440p and 4K settings either... nice to see that it's not my bad eyes, but the TV simply upscaling everything to 4K!
I don't think that your eyes are bad if I'm seeing the same thing. I posted essentially the same thing over a year ago but I guess that at the time, nobody else was doing anything similar. Now that you've experienced the same thing, I actually think that you and I have always had what nVidia was flogging as DLSS built into the hardware of our displays!

So wow, yeah. This is pretty exciting news for me because I thought that maybe I was just nuts (it wouldn't be the first time I thought that either :laughing:) but having you experiencing the same thing makes me think that I may have come across something revolutionary that for some reason, nobody in the tech press has been talking about.

Oh man, you know, things like this are far more exciting than hearing someone like Jensen "The Big" Huang droning on and on about ray-tracing. Despite what the big Huang says, it's television upscaling that "just works"!

When I get home from work today, I'm going to see what AC: Odyssey looks like at 720p, because if it's still the same, oh boy, that will be one helluva discovery for people who want to extend the life of their GPUs at this difficult time for anyone who wants/needs a new one!
:laughing:
 
That means they can't honestly push a more potent card as being for 1080p.
Maybe they are counting on games having bigger textures or maybe the damned RT.

Who knows.

What I know is that, class-wise, it's supposed to replace the 5600 XT, not the 5700 XT, and they are saying it's a 1080p card, not a 1440p card.

Where I think it would be a real problem is if they claimed it's a 1440p or 4K card, which is clearly not the case.

All that said, I do respect your opinion and will ask: what do you think they should have done differently?
 
2022:

Introducing the AMD Radeon™ RX 7600 XT graphics card, featuring the breakthrough AMD RDNA™ 3 architecture, powered by our new custom supercharged 64-bit memory interface engineered to deliver the ultimate 720p gaming experience. (Minimum PSU Recommendation 850W)
 
Maybe they are counting on games having bigger textures or maybe the damned RT. ... All that said, I do respect your opinion and will ask: what do you think they should have done differently?
They should have said that the RX 6600 XT is a 1440p card. If the RX 5700 XT can handle 1440p, then so too can the 6600 XT.
 
I was wondering if I just have lousy vision (I do), but I've played several games on my 65-inch Sony 4K TV and couldn't tell the difference between the 1080p, 1440p and 4K settings either... nice to see that it's not my bad eyes, but the TV simply upscaling everything to 4K!
I have an update for you. Assassin's Creed Odyssey looks EXACTLY the same on my TV when set to 720p as it does if I set it to 2160p. Whoa, this could be a big deal going forward! I do believe that I've discovered a LIFE-HACK! :D
 
2022:

Introducing the AMD Radeon™ RX 7600 XT graphics card, featuring the breakthrough AMD RDNA™ 3 architecture, powered by our new custom supercharged 64-bit memory interface engineered to deliver the ultimate 720p gaming experience. (Minimum PSU Recommendation 850W)
Yeah, that's kinda what I was getting at. :laughing:
 
But isn't that the spot for the 6700?
It's whatever they want it to be. Hardware development always out-paces software development anyway. To me, a card isn't categorised as "2160p, 1440p, 1080p, 720p, etc."; a card is categorised as "halo-class, top-end, high-end, mid-range, low-end, or office/HTPC-class". The card itself is suitable for whatever you can make work on it.

Remember, it's only recently that we used multiple resolutions at the same time. It used to be 320x200, then 640x480, then 800x600, then 1024x768, etc. Everyone had a CRT monitor (which meant uber-fast refresh rates) with a 4:3 aspect ratio, which became 16:10, which became 16:9. It used to be that a card could run a game or it couldn't. We didn't have the options to turn things down or up to make it work. That ability is relatively recent in the timeline of the personal computer. This is because PCs weren't considered serious gaming machines by the general public until about 10 years ago. That's why consoles sold so well for so long. Many people had both because their computer wasn't set up for gaming unless it was an Alienware. High-end gaming on PCs was limited ONLY to builders.
 
It's whatever they want it to be. ... Remember, it's only recently that we used multiple resolutions at the same time. It used to be 320x200, then 640x480, then 800x600, then 1024x768, etc. ...
Man, that's a lot of memories in one paragraph!

Which reminds me that in those days, we used to reduce the image size to make the game run faster. Doom was the biggest offender, and the solution was a faster CPU.
Fun times!
 
It's whatever they want it to be. ... Everyone had a CRT monitor (which meant uber-fast refresh rates) with a 4:3 aspect ratio, which became 16:10, which became 16:9. It used to be that a card could run a game or it couldn't. ...
Wasn't the great thing about CRTs their support for multiple resolutions, with none of them being "native" (as opposed to LCDs, which have a fixed native resolution)?

I still remember using different resolutions per game with my CRT (it was a high-end model at the time).
 
AMD took a massive dump on gamers with this card and pandered to miners. Reports are that this is an excellent mining card, but it's clearly a poor gamer's card. They have cheaped out so hard that they even removed 8 PCIe lanes, as most miners only need one; only gamers are impacted by the move to 8 lanes. This card really smells like AMD just maximising their TSMC capacity to get the largest number of products on shelves: quantity over quality. But we knew AMD had de-prioritised gamers at this point, as their Radeon drivers are even shoddier than they normally are.

I find it amusing that everyone is talking about 1080p because AMD marketed it that way. AMD also marketed ray tracing on it, but we don't take that as seriously, do we? AMD's marketing, as per usual, is dreadful. Most of these companies have dreadful marketing. I thought we could have learned to ignore it and just judge the products from an engineering perspective, but here we go.

This is clearly a 1440p card. It doesn't do as well against the competition at 1440p as it does at 1080p, but that doesn't mean it isn't able to play games at 1440p. Don't let AMD gaslight you; they know as well as we do that hardly anyone is buying this for 1080p. They just want the reviews to focus on that, as that's where the card compares best.

Also, the price of $380: yeah, it's high, but if you can get one at this price it would be the best value on the market right now. I don't understand the criticism of this price; it feels like AMD is being more realistic. Reviewers, for some reason, are stupidly comparing against Nvidia's MSRPs and not the actual prices Nvidia cards are selling for. Although personally I would spend more on a 3060 so I don't have to be at the mercy of AMD's frankly appalling driver support.
 
Wasn't the great thing about CRTs their support for multiple resolutions, with none of them being "native" (as opposed to LCDs, which have a fixed native resolution)?

I still remember using different resolutions per game with my CRT (it was a high-end model at the time).
Yes and no.

It is true that they could display fine at different resolutions and refresh rates, but they also had a preferred resolution and a max resolution plus refresh rate.

Certain combinations of the two could produce images that were hard on your eyes, even inducing headaches.

As for the games, for a long time they were stuck at 320x200, so they were pixelated like hell, even though the monitors were sold as 640x480 (VGA) and the games didn't scale up.

Even then, you had to reduce the game window, in Doom for example, to get a decent enough frame rate, which is not really the same as lowering the resolution.
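As a back-of-the-envelope illustration of why shrinking the view window helped, here's a small sketch of the arithmetic (the viewport sizes are hypothetical, just to show the principle):

```python
# Back-of-the-envelope arithmetic: shrinking Doom's view window means the 3D
# scene is only drawn inside a smaller viewport, so the per-frame pixel work
# drops roughly with its area. The viewport sizes below are hypothetical; the
# real game used fixed "screen size" steps.
FULL_W, FULL_H = 320, 200              # the VGA mode the game runs in

def render_fraction(view_w: int, view_h: int) -> float:
    """Fraction of full-screen pixel work for a given 3D viewport size."""
    return (view_w * view_h) / (FULL_W * FULL_H)

for w, h in [(320, 200), (288, 168), (256, 144)]:
    print(f"{w}x{h}: {render_fraction(w, h):.0%} of the full-screen pixel work")
```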
 
Promptly ignored too. Never mind the facts of the matter I guess, and you're the one that gets called a fanboy for it.
Clearly there are some strong feelings out there about Nvidia as a company, most of which have nothing to do with the end-user experience of its products. The dig at the 2060 was both unmerited and just out of place in an article about the disappointment that is the 6600 XT. Anyway, glad you noticed too.
 
Let me waste some more time with this, but I doubt they will understand.

Here are some good points as to why many don't like Nvidia:
What is it about that list of grievances, whether they be (partly) true or imagined, that is in any way relevant to this review of the new AMD RX 6600 XT GPU?
 
What is it about that list of grievances, whether they be (partly) true or imagined, that is in any way relevant to this review of the new AMD RX 6600 XT GPU?
As I said, let me waste some time.

The answer confirms it.

No need to waste more.
 