AMD Radeon RX 6600 XT Review: Diminishing Returns

“Expect to pay a hefty premium due to the shortages, cryptomining and scalping going around.”

Micro Center stores around me have shelves filled with AMD cards; it is very easy to get one right now. Unfortunately, the scalper pricing is quite high and they sit well above MSRP.

6900 XT Red Devil = $2,299
6700 XT Red Devil = $929, Hellhound = $939

The Nvidia cards sell above MSRP but the AMD cards sit on the shelf.
 
“While AMD would like to tell you the 6600 XT is a 1080p gaming graphics card, the fact is many of you will want to use a ~$400 graphics card at 1440p and frankly that’s not asking much. At this resolution the new GPU slips further behind the 3060 Ti, trailing it by 16% and showing it isn't much faster than the standard 3060”

That right there sums it up for me.
 
To me, AMD nailed it with their high-end RDNA2 GPUs, i.e. the 6900 and 6800 series. Everything below, starting with the 6700 XT, is not good. AMD gimped the GPU specs too much, so much so that it performs really poorly compared to Nvidia. The worst part is not the performance but the price: there are no bad products, only bad prices. This card should rightfully be priced around an RTX 3060, if not lower. Performance is competitive in a few titles, but in general it lands around an RTX 3060. Hardware-wise it has lower memory bandwidth and 50% less VRAM, and when it comes to value-added features like RT, it falls flat. What's surprising is that AMD further gimped the card by limiting it to PCIe x8. That's the dumbest thing they could do when they themselves are still shipping PCIe 3.0 on their recent Ryzen 5000 series APUs.
 
Nice to see that AMD won't let Nvidia have all the bad-press 'glory': their answer to Nvidia's ridiculously small amounts of VRAM is crippling the card for anyone not on PCIe 4.0, which, given the supposedly midrange aim, should be *plenty* of AMD customers still on the Ryzen 3000 series.
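For anyone wondering how big that link-bandwidth hit actually is, here's a rough back-of-the-envelope sketch in Python (it only accounts for the 128b/130b line encoding; real-world throughput is lower still due to protocol overhead):

```python
# Theoretical per-direction PCIe bandwidth: transfer rate x encoding x lanes.
# Gen3 runs at 8 GT/s per lane, Gen4 at 16 GT/s; both use 128b/130b encoding.
def pcie_gbs(gts_per_lane: float, lanes: int) -> float:
    encoding = 128 / 130                         # line-encoding efficiency
    return gts_per_lane * encoding * lanes / 8   # divide by 8: bits -> bytes

print(f"Gen4 x8  (6600 XT as designed):     {pcie_gbs(16, 8):.1f} GB/s")  # ~15.8
print(f"Gen3 x16 (e.g. 5600 XT, same slot): {pcie_gbs(8, 16):.1f} GB/s")  # ~15.8
print(f"Gen3 x8  (6600 XT on a Gen3 board): {pcie_gbs(8, 8):.1f} GB/s")   # ~7.9
```

In other words, an x8 card on a Gen3 board gets half the link bandwidth it was designed around, which is exactly the scenario the Doom result later in the review demonstrates.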
 
This card's performance makes it essentially an RX 5700 XT. Sure, it's got ray-tracing capability, but that's still just a frill. At this point, I'm still not convinced that ray tracing will become the standard because it's just not enough of an improvement over rasterization to justify the extra hardware and coding requirements to implement it. At least, not at this time, nor for a good while looking forward.
 
So AMD has just priced itself out. Big deal; they weren't that good even before. I went for the RTX 3080 Ti not only because it beats the 6900 XT in most tests, but also because it is excellent at deep learning, unlike AMD products, while on average the RTX 3080 Ti costs less than the 6900 XT. I think that's an absurd situation, considering you get the superior product.

For me, the combination of a Ryzen 5900X + RTX 3080 Ti is the golden-value middle of the market today, which is why I bought them.
 
Underwhelming, to say the least. $379 MSRP for this? The 3060 Ti came out almost a year ago for $399 and beats it with ease, plus it has the option of DLSS, which is now in ~100 games, plus much better ray tracing (this won't be optional for long: Metro Exodus EE already requires it, for example, and more games will follow).

The 6600 XT suffers a lot from that bus size; 1440p performance is really bad compared to 1080p. The card chokes. A 128-bit bus is simply a mistake: only ultra-low-end cards should have a 128-bit bus. 192-bit is the bare minimum, and 256-bit or more is preferred.

Maybe that's why AMD is trying to push this as a 1080p beast... like 1080p is demanding...
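To put rough numbers on the bus complaint, here's a quick sketch using the publicly listed memory specs (the 6600 XT's 32 MB Infinity Cache claws some of this back, but only on cache hits, which get rarer at 1440p):

```python
# Peak memory bandwidth = bus width (in bytes) x per-pin data rate (Gbps).
def mem_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

for name, bus, rate in [
    ("RX 6600 XT  (128-bit, 16 Gbps GDDR6)", 128, 16),
    ("RX 5700 XT  (256-bit, 14 Gbps GDDR6)", 256, 14),
    ("RTX 3060 Ti (256-bit, 14 Gbps GDDR6)", 256, 14),
]:
    print(f"{name}: {mem_gbs(bus, rate):.0f} GB/s")  # 256 vs 448 vs 448
```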
 
This card's performance makes it essentially an RX 5700 XT. Sure, it's got ray-tracing capability, but that's still just a frill. At this point, I'm still not convinced that ray tracing will become the standard because it's just not enough of an improvement over rasterization to justify the extra hardware and coding requirements to implement it.

It's a weird case because I would assume that Nvidia jumped the gun by several years and AMD was pressured into rushing it out the door by a couple of years too.

But we're working from some flawed assumptions: we assume companies will wait until they can offer a feature to as many users as possible, to make it widespread, speed up adoption and ensure its success. That's a sound model for a new hardware feature that requires new software support, right?

Yet Nvidia changed the game by basically brute-forcing its way into a feature that wasn't ready: the performance penalty was too high to justify on entry-level cards, meaning only the premium high-end ones could realistically support it. Instead of waiting, Nvidia decided to invest money in software devs to get the feature included far earlier than the time investment would otherwise justify, just so they could push more people toward the premium models instead of letting folks settle for midrange products as usual.

The only miscalculation is that DLSS (and FSR) can be turned on *without* ray tracing, which basically boosted midrange cards' pure rasterization performance by a lot.
 
This card's performance makes it essentially an RX 5700 XT. Sure, it's got ray-tracing capability, but that's still just a frill. At this point, I'm still not convinced that ray tracing will become the standard because it's just not enough of an improvement over rasterization to justify the extra hardware and coding requirements to implement it.
Between the RX 5700 XT and this, the former may use more power, but its performance is more consistent simply by virtue of having a traditional, wider memory bus. The cache generally helps, but it is so cut down I don't know if it's even meaningful. At 1440p, the 5700 XT is a better option.

RT is likely here to stay, but for lower-end cards it is a pointless discussion, at least for the next few generations of GPUs. If high-end cards are struggling with RT on, lower-end cards don't even need to think about it. DLSS and FSR are of little use at 1080p, since image quality will surely take a big hit from rendering at a lower internal resolution, and the performance gain is limited since the game will be heavily CPU-bound.
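To illustrate the image-quality concern, here's a small sketch using FSR 1.0's published per-axis scale factors:

```python
# FSR 1.0 upscaling modes and their per-axis render-scale factors
# (from AMD's public documentation).
FSR_MODES = {"Ultra Quality": 1.3, "Quality": 1.5,
             "Balanced": 1.7, "Performance": 2.0}

target_w, target_h = 1920, 1080    # a 1080p output target
for mode, scale in FSR_MODES.items():
    w, h = round(target_w / scale), round(target_h / scale)
    print(f"1080p {mode:>13}: renders internally at {w}x{h}")
# Even "Quality" mode drops the internal render to 1280x720 at 1080p,
# which is why the image-quality hit is so visible at this resolution.
```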
 
Between the RX 5700 XT and this, the former may use more power, but its performance is more consistent simply by virtue of having a traditional, wider memory bus. The cache generally helps, but it is so cut down I don't know if it's even meaningful. At 1440p, the 5700 XT is a better option.
Not really. TechPowerUp has both in its review, and the 6600 XT is about 5% better at 1440p overall. However, both cards are too slow for 1440p if you ask me.

The 6700 XT is 35% faster than the 6600 XT (reference) at 1440p.
The 3070 is 40% faster and has the option of DLSS in a lot of games.

The 3060 Ti is 22% faster than the 6600 XT and was released 10 months ago for $20 more. Actually, the 3060 Ti was cheaper at release than the 6600 XT is now, because MSRP is pointless when store prices are way higher. Back when the 3060 Ti launched, people could actually buy one at MSRP for several months.
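For anyone who wants to sanity-check the value argument, here's a quick cost-per-frame sketch; the FPS numbers are placeholders scaled by the ~22% gap above, not measured review data:

```python
# Cost per frame = price / average FPS; lower is better.
def cost_per_frame(price_usd: float, avg_fps: float) -> float:
    return price_usd / avg_fps

baseline_fps = 80                      # hypothetical 6600 XT 1440p average
for name, price, fps in [
    ("RX 6600 XT  ($379)", 379, baseline_fps),
    ("RTX 3060 Ti ($399)", 399, baseline_fps * 1.22),   # 22% faster
]:
    print(f"{name}: ${cost_per_frame(price, fps):.2f} per frame")
# At MSRP the 3060 Ti works out roughly 14% cheaper per frame; at street
# prices the math shifts with whatever you actually pay.
```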
 
I'm curious as to why your benchmarks go up to the 6800 / 3070 Ti but no further... was this an arbitrary decision? If you're including those, why not add the 3080 / 3090 / 6900 XT as well, just for completeness' sake? I know you already have the numbers, and more information is always good...
 
Well, the 6600XT is sadly just as disappointing as the 3060.

Had AMD offered the 6600 XT at the same price as the 5600 XT, it would actually be a nice upgrade, but at its actual price, nope.

I am missing two things from the otherwise nice review:

- Why no tests with SAM / Resizable BAR? After all, the 3060 officially supports it, just like the 6600 XT officially supports RT, and that was tested.
- I'd really like to see the 3060 / 6600 XT tested with older / lower-end CPUs to see how those combinations perform. It would be interesting to know whether either is a worthwhile upgrade if they become available at appropriate prices ($100 below their official MSRP); a test with a high-end CPU doesn't tell me that.

In summary: a hard no-buy, even at MSRP, same as the 3060.
 
This card's performance makes it essentially an RX 5700 XT. Sure, it's got ray-tracing capability, but that's still just a frill. At this point, I'm still not convinced that ray tracing will become the standard because it's just not enough of an improvement over rasterization to justify the extra hardware and coding requirements to implement it.


To me, RT is supposed to provide a new level of eye candy, but no real functionality or enhancement to gameplay.

I could be wrong, but I think a game like Subnautica could get some real benefit from RT, since underwater illumination in that game seems to be all over the place.

But the other reality is that the hardware to properly support RT isn't here yet, hence the "tricks" like DLSS and FSR are needed.

I just don't know whether, by the time we get hardware that properly supports RT, it will have been worth the silicon.
 
To me, RT is supposed to provide a new level of eye candy, but no real functionality or enhancement to gameplay.

I could be wrong, but I think a game like Subnautica could get some real benefit from RT, since underwater illumination in that game seems to be all over the place.

But the other reality is that the hardware to properly support RT isn't here yet, hence the "tricks" like DLSS and FSR are needed.

I just don't know whether, by the time we get hardware that properly supports RT, it will have been worth the silicon.
The thing is, that extra eye candy doesn't really add anything to the gameplay. I've never really been affected by the lack of reflections in a game because I don't even notice reflections in real life. If puddles stopped reflecting the sky in real life, I probably wouldn't notice for months at least, maybe years.

I think that's the problem. Ray-tracing improves things that we wouldn't even notice in our day-to-day lives, let alone in a video game. To me, the graphics in AC: Odyssey are about as good as they'd ever need to be. Tessellation has made far more of an impact on graphics than I think ray-tracing ever will. Tessellation completely changed the level of detail shown in EVERYTHING. Ray-tracing makes shadows, light sources and reflections more realistic, something that I never cared about before and probably never will in the future. Are you constantly aware of the sun's position in real life? I sure as hell am not unless I need to use it to judge which way is East or West.

Whenever I've wanted to up the performance of a game (for instance, playing Godfall with an R9 Fury), turning shadows to minimum has always been my first course of action, with bloom level being second. I think this is true for most people because it lets you keep as much texture detail as possible. Ray tracing may "improve" shadows and/or reflections, but those were never all that important to begin with.
 
“AMD seems very confident that you’ll be able to purchase a 6600 XT at or very near the MSRP”

AMD has been saying the exact same thing during every single one of the Radeon 6000-series paper launches they have pulled so far, so you can understand why I consider this statement baloney.
 
AMD's official pricing makes me wonder:

AMD could easily have set the same MSRP for the 6600 XT as the 3060 has, and the same for the 6700 XT as the 3060 Ti. Since no cards are available at MSRP right now anyhow, all it would have taken is offering a few own-branded cards at that price, or incentivizing a few AIBs to offer a limited number of cards at MSRP on launch day. I very much suspect that's what Nvidia did.

Could this perhaps be a way to prepare for RDNA3 by trying to shed the "maker of cheap GPUs" image while it doesn't hurt? It seems they can / will only allocate limited wafers to GPU production, dev costs are likely mostly covered by the consoles, and AIBs shouldn't lose money either if they make higher margins.
 
This card is kinda bland and is going to be way overpriced. I wouldn't bother with it.

"the fact is many of you will want to use a ~$400 graphics card at 1440p and frankly that’s not asking much. "

I wanted to touch on this quote from the review. This card is clearly for 1080p gaming; if you buy it and then complain that it does poorly at 1440p, that's kinda on you. When purchasing my GPUs, I look at my target resolution first, then choose a GPU that meets that level, and only then look at price.

I never look at a GPU and say, "Well, this cost me 500-600 bucks, so I should be able to use it at 4K because I spent that much money on it." Performance dictates the price, not the other way around.
 
The thing is, that extra eye candy doesn't really add anything to the gameplay.

That's the thing about eye candy. Higher resolutions, better textures, antialiasing, anisotropic filtering and all such graphical innovations are not strictly necessary for gameplay either. Yet the industry keeps pushing them on. I do find that exciting in its own right.

I think that's the problem. Ray-tracing improves things that we wouldn't even notice in our day-to-day lives, let alone in a video game.

But your brain did. You don't have to be consciously aware of it for it to improve immersion.
 
Thanks for the Vega 56 addition, Steve.
It's still hanging in there: solid 1440p/60fps with a few tweaks in certain games.

I, for one, will not be upgrading anytime soon given this gen's price/performance.
 
"This means when using both the RTX 3060 and 6600 XT in a PCIe 3.0 system while playing Doom under these conditions, the GeForce GPU will be almost 30% faster. Just as shocking is that we found a way to make the 6600 XT slightly slower than the 5600 XT as the older 5000-series part supports x16 bandwidth."
So on top of an already high price for a 128-bit card, it has been artificially crippled to run slower on most of the world's motherboards? Hard pass...
 