AMD Radeon RX 6700 XT Review: Better than RTX 3070?

Maybe I'll get lucky and snag one around MSRP. I'd be very content getting a new GPU that's on par with 2080 Ti performance in the $500 range.

Need to find something to replace my 980 Ti. She still runs all my games, but at almost six years old... I'm just waiting for the day I power on my computer and she's dead.
 
No place in Europe has it for MSRP. Even the shops that have any in stock, if you can find one, are ripping people off at least 200% over....
 
The 6700 XT goes on sale 3/18 - there are places I can go and spots I can check for cards around MSRP. It's just a matter of whether I'm lucky enough to get one before they're all sold.
 
"The new Radeon RX 6700 XT arrives with the promise of significantly improved supply when compared to previous Big Navi models which include the RX 6800, 6800 XT and 6900 XT."

This promise is about as empty as Ford Field during the NFL playoffs. Since supply of the RX 6800, 6800 XT and 6900 XT is ZERO, even ten units could be considered "significantly improved supply," so it really tells us nothing at all.
 
I agree that this review is way too negative. If the RTX 3070 gets a 95, how does this get a 70...?

Exactly. Since the 6700 XT has around the same performance as the RTX 3070 and a better price/performance ratio, it essentially means 25 points come from Nvidia's useless features like DLSS, "ray tracing performance" (in Nvidia-sponsored titles, of course) and an encoder almost nobody uses.

As usual, Nvidia cards get way too high numbers. The GTX 1080's 100/100 (here: https://www.techspot.com/review/1174-nvidia-geforce-gtx-1080/ ) is still the stupidest review score I have ever seen (since the GTX 1080 is essentially nothing more than a 28nm design shrunk to 16nm).
 
RX 6700 XT and RTX 3060 are both disappointing. I would say the RTX 3060 Ti is the best pick of the bunch if you can get one. Nothing from AMD currently excites me.
 
"The new Radeon RX 6700 XT arrives with the promise of significantly improved supply when compared to previous Big Navi models which include the RX 6800, 6800 XT and 6900 XT."

This promise is about as empty as Ford Field during the NFL playoffs. Since supply on the RX 6800, 6800 XT and 6900 XT are ZERO, even ten units could be considered to be "significantly improved supply." so it really tells us nothing at all.
It's not zero though... See here;

 
After reading this review, I arrived at the conclusion that the best card to buy right now is the 6900 XT.

It has terminated everything else with extreme prejudice.

Maybe in a year or so when the madness has run its course.
 
Yes "murders" is the word nvidia fanboys use when nvidia card is 1-2 frames faster than AMD.

Also DLSS is dumb. Why would I lower my native resolution? Just get a card that murders the resolution.

It's the word you've chosen; excusing it by mentioning fanboys doesn't strengthen your argument. It's far from murder - this reviewer would use words more like "margin of error". A sub-1% gain in 1% lows, ~2.5% in average fps... not even close to a murder.

lol DLSS is dumb, said nobody who actually uses it.
 
Man, AMD pricing is all over the place and really doesn't make sense much of the time. Their GPUs are highly competitive, and they're faster and cheaper at the high end. Good products do win customers, but value wins fans. So many excellent moves by AMD over the years, but damn, you really should at least win them over before you bend them over.
 
"It's far from murder - this reviewer would use words more like 'margin of error'."

"lol DLSS is dumb, said nobody who actually uses it."
Yeah, DLSS is dumb. I'd rather not lose detail by rendering at a lower resolution than native and upscaling.

You mad because the 3090 can't keep up with the 6900 XT at 1440p and you need to downgrade your resolution to keep up?
 
What about the instances where detail is gained and the output image is overall comparable or better? Yeah, I know the answer: you don't like it. That doesn't make it dumb, it makes you stubborn and biased.

LOL

1. Not mad... and what's there to even be mad about? They're products. Were you mad for the years AMD couldn't even compete at the top?
2. Margin of error... watch out!
 
You probably believe that you can zoom and enhance images too, eh? 😂
 
No, I'll choose my words myself.

My viewpoint is this: dropping render resolution to increase FPS toward a desired target is already a tried and true method (just as supersampling can be when excess performance is on the table). It's obviously not without IQ compromise; it depends on your hardware, the game, and your personal preference for the IQ/performance balance. DLSS takes this established notion and disproportionately retains IQ above what the input resolution, "simply" upscaled to native resolution, can do. So I don't think it's dumb - techniques like this certainly have their place in various use cases.
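To put rough numbers on that trade-off, here's a back-of-envelope sketch in Python. The per-axis mode scales match Nvidia's published DLSS factors; everything else is just illustrative:

```python
# Pixel math for render-resolution scaling at 4K output.
# Per-axis scales are Nvidia's published DLSS mode factors.
native = 3840 * 2160
modes = [("Quality", 2 / 3), ("Balanced", 0.58),
         ("Performance", 0.5), ("Ultra Performance", 1 / 3)]
for mode, axis_scale in modes:
    rendered = native * axis_scale ** 2
    print(f"{mode:17s} renders {rendered / 1e6:4.1f} MP "
          f"({axis_scale ** 2:.0%} of native {native / 1e6:.1f} MP)")
```

Even Quality mode is only rendering ~44% of the pixels, which is where the FPS headroom comes from; the reconstruction then has to make up the rest.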
 
"I use DLSS on my laptop and it helps obviously but is that really all it does? Is it any different than dynamic resolution?"
It's different in the sense that with dynamic resolution scaling, the output resolution is fixed and the input resolution changes dynamically to maintain your FPS target - so if you are already at your FPS target, the internal resolution won't be dropped. With DLSS, the input and output resolutions are fixed, and the FPS varies.

In addition, it isn't just a simple lower-internal-res to higher-output-res upscale: it takes the lower input resolution, plus other engine data like motion vectors, and uses the tensor cores to reconstruct the image at the desired output resolution. That's in part why some parts of the image don't scale linearly: some may more closely resemble the input resolution, and some may be as good as or even better than native output resolution because the algorithm has prioritized/done more with that data.

See my reply above for more thoughts on it.
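If it helps, here's a toy sketch of the control difference I'm describing. Pure illustration - the function name, the 5% step size, and the frame times are made up, not any engine's real API:

```python
# Toy contrast between dynamic resolution scaling (DRS) and a
# DLSS-style fixed input scale. Output resolution is native in both.
NATIVE_W, NATIVE_H = 3840, 2160

def drs_step(frame_ms, target_ms, scale, lo=0.5, hi=1.0):
    """DRS: nudge the *input* scale each frame to chase an FPS target."""
    if frame_ms > target_ms:            # too slow -> render fewer pixels
        return max(lo, scale - 0.05)
    if frame_ms < 0.9 * target_ms:      # comfortably fast -> add pixels back
        return min(hi, scale + 0.05)
    return scale

DLSS_QUALITY_SCALE = 2 / 3              # fixed by the chosen mode; FPS floats

scale = 1.0
for frame_ms in (14.0, 19.5, 22.0, 15.0):   # a spiky run of frame times
    scale = drs_step(frame_ms, target_ms=16.7, scale=scale)
    print(f"{frame_ms:4.1f} ms frame -> DRS input "
          f"{int(NATIVE_W * scale)}x{int(NATIVE_H * scale)}, "
          f"DLSS-style input stays "
          f"{int(NATIVE_W * DLSS_QUALITY_SCALE)}x{int(NATIVE_H * DLSS_QUALITY_SCALE)}")
```

The DRS scale moves with the frame-time spikes while the DLSS-style input never does - and on top of that, DLSS's reconstruction consumes motion vectors and frame history, which a plain upscale doesn't.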
 
This card was due to release today in the U.K. and absolutely nowhere has it even listed.

But to be honest, I think it should be avoided; it's a poor showing from AMD. No innovation, a high price, and still no ray tracing, despite the marketing advertising that it will run all your games at max settings - a lie.

Also no DLSS, and apparently the card hits 100C under load. You'd be an ***** to think this is a good product.
 
"lol DLSS is dumb, said nobody who actually uses it."

Dumb or not, DLSS is overrated, for the simple reason that it causes a very annoying shimmering that nobody talks about. Look at this video;

Look how it looks in motion, with all that shimmering on the building and on the ground... even DLSS Quality has it. And when he freezes the footage, suddenly everything looks a lot better for DLSS.
He goes on to compare the screenshots and show how Quality DLSS looks better than native. But that is deceptive marketing, because it didn't look better in motion with all that additional shimmering.

And what kills me is that he never mentions that shimmering, ever, and only talks about the positives the whole video. DLSS has a lot of keyboard warriors, and obviously a lot of influencers are pushing the tech on behalf of Nvidia. But in actuality, it's not nearly as good as it's touted to be. In multiple games it also has a ghosting problem, but that's generally less noticeable than the shimmering.

DLSS looks great in screenshots, but in motion it's another story, which matters a lot in games.
 
Another GPU to try to get for those that do not currently have one. But in a normal GPU market, I agree, this one just doesn't make sense at this price. Both the 3060 Ti and 3070 seem to make better sense for 1440p gaming, especially given the DLSS and RT capabilities of those cards. The additional VRAM is really the only selling point here, but will it actually matter? Especially since the VRAM sits on a narrower bus.

I really liked the RX 5700 XT; it was a great card for the price, especially when the price was dropped to $399 before launch. But that card made sense: it was better than the RTX 2060 S @ $399 and often rivaled the RTX 2070 S @ $499, and while it had no RT or DLSS, those were much less important at the time.
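For what it's worth, the "narrower bus" point in rough spec-sheet numbers (GDDR6 bandwidth = bus width / 8 × per-pin data rate; note AMD also leans on the 6700 XT's 96 MB Infinity Cache to offset this):

```python
# Spec-sheet memory bandwidth: bus_bits/8 gives bytes per transfer
# across the bus; multiplied by the per-pin rate in Gbps -> GB/s.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

for name, bus, rate in [
    ("RX 6700 XT (12 GB GDDR6, 192-bit, 16 Gbps)", 192, 16.0),
    ("RTX 3070    (8 GB GDDR6, 256-bit, 14 Gbps)", 256, 14.0),
]:
    print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s")
```

So roughly 384 GB/s vs 448 GB/s of raw bandwidth - more VRAM on the AMD card, but less raw throughput feeding it.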
 
If you want an RTX 3070 badly, the only place to get a pre-built at a fair price is PowerSpec (Micro Center's house brand).
 
"Some may more closely resemble the input resolution, and some may be as good as or even better than native output resolution because the algorithm has prioritized/done more with that data."
Even with the best machine learning algorithms, you can't upscale a lower-res image to have more information than the native-res image.

Why? Information theory. The information gained (I: quality of the upscale) is limited by the quality of the input: I(X;Y) ≤ min[H(X), H(Y)]. It's always less than or equal to the information in the input the upscaler was given. Right now, even the best upscaler using tech like tensor cores (oooh, sounds fancy) is nowhere close to that upper limit (where the upscale equals the original).

Simply put, you cannot create more information than what's in the native image.
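The formal version of that bound, for anyone curious, is the data processing inequality: if a per-frame upscaler sees only the low-res render, its output can't carry more information about the original scene than the low-res render does. A sketch in standard notation, with X the full-detail scene, Y the low-res render, and Z = f(Y) the upscaled output:

```latex
% Markov chain: the upscaler's output depends on the scene only through Y
X \to Y \to Z = f(Y)
% Data processing inequality, plus the entropy bound quoted above:
\implies \; I(X; Z) \,\le\, I(X; Y) \,\le\, \min\{H(X),\, H(Y)\}
```

(One caveat: DLSS isn't strictly per-frame - motion vectors and frame history widen what Y contains, as noted earlier in the thread - but the bound still applies to whatever inputs it actually sees.)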
 