GeForce RTX 3060 Ti vs. Radeon RX 6700 XT: 50 Game Benchmark

Yeah, that's what I've said all along: pretty much the same performance between them. Yet people want to think the 6700 XT is a 3070 competitor because it's priced close to the 3070. It's not. The 3070 is 10-15% faster than both of these cards at 1440p/4K.

The 6700 XT might have more VRAM, but it suffers from the lower bandwidth of its 192-bit bus at higher resolutions, which is why the 3060 Ti beats it at 4K. DLSS is also a lifesaver here, though. FSR too, yeah, but DLSS is far more common, and Nvidia cards support FSR as well.
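For what it's worth, the bandwidth gap falls straight out of the published specs (192-bit @ 16 Gbps vs 256-bit @ 14 Gbps). A quick back-of-the-envelope sketch, ignoring the 6700 XT's 96 MB Infinity Cache, which partly offsets the narrower bus:

```python
# Peak GDDR6 bandwidth = bus width (bits) / 8 bits-per-byte * data rate (Gbps).
def peak_bandwidth(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth(192, 16.0))  # RX 6700 XT:  384.0 GB/s
print(peak_bandwidth(256, 14.0))  # RTX 3060 Ti: 448.0 GB/s
```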

The 6800 non-XT is closer to the 3070 than the 6700 XT is.
And this is why the 6700 XT should have been $399 tops.

12GB might age better, but the 3060 Ti has better RT performance, and more and more games will start to include RT elements like Metro Exodus Enhanced Edition does. Meaning that even on the lowest settings, RT performance will matter.

These are 1440p cards, not 4K/UHD cards, for sure. You won't really need more than 8GB for a GPU in this league. They won't be able to run AAA games maxed out at 4K anyway, meaning VRAM usage won't be high to begin with.

8GB is plenty for 1440p maxed. Most demanding games use like 5-6GB tops at 1440p.

12GB on the 6700 XT is pointless, and not about futureproofing. The alternative was 6GB, since it's a 192-bit bus, so I understand why they went with 12GB just in case.
It's the exact same with the 3060 non-Ti: a GPU this weak (40-45% slower than the 3060 Ti) will never be able to utilize 12GB, which is why the 3060 Ti beats the 3060 with ease even in 4K gaming.
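That 6GB-or-12GB constraint is just arithmetic: GDDR6 chips have a 32-bit interface and, around these cards' launch, came in 1GB or 2GB densities, so the bus width fixes the capacity options (short of doubling up chips in clamshell mode). A quick sketch:

```python
# Capacity options follow from the bus width: one 32-bit GDDR6 chip per channel,
# each chip 1 GB or 2 GB at the time these cards launched.
def capacity_options_gb(bus_width_bits, densities_gb=(1, 2)):
    chips = bus_width_bits // 32
    return [chips * d for d in densities_gb]

print(capacity_options_gb(192))  # [6, 12] -> RX 6700 XT / RTX 3060: 6 GB or 12 GB
print(capacity_options_gb(256))  # [8, 16] -> RTX 3060 Ti / 3070: 8 GB or 16 GB
```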

By the time 12GB is needed for 1440p gaming at maxed-out settings, all these GPUs will be considered too slow for anything other than low-to-medium settings, sooo :p
 
It's understandable that they test with the HD texture pack on: their site, their testing methodology. But anyone who bought a 3060 Ti, games at 4K, and plays Far Cry 6 would surely turn it off rather than be like, welp, this card can't manage this, better bin it.

They'd just .... *checks notes* ... not enable the HD texture pack, and enjoy a virtually identical experience.

The same will apply, as the years press on, to any VRAM-limited situation, which for this card's typical usage scenario will be few and far between. Lower one notch (or disable the silly HD texture pack) and boom, the problem goes away for virtually no visual difference.

The visual difference that's immediately noticeable, generally, is RT effects, where the 3060 Ti will handily outpace the 6700 XT. It also gets DLSS on top of any competing tech (since AMD makes theirs work on anything modern), so DLSS is still one leg up.

At the same price, you'd either have to look at the current games list and conclude that you play mostly AMD-favored games and will keep doing so moving forward, or be one of those people who swear on their life they won't give a dime to Nvidia; lord knows y'all are vocal about it.

And of course, if pricing dictates one is the far and away obvious choice over the other, well then price and price alone has made the choice for you.
 

Yeah, that's what I've said all along: pretty much the same performance between them. Yet people want to think the 6700 XT is a 3070 competitor because it's priced close to the 3070. It's not. The 3070 is 10-15% faster than both of these cards at 1440p/4K.

The thing is, I remember that the reception for the 6700 XT from most reviewers was far more positive than that for the 6600 XT and 6600, yet the problems we're looking at in this comparison, the ones you're pointing out, are not new to me: I've heard most of them in the 6600 XT coverage.

"It's roughly equal to the 3070 in terms of value though" was Steven's exact quote. Watching back that review they were very few 'AMD' titles in which the 6700xt was clearly outperforming or matching a 3070 and most 'Nvidia' titles the 6700xt was showing to be a better match for the 3060 ti yet the final thoughts did not reflect this until a full year later on this comparison video and this is going strictly by MSRP which was his logic and explanation ("I can't guess or comment about prices so I'll just comment on MSRP") so a year ago, based strictly on MSRP the 6700xt was mostly performing a lot closer to the 3060ti while being priced closer to the 3070 MSRP so even back then how was it ever 'roughly equal to the 3070 in terms of value'?

It wasn't, and this was one of the less glowing reviews: many of the other, less rigorous tech channels showed a clear AMD bias for RDNA 2.
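To put rough numbers on that value argument: a sketch using the launch MSRPs and this thread's own performance deltas (the 1.12 for the 3070 is just the midpoint of the 10-15% figure quoted above, an illustrative assumption, not a measured average):

```python
# Launch MSRP per unit of relative performance, normalized to the 3060 Ti.
cards = {
    "RTX 3060 Ti": (399, 1.00),  # (MSRP in $, relative performance)
    "RX 6700 XT":  (479, 1.00),
    "RTX 3070":    (499, 1.12),
}
for name, (msrp, perf) in cards.items():
    print(f"{name}: ${msrp / perf:.0f} per unit of performance")
# RTX 3060 Ti: $399 | RX 6700 XT: $479 | RTX 3070: ~$446
```

Even at MSRP, the 6700 XT comes out as the worst value of the three by that math.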
 
If you're buying a gaming laptop nowadays, you should aim for no less than a 3070.

If you're buying a video card nowadays, I'd aim for no less than a 3080.
 
Regarding the Far Cry 6 comments from the review:

"In our opinion, 12GB VRAM should be the minimum at this price point. Bizarrely, Nvidia's slower RTX 3060 (the non-Ti version) does pack a 12GB VRAM buffer. Go figure."

As for the 3060 having 12GB, that's due to the 192-bit memory bus. I think if the card had launched with 6GB there would have been more of an uproar about low memory. It's kind of like the GT 730 4GB: more RAM than it can use.

The 3060 Ti runs a 256-bit memory bus, and I don't think putting 16GB on that particular card would be good; the card wouldn't have the power to utilize that much RAM, kind of like the GT 730 4GB again. Plus, more RAM would have pushed the cost up into 3070 territory, and they needed to keep costs down to keep the pricing lower.
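The GT 730 comparison can be made concrete with a little napkin math, assuming the common 64-bit DDR3 variant (roughly 14.4 GB/s peak): the GPU simply can't touch that much memory within a frame.

```python
# A GPU can only touch as many bytes per frame as its bandwidth allows.
bandwidth_gb_s = 14.4     # approx. peak for a 64-bit DDR3 GT 730
frame_budget_s = 1 / 60   # 60 fps frame time

gb_per_frame = bandwidth_gb_s * frame_budget_s
print(f"Data touched per frame at best: {gb_per_frame:.2f} GB")  # ~0.24 GB
print(f"Frames just to read 4 GB once:  {4 / gb_per_frame:.0f}")  # ~17
```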



The 3060 Ti is a mid-range card in Nvidia's lineup. It's not a high-end card; if you want high-end, go for the 3080 or higher. The 3070 is a mid-high card, the 3060 is a mid-low card, and the 3060 Ti is the middle card of this generation.

I'm not defending Nvidia, but at the same time there are some issues with everyone's perception of the RAM available on a mid-range card. Let us also not forget that the HD textures for Far Cry 6 are pretty pathetic and really don't look good. As for the extra RAM needed for the Far Cry 6 HD textures, I say that's a failure on Ubisoft's part.

As for the 3060 Ti, 8GB of RAM is more than the 2060's 6GB. Even after the 2060 was relaunched with 12GB on a slightly buffed core, there was almost no performance improvement over the 6GB model.

The RTX 2060 had 6GB and the 2060 Super had 8GB; those were the mid-range cards.
The RTX 3060 Ti is no different: a mid-range card with 8GB of RAM.

Anyone expecting miracles from the 3060 Ti, running 4K with HD texture packs at that resolution: you're a fool.

In the end, I think the 3070 should have been 10GB, the 3080 should have been 12GB, and the 3090 can keep its 24GB. Nvidia made some goofy decisions in how they designed these cards this generation, but people need to realize they're expecting much more from a mid-range card than was intended, and they're quick to bash it because it doesn't have the RAM a high-end card has.
 
It's understandable that they test with the HD texture pack on: their site, their testing methodology. But anyone who bought a 3060 Ti, games at 4K, and plays Far Cry 6 would surely turn it off rather than be like, welp, this card can't manage this, better bin it.

They'd just .... *checks notes* ... not enable the HD texture pack, and enjoy a virtually identical experience.
You could also *checks notes* just play at 1080p medium with a 6600! There you go, no reason to ever look at anything higher again!

That's not how PC gaming works. And as a reminder, most games used in benchmarks today were designed with the Xbox One and PS4 as the baseline; we are still early into the new generation, which has been delayed by production issues.

Buying a new card today for 1440p with only 8GB of VRAM is a dumb idea. Any card bought today is going to have to last 5+ years, and I'll guarantee that won't be enough by then. As you pointed out, FC6 would run perfectly fine if the card had more VRAM; if you have to disable settings to reduce VRAM usage, that is the DEFINITION of a card not having enough.

People will meatshield for Nvidia, but face it, Nvidia did not put enough VRAM on their cards, much like the 700 series, and it will cripple them long term. Every argument used to defend 8GB cards today was used to defend 2GB cards in 2012, and look how well that went for them.
Where's Battlefield 2042?

Surely it's not as badly optimized as y'all claim Cyberpunk was.
CP2077 IS poorly optimized, and that is easy to demonstrate.

BF2042 is a broken glitchy mess and is about as useful to bench as a McDonald's burger is to fine dining.
 
Buying a new card today for 1440p with only 8GB of VRAM is a dumb idea. Any card bought today is going to have to last 5+ years, and I'll guarantee that won't be enough by then. ...

People will meatshield for Nvidia, but face it, Nvidia did not put enough VRAM on their cards, much like the 700 series, and it will cripple them long term.

You need to get past the fact that the 3060 Ti is just a mid-range card. Having 8GB for the position it holds is just fine. The card works just fine for 1440p, but you won't be maxing out every game out there. As for Far Cry 6, Ubisoft put out an awful HD texture pack that needs large amounts of VRAM, and that's an issue on their end, because the HD textures don't look that much better than max settings without them.

I'm not defending Nvidia. I do agree that their VRAM decisions were less than ideal when you hit the higher models - such as the 3070 only having 8GB. It should have been designed for 10GB.
The 3080 should have been designed for 12GB, not 10GB.
The 3090 is where it should be - top of the line, 24GB.

As for a card lasting 5+ years, that's a personal decision. I ran a GTX 980 Ti for 6 years: 5760x1080 gaming for the first 5.5 years, and the last half year of its use at 2560x1440 (I even ran my GTX 570 SLI setup at 5760x1080 for the last 12-18 months I used them; they ran some games great at that resolution and others not so much, and VRAM there was an issue). The card only had 6GB of VRAM, but it played games just fine for my needs. In some games I could set everything high and hit the 60fps my monitors supported, and in others I had to turn settings down to get there.

What people forget is that PC gaming is designed around the player being able to move settings up or down as they see fit, to suit their preferences or their hardware's capabilities. Why is it that so many people assume PC gaming is only about absolute max settings at the highest resolution?

Points to take home:
* The 3060 Ti is a mid-range card, and the amount of VRAM on it is right for it. 16GB would have been too much and priced the card too high.
* Not everyone plays PC games to absolutely max everything out
* Hardware can last people a long time, it's a personal choice
* Far Cry 6 HD textures are bad and Ubisoft should feel bad
 
The results of 10-33 fps for the 3060 Ti in Far Cry 6 at 4K ultra settings + HD textures don't seem to make sense, because they contradict the results of earlier tests with the same GPU and same settings.

In this earlier test of the 3060 Ti in Far Cry 6 at 4K [ultra] with HD textures, the results were 42-50 fps.

What caused this change from 42-50 fps to 10-33 fps?

See the Ultra_4K.png chart at: https://www.techspot.com/article/2339-far-cry-6-benchmarks/
 
12GB on the 6700 XT is pointless, and not about futureproofing. The alternative was 6GB, since it's a 192-bit bus, so I understand why they went with 12GB just in case.
It's the exact same with the 3060 non-Ti: a GPU this weak (40-45% slower than the 3060 Ti) will never be able to utilize 12GB, which is why the 3060 Ti beats the 3060 with ease even in 4K gaming.
Yep. It's similar to how AMD put 8GB on the RX 480, but it couldn't use all that RAM because the GPU core itself was the bottleneck.
And can someone explain the discrepancy in the 3060 Ti Far Cry 6 tests? What caused the change from 42-50 fps in the earlier test to 10-33 fps here, with the same GPU, game, and settings?
 
If you're buying a gaming laptop nowadays, you should aim for no less than a 3070.

If you're buying a video card nowadays, I'd aim for no less than a 3080.

It depends on your budget & goals. I haven't seen a 3080 for less than $1,000 since launch, and I simply don't have the money to justify that. My budget for GPUs used to be in the $200-300 range, a bracket which didn't really exist after COVID hit. I stretched my budget for a 3060 Ti at MSRP and have been blown away by the performance, especially in titles with DLSS. But I was coming from a GTX 780 that was way past its prime. I'd love to get a 3080 Ti, but even if I could find one at the $700 MSRP it's simply beyond my means. If that suits your budget, then more power to you!
 
If these parts were fairly compared, turning on DLSS where possible like people do in the real world, then the 3060 Ti would probably win a roundup.

I don't understand why reviewers turn it off; it's like they want to hide things that might separate these cards. I guess they don't want to upset people who are emotionally attached to AMD more than they want to report the real-world difference a user would actually get between these parts.
 
You could also *checks notes* just play at 1080p medium with a 6600! There you go, no reason to ever look at anything higher again!
Strawman bad.
Having 8GB for the position it holds is just fine.
* The 3060 Ti is a mid-range card, and the amount of VRAM on it is right for it. 16GB would have been too much and priced the card too high.
Exactly. The 3060 Ti has a good amount of VRAM for the GPU power it has available and for the better part of all use cases.
That's not how PC gaming works.
It's exactly how it works; that's why games have comprehensive settings menus.
 
If these parts were fairly compared, turning on DLSS where possible like people do in the real world, then the 3060 Ti would probably win a roundup.
Yup, bang on. I would love to know which lunatic owns an RTX card and plays HZD, Death Stranding, Cyberpunk, or Metro EE without DLSS!? There is absolutely no reason to turn it off; that just gives you fewer fps and lower image quality. I practically always use DLSS; at first I didn't, but since DLSS 2.0 most games actually look better with it on.

But everyone in the community knows that Steve has a well-publicised beef with Nvidia; of course he will do everything he can to hide their real-world advantages. Publishing results with DLSS turned off is just blatant misinformation, and he knows it.
 
Where I live, the RX 6700 XT costs 46% more than the RTX 3060 Ti and is in the ballpark of an RTX 3070 Ti. I've never considered the RX 6700 XT a competitor to the RTX 3070 in the first place, despite whatever AMD claimed. The rightful challenger to the RTX 3070 and 3070 Ti is the RX 6800. And that is where I find Nvidia being the usual pain in the butt for deliberately limiting the RTX 3070 series to 8GB. The RTX 3060 Ti can get away with just 8GB of VRAM because it's meant to sit in the lower mid-range, but certainly not the 3070. For the high-end and enthusiast segment, there is no excuse for anything less than 12GB in the first place, if only because of the 384-bit memory bus.
Also, I agree that the HD texture pack should never be part of a review unless we are looking at image quality or assessing VRAM requirements. That skews the results considerably, because the lack of VRAM will clearly hurt performance. Reviews of hardware and games/apps alike should always stick to out-of-the-box settings.
 
Also, I agree that the HD texture pack should never be part of a review unless we are looking at image quality or assessing VRAM requirements.
Don't you think that if you buy a 3060 Ti you shouldn't necessarily expect to use HD textures? I blame the game more than Nvidia, as it is an anomaly; there literally isn't another game with the same issues, and Far Cry 6 is an AMD-sponsored title, so it wouldn't surprise me if they made sure the texture pack needs tonnes of memory. 8GB is fine for 1440p or lower; all the benchmarks demonstrate this. I have an RTX 2080, which is very similar to a 3060 Ti, and I have no problems running the HD texture pack at 1440p in Far Cry 6.
 
These two cards are very close to each other, so if gaming is all you do then either will do the trick.

BUT, if you do other things as well, like video editing or video transcoding with your GPU, then Nvidia's NVENC encoder is invaluable. It goes through encoding like a buzz saw through balsa wood! I really wish AMD would get off their behinds and straighten out their cards and software in this regard. Nvidia clearly holds the high ground here, and if you are interested in more than games, the 3060 Ti should be your choice.
 
Where I live, the RX 6700 XT costs 46% more than the RTX 3060 Ti and is in the ballpark of an RTX 3070 Ti.
Where I live, Germany, the 3060 Ti has been cheaper than the 6700 XT many times; not sure about right now.

I don't understand why people look at 4K maxed-out gaming (even with added texture packs) for $399/$479 GPUs; these are 1440p solutions with the option of 4K in _some_ games, preferably with DLSS/FSR.
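And 4K with an upscaler is a much smaller ask than native 4K. Here's what the common DLSS 2 / FSR 2 modes actually render internally, using their published per-axis scale factors:

```python
# Internal render resolution behind a 3840x2160 output.
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}
out_w, out_h = 3840, 2160
for mode, scale in modes.items():
    print(f"{mode}: {round(out_w * scale)} x {round(out_h * scale)}")
# Quality: 2560 x 1440 -> "4K DLSS Quality" is roughly a native-1440p workload.
```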
 
If you're buying a gaming laptop nowadays, you should aim for no less than a 3070.

If you're buying a video card nowadays, I'd aim for no less than a 3080.
Haha, why? The 4000 series is coming in less than 6 months; no one with a clue would buy a 3080 now at inflated prices. It's 1½-year-old tech about to be replaced.

95% of gamers use 1440p or lower; they will be fine with less than a 3080.
Most even play esports or older, less demanding titles, which are more CPU-bound than GPU-bound anyway.

Tons of gamers also prefer a GPU that sits around 175-225 watts (3060 Ti/3070/6700 XT) instead of 300-500 watts like the top-end 6800/6900 and 3080/3090 series.

Some simply prefer smaller cards (the high-end cards are most often bricks: heavy and huge. I know my 3080 Ti is huge, lol).

Steam HW Survey:
90% have 8GB VRAM or less
95% run 1440p or less
Most people simply don't need the highest-end GPUs. It's funny how people act like most gamers are all aiming for 4K/60-120fps :joy:

Cards like the 3060 Ti/3070/3070 Ti and 6700 XT/6800 will do very well for 99% of people at 1440p/144Hz.

Upgrading more often (every 2-3 years) is always superior to buying a higher-end card and keeping it longer (4-6 years). Staying on the newest architecture means you get full driver focus plus optimization from game developers too.
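As a purely hypothetical sketch of that cadence argument (every price and resale figure below is a made-up assumption for illustration, not market data):

```python
# Net cost per year = (purchase price - resale value) / years kept.
def net_cost_per_year(price, resale_fraction, years):
    return price * (1 - resale_fraction) / years

# Mid-range card resold at ~50% after 2.5 years vs high-end card resold at ~25% after 5:
print(f"${net_cost_per_year(500, 0.50, 2.5):.0f}/yr, always on a recent architecture")   # $100/yr
print(f"${net_cost_per_year(900, 0.25, 5.0):.0f}/yr, on aging hardware in later years")  # $135/yr
```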
 
"It's roughly equal to the 3070 in terms of value though" was Steven's exact quote. ... Based strictly on MSRP, the 6700 XT was mostly performing a lot closer to the 3060 Ti while being priced closer to the 3070's MSRP. Even back then, how was it ever 'roughly equal to the 3070 in terms of value'?
It's no secret that the 6700 XT has been overpriced since day one; it should have been priced on par with, or slightly above, the 3060 Ti's $399 MSRP.

Especially considering all the features Nvidia/RTX has: superior RT, DLSS, DLDSR, NVENC, CUDA, etc. You could even count FSR, since an Nvidia RTX card supports it all, including XeSS. The chance that you'll be able to use one of these features in a new game is much higher with an Nvidia RTX card. Pretty much all new games have DLSS or FSR. Some even have both, and then I tend to use DLSS because it's superior 99 times out of 100. Just look at all the TechPowerUp FSR vs DLSS comparisons.
 