GeForce RTX 4070 Ti vs. GeForce RTX 3080: 50 Game Benchmark

Hmmm, a new 4070 Ti @ $850 or a used 3080 @ $450? Considering the 4070 Ti is on average 19% faster, this is a very easy decision for most gamers.
The new 4070 Ti is $799, and second-hand hardware will always be better value, obviously; be sure to pick an ex 24/7 mining card...

In several newer games, the 4070 Ti is more than 25% faster than the 3080 while using 100 watts less, with the option of DLSS 3 as well, plus 2GB more VRAM.

I'd never pay more than $400 for a used 3080, that is for sure; $200-300 should be the price if it's a former mining card.
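To make the trade-off concrete, here's a minimal value-math sketch using only the numbers quoted in this thread ($799 new vs. $450 used, ~19% average uplift, the ~100 W in-game gap claimed above); treat all of them as assumptions, since real prices and power draw vary by model and game:

```python
# Cost-per-frame and perf-per-watt from the thread's own numbers.
# Relative fps: 3080 = 100, 4070 Ti = 119 (~19% faster on average).
# Watts: ~320 W vs ~220 W in-game draw, per the "100 watts less" claim.

cards = {
    "RTX 3080 (used, $450)":   {"price": 450, "fps": 100, "watts": 320},
    "RTX 4070 Ti (new, $799)": {"price": 799, "fps": 119, "watts": 220},
}

for name, c in cards.items():
    dollars_per_point = c["price"] / c["fps"]   # $ per relative-fps point
    points_per_watt = c["fps"] / c["watts"]     # efficiency
    print(f"{name}: ${dollars_per_point:.2f}/fps-point, "
          f"{points_per_watt:.3f} fps-points/W")

# Used 3080: ~$4.50 per fps-point but ~0.31 points/W.
# 4070 Ti:   ~$6.71 per fps-point but ~0.54 points/W.
```

On cost per frame alone the used card wins; the premium on the 4070 Ti buys efficiency, a warranty, and DLSS 3.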
 
It turns out that I was wrong about something. I had agreed with you that it would probably be years before 8GB starts causing issues at 1080p. Well, that's unfortunately not the case:
Worst optimized game for Nvidia GPUs, and the devs even officially confirmed the issues with Nvidia cards, so who cares. Cherrypick less next time.

Overall a 3070 8GB beats the 6700XT 12GB at 1440p. Hell, it even beats it at 4K.


Yeah 8GB will last for years for 1440p gaming.
Also, DLSS is now confirmed to be superior to FSR. FSR won in ZERO games.

 
Cherrypicking? What are you talking about? I pointed at the games that were having problems because that's where the new information is. I wasn't cherrypicking, I was being 100% honest with you. You just seem bound and determined to believe that 8GB isn't a problem when everyone else says that it is.

Ok, I'm going to be as impartial as possible here and do a YouTube search for "8GB VRAM" because that's as neutral as it gets. I'll even link you to the page on which the search occurred:

I omitted videos that were too short or too old to be relevant, but you can always go to the link I posted and see everything that I see now. I'm hiding nothing because I'm not interested in "being right"; I say things because I truly believe them, not because my ego is involved.

So, here are videos from the tech mainstream that talk about the problems with 8GB cards (there are a lot):
https://www.youtube.com/watch?v=vP8ofaW9WT8
https://www.youtube.com/watch?v=KTNvdZd3FeM

Yeah 8GB will last for years for 1440p gaming.
Now I think that you're missing the point on purpose. My point was always "a high-end card like the RTX 3070 Ti should have more than 8GB of VRAM." Sure, you can play dumb and talk about lowering settings, but you shouldn't have to do that on a card that costs the same as an RX 6950 XT. The RTX 3070 Ti even loses to the RX 6950 XT in ray-tracing performance (when it doesn't have VRAM problems). You're just completely wrong here and now you're just trying to gaslight me. Here's just how "great" a purchase the RTX 3070 Ti is:
Also, DLSS is now confirmed to be superior to FSR. FSR won in ZERO games
So, the whole discussion is about 8GB of VRAM not being enough for a high-end card and you want to gaslight me with talk about DLSS? What does that have to do with 8GB of VRAM not being enough for a card at the RTX 3070 Ti's level?

You know what, though, this isn't about my ego, it's about information exchange, so I'll humour you about DLSS. Yes, it is superior, and I have always accepted that DLSS is better than FSR, but I also never cared about that because I don't use upscaling.

I bought an RX 6800 XT because I don't want to use upscaling. I have tried out FSR in the past (nothing major, just some 4K Ultra Quality for a few minutes before it got boring) and it looked fine. I realise that it will look worse at lower resolutions and that DLSS is better, but from everything I've seen of the two, it just looks like the difference between "good" and "good enough" to me. Sure, DLSS is better and has always been better (I have never claimed otherwise, so I don't understand why you're even talking about it), but I have a 16GB enthusiast-class video card, and by the time I need to use ANY kind of upscaling with FSR, it will be a LOT better than it is now; that's just how things work.

Remember that FSR v1.0 came out less than two years ago and the strides that have occurred since then are incredible. I expect that I won't even consider using upscaling for 2-3 more years (longer if I'm not always looking for the latest and greatest AAA games). Just imagine how great FSR will be by then based on the first two years of its existence. For someone with a high-end, enthusiast-grade or halo-level card, something like DLSS is not supposed to be something you think about. Talking about DLSS and FSR to the owner of an RX 6800 XT is downright funny because neither of them is even remotely relevant to me for the foreseeable future. What you don't seem to get is that, as the owner of an RTX 3070 Ti, DLSS and FSR shouldn't be relevant to you either, not for at least another 1½ years.

I don't know what you're trying so savagely to defend. It's YOU that got screwed, not me.
 
Again, look at the TechPowerUp or TechSpot average + minimum fps numbers in a recent GPU review and you will see that the 3060 Ti and 3070 perform perfectly fine in 1440p gaming, outside of officially bugged games like TLOU. The devs officially confirmed issues with Nvidia cards and it's the worst console port ever released; pointless to even mention it before the issues are fixed.

The 3070 still beats the 6700XT overall in both average and minimum fps at 1440p on high settings by ~10%, and the 3060 Ti performs on par, just like they did 2 years ago. None of these cards are going to max out NEW and DEMANDING games in 1440p anyway, especially not with RT. The GPU is too weak. VRAM is not going to save a weak GPU. 24GB on the 4090 and 7900XTX is stupid overkill, and they only have 24GB because of the 384-bit bus. The alternative was 12GB, which is on the low side for 4K gaming, yet still fine in 99.9% of games at 4K on high settings. 16GB is more than enough for 4K gaming today and I bet it will be fine in 2 years as well; however, the only truly viable 4K card is the 4090. It beats both the 4080 and 7900XTX by 25% at 4K, which is a lot when these cards are already struggling in some games.
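On the bus-width point, the capacity options really are locked to the bus: each GDDR6/6X chip exposes a 32-bit interface, so the bus width sets the chip count, and the per-chip density (1 GB or 2 GB for current parts) sets the total. A minimal sketch of that arithmetic, ignoring clamshell boards that mount two chips per channel:

```python
# Bus width -> chip count -> VRAM options (GDDR6/6X, 32 bits per chip).

def vram_options(bus_width_bits: int, densities_gb=(1, 2)) -> dict:
    chips = bus_width_bits // 32          # one chip per 32-bit channel
    return {f"{d} GB chips": f"{chips * d} GB" for d in densities_gb}

for label, bus in {"384-bit (4090 / 7900XTX class)": 384,
                   "320-bit (RTX 3080)": 320,
                   "192-bit (RTX 4070 Ti)": 192}.items():
    print(label, vram_options(bus))

# 384-bit -> 12 chips -> 12 GB or 24 GB, exactly the choice described
# above; 320-bit -> 10 GB or 20 GB; 192-bit -> 6 GB or 12 GB.
```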

You are simply overestimating how much VRAM is needed and referring to new and demanding games being COMPLETELY MAXED out, which even GPUs like the 3080/3090 and 6800XT/6900XT struggle with too. Even the 3090 and 6900XT are not high-end today. The 6800 NON-XT is WAY SLOWER than high-end cards today. Can we stop acting like the 6800 is high-end now? The 3080 beats it easily, with 6GB less VRAM.

Games like Cyberpunk with fully maxed ray tracing run at like 20-30 fps on a 4090, and 3-6 fps on a 7900XTX (lol). Who cares about the VRAM usage here? It's POINTLESS because it's unplayable regardless. GPU IS TOO WEAK.

If you simply want to play 1440p on high settings (not ultra, because ultra mostly adds garbage like motion blur, DOF and other effects THAT REQUIRE EVEN MORE VRAM but make the image look like crap) _and_ no ray tracing, 8GB is simply fine and this won't change anytime soon. Most gamers that have a clue will tweak and optimize games instead of running the ultra preset. This will lower VRAM usage.
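To put a rough number on why the texture-quality setting in particular moves VRAM usage so much, here's a minimal back-of-the-envelope sketch; the ~1 byte per texel figure assumes BC7-style block compression, which is typical but game-dependent:

```python
# Approximate VRAM cost of one mip-mapped texture. A full mip chain
# adds about 1/3 on top of the base level; BC7 is ~1 byte per texel.

def texture_mb(size_px: int, bytes_per_texel: float = 1.0) -> float:
    base_bytes = size_px * size_px * bytes_per_texel
    return base_bytes * 4 / 3 / 2**20     # mip chain, bytes -> MiB

for size in (4096, 2048, 1024):
    print(f"{size}x{size}: {texture_mb(size):.1f} MB")

# 4096x4096 -> ~21.3 MB, 2048 -> ~5.3 MB, 1024 -> ~1.3 MB. Each step
# down the quality ladder cuts the cost ~4x, which is why "high"
# textures instead of "ultra" keeps 8GB cards inside their budget.
```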

Most people that bought a cheap 3060 Ti or 3070 on release will simply upgrade when the 5000 series hits, and that means ~5 years of lifetime, which is about right for a GPU for most people. By 2025 the 6700XT will be considered low-end as well, regardless of it having 4GB more VRAM. It won't change a thing, because the GPU is too weak. It's better to upgrade MORE OFTEN than to buy a more expensive GPU with a lot of VRAM, because it won't age well anyway.

Same with the AMD 6800 and 6900 series. They might have 16GB, but by the time 16GB of VRAM is ACTUALLY REQUIRED, the GPUs will be dirt slow. It's POINTLESS to try to futureproof by adding more VRAM when it's the ACTUAL GPU that is going to be the bottleneck.

Nothing about the Radeon 6800 series is viable for 4K gaming today. Not even the 6950XT does well here. They're 1440p solutions.

Fun fact: the 3080 beats the 6800XT by 5% in 4K gaming when looking at minimum fps ;)
And the 3070 Ti 8GB performs on par with your 6800 16GB ;)

However, none of these cards are good for 4K in demanding games without DLSS/FSR, and DLSS is superior to FSR, plus it lowers the VRAM requirement on top (obviously). Even the 3090 Ti and 6950XT struggle in demanding 4K gaming these days and require DLSS/FSR to get a decent framerate...

If you play at 4K you need to upgrade your GPU pretty much every generation, and right now the 4090 is the only true 4K card, beating the 4080 and 7900XTX by ~25% at this resolution. It absolutely destroys last-gen flagships in 4K.

Yet you think your 6800 series GPU doesn't need to utilize FSR. Are you playing old and dated games, or running 1080p? Or settling for 40-80 fps? The GPU is too weak, and 16GB of VRAM changes nothing about that fact.

Do you SERIOUSLY think the 6800XT is a high-end GPU in 2023? You sound like a 4090 owner or something. The 6800XT is not a fast GPU these days... Sorry to burst your bubble.

And zero 3070 owners are forced to use DLSS at 1440p, but THEY CAN and increase framerate by 75-100% IF THEY WANT, or use DLAA for superior AA, or use DLDSR for superior downsampling in older games (4K downsampled to 1440p using DLDSR looks amazing). Nvidia features are simply better. RTX has a massive feature advantage compared to RDNA2/3, hence the price.
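For reference, the internal render resolutions behind those claims work out as follows; a minimal sketch using the commonly published per-axis scale factors (the DLSS Balanced figure is approximate):

```python
# Per-axis scale factors: DLSS renders BELOW output resolution and
# upscales; DLDSR 2.25x renders ABOVE it (1.5x per axis) and downsamples.

MODES = {"DLSS Quality": 2 / 3, "DLSS Balanced": 0.58,
         "DLSS Performance": 0.5, "DLDSR 2.25x": 1.5}

def internal_res(w, h, axis_scale):
    return round(w * axis_scale), round(h * axis_scale)

for mode, scale in MODES.items():
    print(mode, internal_res(2560, 1440, scale))

# At 1440p output: Quality ~1707x960, Performance 1280x720 (a quarter
# of the pixels, hence the 75-100% fps gains mentioned above), and
# DLDSR 2.25x renders 3840x2160, the "4K downsampled to 1440p" case.
```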

The 4080 sold much better than the 7900XTX even though it's 200 dollars more. Why? Because of features, average performance numbers and RT performance. AMD is in no position to ask a premium like Nvidia currently does. AMD needs to focus only on performance per dollar and stop trying to impress with RT performance, because it's terrible compared to Nvidia's (and I don't even find RT that amazing; I don't care about RT at all, actually).

If AMD went all-in on raster performance and improved their features (matching DLSS, DLAA, DLDSR, NVENC, ShadowPlay and more), I would be buying AMD instead of Nvidia. I could not care less about ray tracing; however, RTX is way more than just ray tracing.
 