8GB VRAM vs. 16GB VRAM: RTX 3070 vs. Radeon 6800

You didn't know that about me because it isn't true. I'm not a fanboy, because I'm not a "fan" of AMD; I just despise Intel and nVidia, so I buy AMD by default because there's nothing else out there, not because I "love" AMD.

I may have taken the piss out of Steve, but for years I also sang his praises. I noticed that the change in tone of his reviews coincided with nVidia trying to blacklist Hardware Unboxed. Despite the backlash from the tech press towards nVidia, it would appear that they still successfully got their message across.

It's yet another reason why I hate nVidia.
This is the kind of thing I've come to expect from you.
> I noticed that the change in tone of his reviews coincided with nVidia trying to blacklist Hardware Unboxed.
Do you mean changes like being the first and only media outlet globally to highlight Nvidia's driver overhead issue in a two-part investigation?

Again we're the first media outlet to really dig into the VRAM issue, and you're crying 'going soft on Nvidia'.

We laughed in Nvidia's face when they told us to change our editorial direction or we wouldn't get sampled. We told them we don't need them (which is true). Yet here you are, spouting fanboy-like nonsense as usual, and I bet you have sung my praises when you've agreed with me; that's how it works with 'fans'.
 
> Again we're the first media outlet to really dig into the VRAM issue, and you're crying 'going soft on Nvidia'.
Yes, something changed in the rhetoric over the last two years. Keep it up.
> We laughed in Nvidia's face when they told us to change our editorial direction or we wouldn't get sampled. We told them we don't need them (which is true).
I just watched the video again where you announced Nvidia going after you, and those big wet eyes tell a different story. Your body language didn't lie about your emotions back then, either.

But now you're starting to bring back the old Steven we know.

Welcome back!
 
You mean the video where we were obviously disappointed with Nvidia and told them publicly, in no uncertain terms, that we don't need them, and that if they want to act that way we're happy to ignore them? Nothing has changed in the past 2 years; we're going about the review process the same way we always have. Those who claim we're holding a grudge and going after them harder, or that we're scared and have backed off, have no idea what they're talking about.
 
The burning question I have is: why don't you test, e.g., Hogwarts with DLSS + High textures vs FSR + Ultra textures when comparing these cards? After all, we are after image quality, right? Because from my testing, DLSS Quality + High textures looks better than native + Ultra textures, so where exactly is the issue? You get better image quality with an 8 GB Nvidia card than with a 16 GB AMD card.
 
You can't enable RT in Hogwarts on an 8 GB card with High textures, even with DLSS.
 
You can't? I finished it on a 3060 Ti at 3440x1440 with DLSS Quality, everything Ultra except textures (High) and shadows (Low). That was basically day one, before any of the big patches.
 
Reflections and AO at Ultra, shadows off. The game was a constant 65+ fps except in cutscenes, where there were drops into the 50s. Pretty enjoyable experience overall.
What Steve was referring to was 'standard' settings -- i.e. using the presets. With enough experimentation with the settings, Hogwarts Legacy can run on 8 GB cards, but it depends very much on what the rest of the system is like.

I ran through Hogsmeade multiple times at 3840 x 1440 with DLSS Quality, all settings at Ultra -- bar Textures at High, RT Reflections and RT Ambient Occlusion at Ultra, and RT Shadows off.

The system used has a Core i7-9700K, RTX 4070 Ti (12 GB), 16 GB DDR4-3000. The average frame rate over the runs was 38 fps, with a 1% Low of 22 fps. Unfortunately, for that resolution, the game has to be windowed on my monitors, so I ran it all again at native 4K, but with different DLSS settings.

> Quality (67% render scale) = 40 fps avg, 21 fps 1% low
> Balanced (58% render scale) = 40 fps avg, 24 fps 1% low
> Performance (50% render scale) = 40 fps avg, 25 fps 1% low

Clearly, Hogsmeade is very CPU-bound, so I then did some broomstick flying, at max altitude above the South Hogwarts lake, to remove as much of the CPU load as possible (and it really limits how much RT impacts the performance). The results:

> Windowed 1920 x 1080 DLSS Quality = 77 fps avg, 54 fps 1% low
> Windowed 3840 x 1440 DLSS Quality = 74 fps avg, 48 fps 1% low
> Native 4K DLSS Quality = 69 fps avg, 31 fps 1% low

These data clearly show that the CPU is a limiting factor in much of the game.
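
For anyone who wants to sanity-check the numbers, here's a rough Python sketch. The per-axis scale factors and the 5% threshold are just assumptions based on the figures quoted above, not anything official from Nvidia: it works out the internal render resolution for each DLSS mode at a 4K output, and flags a run as CPU-bound when roughly halving the rendered pixel count barely moves the average frame rate.

```python
# Rough sketch: DLSS internal render resolutions and a simple CPU-bound check,
# using the scale factors and average frame rates quoted above (assumed values).

OUTPUT_RES = (3840, 2160)  # native 4K output

# Per-axis render scale for each DLSS mode (taken from the percentages above)
DLSS_SCALE = {"Quality": 0.67, "Balanced": 0.58, "Performance": 0.50}

# Average fps measured at the 4K output with each mode (from the runs above)
AVG_FPS = {"Quality": 40, "Balanced": 40, "Performance": 40}

def render_resolution(output, scale):
    """Resolution the GPU actually renders before the upscale to the output."""
    w, h = output
    return round(w * scale), round(h * scale)

for mode, scale in DLSS_SCALE.items():
    w, h = render_resolution(OUTPUT_RES, scale)
    print(f"{mode:12s} -> {w} x {h} ({w * h / 1e6:.1f} MP), {AVG_FPS[mode]} fps avg")

# If roughly halving the rendered pixel count (Quality -> Performance) leaves
# the average frame rate essentially unchanged, the GPU isn't the limiter.
fps_values = list(AVG_FPS.values())
cpu_bound = (max(fps_values) - min(fps_values)) / max(fps_values) < 0.05
print("Likely CPU-bound:", cpu_bound)
```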
 
Thanks for trying. I had a 3700X with the 3060 Ti, and yeah, Hogsmeade is entirely CPU-bound. Even my 12900K severely bottlenecks my 4090 in the village.
 
It appears to me that many of you should buy a console and move on from PC gaming. The days of products getting twice as fast while dropping in price are clearly gone.

Anyone here own a Harley (I doubt it)? PC gaming is still one of the cheapest hobbies around. Cheaper than golf, cheaper than cooking ($1,000 for knives and cookware), certainly cheaper than snowmobiles or boats.

If you can't afford it - just GET OUT. No one is holding you hostage.
 