Nvidia RTX 4060's price drops in Europe following poor reception - US market may follow

the 4060 die is already a low margin product for nVidia
I have to ask, why do you think this? It's an absolutely tiny die.

4060 = 146 mm2
3060 = 276 mm2

Based on the 4060's pricing, what TSMC charges per wafer (and Nvidia is probably paying considerably less than list price), and how many dies you can get out of a single wafer, these 4060s are cheaper to make than the old 3060.
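As a sanity check, here's a back-of-the-envelope dies-per-wafer estimate in Python. The $15,000 wafer price is purely an illustrative ballpark for a 5 nm-class TSMC wafer, not a known contract figure, and yield losses are ignored; note too that the 3060's GA106 was fabbed on Samsung 8 nm, so its wafer would be priced differently -- the comparison is really about die counts.

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic gross-die estimate: wafer area divided by die area,
    minus an edge-loss term for partial dies at the circumference."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

WAFER_PRICE = 15_000  # assumed, illustrative only

for name, area in [("4060 (146 mm2)", 146), ("3060 (276 mm2)", 276)]:
    dies = gross_dies_per_wafer(area)
    print(f"{name}: ~{dies} gross dies, ~${WAFER_PRICE / dies:.0f} per die")
```

On those assumptions you get roughly 430 gross dies per wafer for the 4060 versus about 215 for the 3060 -- twice as many candidates per wafer, which is the core of the margin argument.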
 
A recent video made by Moores Law is Dead where he was talking to a developer
I think the guy just pulled a random number. From what I have read, DLSS actually decreases VRAM usage, albeit by a tiny amount. That doesn't mean 8 GB is enough; I just don't think DLSS has any effect, good or bad, on it.
 
A recent video made by Moores Law is Dead where he was talking to a developer
MLID churns out lots of videos -- any chance you could narrow it down by providing a link to the exact one?

Any developer claiming that DLSS uses 3 GB of VRAM is either making things up, misinterpreting something they've been told (because they're not actually a programmer, but rather, say, a lead designer), or referring to their own game's implementation of it -- and if it's the latter, then they're absolutely not doing it right.

I checked out Cyberpunk 2077 and The Last of Us, both on High quality settings, using PIX to record VRAM usage (not allocation).

CP2077
1080p = 5.8 GB
4K = 7.8 GB
4K + DLSS Perf = 6.0 GB

TLOU
1080p = 6.6 GB
4K = 8.7 GB
4K + DLSS Perf = 6.8 GB

In both cases, DLSS Performance mode sets the engine to use 1080p rendering, so comparing that footprint to the native 1080p one gives a good estimate of how much VRAM the DLSS system itself is using. As you can see, it's around 200 MB for both games.

Now, if we're talking about DLSS 3, aka Frame Generation, then it's a bigger jump, though still nowhere near 3 GB. In CP2077, enabling it sets the DLSS mode to Auto, which should use 1080p rendering when a 4K output is selected (according to the SDK documents). With these settings, the VRAM usage was 7.4 GB -- that's 1.6 GB over native 1080p.
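To make the arithmetic explicit, here's the same comparison as a tiny Python sketch using only the figures measured above; the subtraction assumes native 1080p is a fair baseline for DLSS Performance at 4K output, which, as the follow-up below explains, it only approximately is.

```python
# VRAM usage measured with PIX, in GB (figures from the post above).
measurements = {
    "CP2077": {"1080p": 5.8, "4K + DLSS Perf": 6.0, "4K + DLSS Auto + FG": 7.4},
    "TLOU":   {"1080p": 6.6, "4K + DLSS Perf": 6.8},
}

for game, runs in measurements.items():
    baseline = runs["1080p"]  # DLSS Perf/Auto renders internally at 1080p
    for mode, vram in runs.items():
        if mode != "1080p":
            delta_mb = (vram - baseline) * 1024
            print(f"{game}: {mode} adds ~{delta_mb:.0f} MB over native 1080p")
```

That prints roughly 200 MB for DLSS upscaling in both games, and about 1.6 GB once Frame Generation is added in CP2077.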
 
This guy benches
 
:laughing:
This guy benches
Nowhere near as much as I used to nor as much as I'd like to!

===

With regards to my earlier post, I forgot to mention that with DLSS, the majority of the post-processing effects are done at the output resolution -- exactly which ones depend entirely on where in the rendering chain the developers have inserted the DLSS routine. So the use of Performance mode at 4K isn't quite a like-for-like comparison with native 1080p, as not all of the render targets and buffers associated with things like tone mapping, motion blur, and bloom will be at the lower resolution.

However, since post-processing is relatively trivial compared to the geometry and fragment processing, and its buffers are small relative to everything else in the frame, it's a good enough approximation of the increase in VRAM usage when using DLSS.
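To put a rough number on that, here's what a single full-screen color target costs at each resolution. RGBA16F (8 bytes per pixel) is just an assumed format -- a common choice for HDR post-processing chains, though engines vary -- and mip chains and compression are ignored.

```python
# Size of one full-screen render target, no mips, no compression.
def target_mib(width: int, height: int, bytes_per_pixel: int = 8) -> float:
    return width * height * bytes_per_pixel / 2**20

native_1080p = target_mib(1920, 1080)  # ~15.8 MiB
output_4k = target_mib(3840, 2160)     # ~63.3 MiB
print(f"each 4K post-process target costs ~{output_4k - native_1080p:.0f} MiB "
      f"more than its 1080p equivalent")
```

Each such buffer is roughly 47 MiB larger at 4K, so part of the ~200 MB delta measured earlier is really post-processing running at output resolution rather than DLSS itself -- but the approximation still holds.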
 
The 4060 seems like an OK video card, but it wouldn't be worth upgrading from a 3060 or even a 2060; the performance uplift isn't big enough to justify the purchase.
 