Nvidia reveals first wave of games to support DLSS 3.0, new RTX 4080 benchmarks

Well, maybe. I've said repeatedly that there's the whole question of whether any of these cards are worth the asking price. But given the MSRPs we have, that's all I can compare by.
https://www.newegg.com/p/pl?N=100007709 601408872 <--- four of these cards are going for MSRP, with more to be listed there within the next few weeks. As of now, those RTX 4080s are looking like a killer deal if the reviews come anywhere close to what Nvidia is showing us.
 
The 4070, aka the 4080 12GB, is pretty crap; it will rely entirely on DLSS 3 to look good. $900 for that crap. The 7800 XT will smoke it in rasterisation, since it isn't gimped on bus width (320-bit vs. 192-bit), and it will be cheaper.
 
The 4070, aka the 4080 12GB, is pretty crap; it will rely entirely on DLSS 3 to look good. $900 for that crap. The 7800 XT will smoke it in rasterisation, since it isn't gimped on bus width (320-bit vs. 192-bit), and it will be cheaper.
The $900 4080 12GB ran head to head with the 3090 Ti with DLSS turned off in two out of the three game benches in the OP.
 
The $900 4080 12GB ran head to head with the 3090 Ti with DLSS turned off in two out of the three game benches in the OP.

LOL, in rasterisation it was 20% faster than the 3080 in those benchmarks. It needed DLSS 3 to look good. Once the hack to get DLSS 3 working on Ampere is widely available, no one will care about the 4080 12GB unless it drops to $699 at most.
 
LOL, in rasterisation it was 20% faster than the 3080 in those benchmarks. It needed DLSS 3 to look good. Once the hack to get DLSS 3 working on Ampere is widely available, no one will care about the 4080 12GB unless it drops to $699 at most.
The $900 RTX 4080 12GB goes head to head with the $1400 RTX 3090 Ti in two out of three of those benches with DLSS turned off. I'm no math expert, but I'm going to go out on a limb and say $900 < $1400.
 
I still don't get it... I mean, I understand how it works, but I don't get how they can tout massive FPS gains when it's not rendering the game faster; it's adding in extra frames it creates on its own to pad the numbers.

It basically works like this:
Frame 1 is rendered.
Frame 2 is rendered.
DLSS 3.0 generates a made-up frame (call it Frame 1.5) and then pushes out the frames in this order:
Frame 1
Frame 1.5
Frame 2

The GPU isn't rendering the game any faster; it's just using smoke and mirrors to make things "appear" like they're going faster.
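
To make the ordering concrete, here's a minimal sketch in Python. The function names (`render_frame`, `generate_intermediate`) and the toy midpoint "interpolation" in the demo are made up for illustration; the real pipeline does this in hardware with optical flow:

```python
# Hedged sketch of frame-generation pacing: every other displayed frame
# is synthesized from the two rendered frames around it.

def present_with_frame_generation(render_frame, generate_intermediate):
    prev = render_frame()                       # Frame 1: actually rendered
    while True:
        nxt = render_frame()                    # Frame 2: actually rendered
        mid = generate_intermediate(prev, nxt)  # Frame 1.5: synthesized
        yield prev                              # display Frame 1
        yield mid                               # display Frame 1.5
        prev = nxt                              # Frame 2 starts the next pair

# Toy demo: "render" integers and "interpolate" their midpoints.
it = iter(range(10))
frames = present_with_frame_generation(lambda: next(it),
                                       lambda a, b: (a + b) / 2)
print([next(frames) for _ in range(6)])  # [0, 0.5, 1, 1.5, 2, 2.5]
```

Note that Frame 1 can't be displayed until Frame 2 has finished rendering, which is also why frame generation adds latency: the FPS counter doubles, but the game doesn't respond any faster.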

Oh well, I guess it just bugs me more than anything. I'm not really impressed with the 4090 and I don't expect to be impressed by either of the 4080s. I suppose I fall into the group that Nvidia doesn't care about because this release wasn't marketed towards someone like me.

If the "make up your own frame and add it in" feature is good for others, I hope you enjoy it.

You hit the nail on the head; it's very misleading advertising if you ask me, and a complete rip-off.
 
I still don't get it... I mean, I understand how it works, but I don't get how they can tout massive FPS gains when it's not rendering the game faster; it's adding in extra frames it creates on its own to pad the numbers.

It basically works like this:
Frame 1 is rendered.
Frame 2 is rendered.
DLSS 3.0 generates a made-up frame (call it Frame 1.5) and then pushes out the frames in this order:
Frame 1
Frame 1.5
Frame 2

The GPU isn't rendering the game any faster; it's just using smoke and mirrors to make things "appear" like they're going faster.

Oh well, I guess it just bugs me more than anything. I'm not really impressed with the 4090 and I don't expect to be impressed by either of the 4080s. I suppose I fall into the group that Nvidia doesn't care about because this release wasn't marketed towards someone like me.

If the "make up your own frame and add it in" feature is good for others, I hope you enjoy it.

Rich man's motion interpolation.
 
The 4080 12G is performing more like a 3090 Ti and besting it with DLSS turned on, at less than half the MSRP.

[...]
Maybe RDNA 3 will blow them out of the water. We can only hope they do so at a lower price point. But somehow, I doubt RDNA 3 will match the RT performance that Nvidia's cards deliver.

AFAIK from the many rumors and blogs, RDNA 3 will perform great compared to RDNA 2, but I really doubt the chip will be as massive (i.e., as strong a performer, especially at 4K with RT on) as these. I think that in 2022 and beyond, RT is very important, as are DLSS and its siblings, and in those respects RDNA 3 will hardly be as fast as Nvidia.

I only hope that RDNA 3's RT performance is at least higher than the RTX 30 cards', at a lower price point than the RTX 40s. If that turns out right, I'll go RDNA 3.
 
One of the first 4K high-refresh-rate-capable GPUs, but they don't let you hook up the more capable monitors. Sure, DisplayPort 2.0 monitors are basically non-existent at the moment, but damn, if I'm spending £2000 on a GPU I expect to be able to hook up the latest and greatest monitor to it...
Except you can - it's really more down to monitor support at that end of the resolution + refresh rate market.

If it's 4K 144Hz, then either DP 1.4 with DSC or 4:2:2 chroma subsampling, or HDMI 2.1 with no compression, is good enough.

If it's a 4K 240Hz monitor, like the Samsung Neo G8, then it will need DP 1.4 with DSC and custom timings, DP 2.0 with DSC, or HDMI 2.1 with DSC on standard timings. That particular monitor has DP 1.4 and HDMI 2.1 inputs, and offers 240Hz VRR on both (although apparently it doesn't look great at that rate).

Sure, having multiple DP2.0 outputs would solve lots of problems, and I'm a little disappointed that Nvidia hasn't bothered to offer even one, but it's not like there's no way at all to hook up at least one 4K 240Hz monitor.
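
For anyone who wants to sanity-check those options, here's a rough back-of-the-envelope calculation in Python. The link payload figures are the commonly cited effective rates after encoding overhead, and the 1.12 blanking factor is only an approximation of reduced-blanking timings, so treat the margins as ballpark:

```python
# Rough uncompressed-bandwidth check for the display modes discussed above.
# Payload rates: approximate effective throughput after link encoding.

def uncompressed_gbps(width, height, hz, bits_per_channel, blanking=1.12):
    """Approximate uncompressed RGB bandwidth in Gbit/s, incl. blanking."""
    return width * height * hz * bits_per_channel * 3 * blanking / 1e9

LINKS = {
    "DP 1.4 (HBR3)":      25.92,  # 8b/10b encoding
    "HDMI 2.1 (FRL 48G)": 42.67,  # 16b/18b encoding
    "DP 2.0 UHBR13.5":    52.22,  # 128b/132b encoding
    "DP 2.0 UHBR20":      77.37,
}

for label, hz in [("4K 144Hz", 144), ("4K 240Hz", 240)]:
    need = uncompressed_gbps(3840, 2160, hz, 8)  # 8-bit RGB
    print(f"{label}: ~{need:.1f} Gbit/s uncompressed")
    for link, payload in LINKS.items():
        verdict = "fits" if payload >= need else "needs DSC or 4:2:2"
        print(f"  {link:<18} {payload:5.2f} Gbit/s -> {verdict}")
```

With these round numbers, 4K 144Hz fits uncompressed over HDMI 2.1 but not DP 1.4, and 4K 240Hz only fits uncompressed over UHBR20, which lines up with the options above.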
 
AFAIK from the many rumors and blogs, RDNA 3 will perform great compared to RDNA 2, but I really doubt the chip will be as massive (i.e., as strong a performer, especially at 4K with RT on) as these. I think that in 2022 and beyond, RT is very important, as are DLSS and its siblings, and in those respects RDNA 3 will hardly be as fast as Nvidia.

I only hope that RDNA 3's RT performance is at least higher than the RTX 30 cards', at a lower price point than the RTX 40s. If that turns out right, I'll go RDNA 3.
You are probably right. I know AMD has done better at 1080p and 1440p than at 4K, and usually with RT off. Personally, I'll be looking at 1440p at a 120-240 Hz refresh rate. That seems to be a sweet spot right now.

While these results are interesting, I'd like to see more benchmarks at lower resolutions to see how they do.
 
I was comparing MSRP, not street price. Let's see how 4080s fare 2 years from now.

The 4080 12G is performing more like a 3090 Ti and besting it with DLSS turned on, at less than half the MSRP. I also don't know of any 3090 Ti going for $1,000; it's more like $1,200-1,600.

How do you define a "generational" performance upgrade? The 4080 12G is 10-30% faster than the 3080 (as shown here) and much faster with DLSS turned on. And we haven't looked at RT yet. I'd say it's a bit early to write off the 4080 cards.

Maybe RDNA 3 will blow them out of the water. We can only hope they do so at a lower price point. But somehow, I doubt RDNA 3 will match the RT performance that Nvidia's cards deliver.
If we've learned anything over the last 2 years, it's that only street price matters. For most of that time, the MSRP vs. street price comparison was heavily against consumers. Now that it is somewhat in the consumer's favor on RDNA 2 and Ampere, I'm not giving MSRP any more weight than sellers were giving it a year ago.

Street price of Product A vs. street price of Product B are the only prices that matter.
 
If we've learned anything over the last 2 years, it's that only street price matters. For most of that time, the MSRP vs. street price comparison was heavily against consumers. Now that it is somewhat in the consumer's favor on RDNA 2 and Ampere, I'm not giving MSRP any more weight than sellers were giving it a year ago.

Street price of Product A vs. street price of Product B are the only prices that matter.
The street price of the Ti was $2K or more for a long time. It's a two-year-old card and still sells well above the $1,000 price point, $1,200-1,600 for the most part. Even against the low end of that range, a $900 card that delivers the same raster performance and (maybe) better RT performance is still a good deal, comparatively speaking.

If 3090 prices come down, then it's a different comparison, but so far I'm not seeing huge price drops, with the exception of the 3090 Ti coming down from its $2K MSRP. 3080s and 3070s are holding close to MSRP.
 
If it's a 4K 240Hz monitor, like the Samsung Neo G8, then it will need DP 1.4 with DSC and custom timings, DP 2.0 with DSC, or HDMI 2.1 with DSC on standard timings. That particular monitor has DP 1.4 and HDMI 2.1 inputs, and offers 240Hz VRR on both (although apparently it doesn't look great at that rate).
I thought DisplayPort 2.0 could do 4K 240Hz without compression? When I do a quick Google search, that appears to be the case.
 
I thought DisplayPort 2.0 could do 4K 240Hz without compression? When I do a quick Google search, that appears to be the case.
With UHBR20, yes. I should have clarified that I was referring to the lowest rate DP 2.0 offers. It can do it with UHBR13.5 too, but that's right on the limit, so some timings might throw it off.
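
A quick check of how tight that limit is, under the same rough assumptions as the sketch earlier in the thread (8-bit RGB, approximate blanking overhead):

```python
# 4K 240Hz 8-bit RGB vs. UHBR13.5's ~52.2 Gbit/s effective payload.
pixel_rate = 3840 * 2160 * 240               # pixels per second
for blanking in (1.12, 1.05):                # looser vs. tighter timings
    need = pixel_rate * 24 * blanking / 1e9  # 24 bits per RGB pixel
    print(f"blanking x{blanking}: ~{need:.1f} Gbit/s vs 52.2 available")
# ~53.5 vs ~50.2 Gbit/s: it only fits with aggressively reduced blanking,
# which is why specific timings can push it over the edge.
```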
 
With UHBR20, yes. I should have clarified that I was referring to the lowest rate DP 2.0 offers. It can do it with UHBR13.5 too, but that's right on the limit, so some timings might throw it off.
For a £2000 GPU, you gotta admit, you'd expect DP 2.0 UHBR20 support.

I get that monitor manufacturers are very slow to adopt DP 2.0, but it's a chicken-and-egg situation: if GPUs don't support it, then monitor manufacturers will save themselves the R&D costs of implementing it.
 
These cards are priced for early adopters, who will be paying the highest Nvidia tax. As AMD and maybe Intel release new cards, we will see a price adjustment. Most people know this isn't the final price.

As for me, I don't use DLSS for performance comparisons. I look at the cards without DLSS running for a true performance comparison. And keep in mind that DLSS is used in less than 0.1 percent of the games released each year.
 