Well, maybe, I've said repeatedly that there is the whole question of whether any of these cards are worth the asking price. But, given the MSRP we have, that's all I can compare by.

You are one of the few on this thread that gets it. ^^
https://www.newegg.com/p/pl?N=100007709 601408872 <--- four of these cards are going for MSRP, with more cards to be listed there within the next few weeks. As of now, those RTX 4080s are looking like a killer deal if the reviews are anything close to what Nvidia is showing us.

Well, maybe, I've said repeatedly that there is the whole question of whether any of these cards are worth the asking price. But, given the MSRP we have, that's all I can compare by.
The $900 4080 12GB ran head to head with the 3090 Ti with DLSS turned off in two out of the three game benches in the OP.

The 4070, aka 4080 12GB, is pretty crap; it will rely entirely on DLSS 3 to make it look good. $900 for that crap. The 7800 XT will smoke it in rasterisation, since it isn't gimped on bus width (320-bit vs 192-bit), and it will be cheaper.
The $900 4080 12GB ran head to head with the 3090 Ti with DLSS turned off in two out of the three game benches in the OP.
The $900 RTX 4080 12GB goes head to head with the $1400 RTX 3090 Ti in two out of three of those benches with DLSS turned off. I'm no math expert, but I'm going to go out on a limb and say $900 < $1400.

LOL, in rasterisation it was only about 20% faster than the 3080 in those benchmarks. It needs DLSS 3 to look good. Once the hack to get DLSS 3 working on Ampere is widely available, no one will care about the 4080 12GB unless it drops to $699 at most.
I still don't get it... I mean, I understand how it works, but I don't get how they can tout massive fps gains when it's not rendering the game faster; it's adding in extra frames it creates on its own to pad the numbers.
It basically works like this:
Frame 1 is rendered.
Frame 2 is rendered.
DLSS 3.0 generates a made-up frame (call it Frame 1.5) and then pushes out the frames in this order:
Frame 1
Frame 1.5
Frame 2
The GPU isn't rendering the game any faster, it's just using smoke and mirrors to make things "appear" like they're going faster.
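The interleaving described above can be sketched in a few lines of Python. This is a toy stand-in: real DLSS 3 frame generation uses motion vectors and optical-flow hardware to synthesise the intermediate frame, whereas here it's just a pixel average between the two real frames.

```python
def interpolate(a, b):
    # Hypothetical stand-in for the frame DLSS 3 synthesises;
    # here simply the average of the two real frames' pixel values.
    return [(x + y) / 2 for x, y in zip(a, b)]

def present_with_frame_generation(rendered):
    """Interleave one generated frame between each pair of rendered frames."""
    output = []
    for a, b in zip(rendered, rendered[1:]):
        output.append(a)                   # real frame (e.g. Frame 1)
        output.append(interpolate(a, b))   # made-up frame (Frame 1.5)
    output.append(rendered[-1])            # final real frame
    return output

frames = [[0, 0], [10, 10], [20, 20]]      # three "real" rendered frames
shown = present_with_frame_generation(frames)
# Three rendered frames become five displayed frames, but the GPU
# still only rendered three -- that's the fps-counter "gain".
```

This is why the displayed frame rate can nearly double while input latency doesn't improve: the game simulation still only advances on the real frames.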
Oh well, I guess it just bugs me more than anything. I'm not really impressed with the 4090 and I don't expect to be impressed by either of the 4080s. I suppose I fall into the group that Nvidia doesn't care about because this release wasn't marketed towards someone like me.
If the "make up your own frame and add it in" feature is good for others, I hope you enjoy it.
The 4080 12G is performing more like a 3090 Ti and besting it with DLSS turned on, at less than half the MSRP.
[...]
Maybe RDNA 3 will blow them out of the water. We can only hope they do so at a lower price point. But somehow, I doubt RDNA 3 will deliver the RT performance that Nvidia cards can deliver.
Except you can - it's really more down to the monitor support, at that end of the resolution+refresh-rate market.

One of the first 4K high-refresh-rate-capable GPUs, but they don't allow you to hook up more capable monitors. Sure, DisplayPort 2.0 monitors are basically non-existent at the moment, but damn, if I'm spending £2000 on a GPU, I expect to be able to hook up the latest and greatest monitor to it...
You are probably right. I know AMD has done better at 1080p and 1440p than at 4K, and usually with RT off. Personally, I'll be looking at 1440p @ 120-240 Hz refresh rate. That seems to be a sweet spot right now.

AFAIK from many rumors and blogs, RDNA3 will perform great compared to RDNA2, but I really doubt the chip will be as massive (i.e., as good a performer, especially at 4K with RT on) as these ones. I think in 2022 and beyond, RT is as important as DLSS and its cousins, and on those fronts RDNA3 will hardly be as fast as Nvidia.
I only hope that RDNA3's RT performance is at least higher than the RTX 30 cards', at a lower price point than RTX 40. If that holds, I'll go RDNA3.
If we've learned anything over the last 2 years, it's that only street price matters. For most of that time the MSRP vs street price comparison was heavily against consumers. Now that it is somewhat in the consumer's favor on RDNA2 and Ampere, I'm not giving MSRP any more weight than sellers were a year ago.

I was comparing MSRP, not street price. Let's see how 4080s fare 2 years from now.
The 4080 12G is performing more like a 3090 Ti, and besting it with DLSS turned on, at less than half the MSRP. I also don't know of any 3090 Ti going for $1,000; more like $1,200-1,600.
How do you define a "generational" performance upgrade? The 4080 12G is 10-30% faster than the 3080 (as shown here) and much faster with DLSS turned on. And we haven't looked at RT yet. I'd say it's a bit early to write off the 4080 cards.
Maybe RDNA 3 will blow them out of the water. We can only hope they do so at a lower price point. But somehow, I doubt RDNA 3 will deliver the RT performance that Nvidia cards can deliver.
Street price of the Ti was $2K or more for a long time. It's a 2-year-old card and still sells well above the $1,000 price point, $1,200-1,600 for the most part. Even at the low end of that range, a $900 card that delivers the same raster performance and (maybe) better RT performance is still a good deal, comparatively speaking.

If we've learned anything over the last 2 years, it's that only street price matters. For most of that time the MSRP vs street price comparison was heavily against consumers. Now that it is somewhat in the consumer's favor on RDNA2 and Ampere, I'm not giving MSRP any more weight than sellers were a year ago.
Street price of Product A vs. street price of Product B are the only prices that matter.
I thought DisplayPort 2.0 could do 4K 240Hz without compression? When I do a quick Google, that appears to be the case.

If it's a 4K 240Hz monitor, like the Samsung Neo G8, then it will need DP 1.4 with DSC and custom timings, DP 2.0 with DSC, or HDMI 2.1 with DSC on standard timings. That particular monitor has DP 1.4 and HDMI 2.1 inputs and offers 240Hz VRR on both (although apparently it's not great looking at that rate).
With UHBR20, yes. I should have clarified that I was referring to the lowest rate DP 2.0 offers. It can do it with UHBR13.5 too, but it's right on the limit, so some timings might throw it off.

I thought DisplayPort 2.0 could do 4K 240Hz without compression? When I do a quick Google, that appears to be the case.
For a £2000 GPU, you gotta admit, you'd expect DP 2.0 UHBR20 support.

With UHBR20, yes. I should have clarified that I was referring to the lowest rate DP 2.0 offers. It can do it with UHBR13.5 too, but it's right on the limit, so some timings might throw it off.
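The bandwidth numbers behind that exchange can be sanity-checked with back-of-the-envelope math. A rough Python sketch, assuming approximate CVT-RB v2 reduced-blanking timings (about 3920×2222 total pixels for a 3840×2160 mode; the exact figures vary by monitor) and the 128b/132b line coding DP 2.0 uses, ignoring FEC and other small overheads:

```python
# Rough check: does uncompressed 4K 240Hz fit on various DisplayPort link rates?
# Timing totals are approximate reduced-blanking (CVT-RB v2) values.

H_TOTAL = 3840 + 80   # active width + reduced horizontal blanking (approx.)
V_TOTAL = 2222        # approximate vertical total for 2160 active lines
REFRESH = 240         # Hz

def required_gbps(bits_per_pixel):
    """Uncompressed video data rate in Gbit/s for the timings above."""
    return H_TOTAL * V_TOTAL * REFRESH * bits_per_pixel / 1e9

# Effective payload rates over 4 lanes (raw lane rate x coding efficiency).
HBR3     = 4 * 8.10 * (8 / 10)     # DP 1.4, 8b/10b     -> ~25.9 Gbit/s
UHBR13_5 = 4 * 13.5 * (128 / 132)  # DP 2.0, 128b/132b  -> ~52.4 Gbit/s
UHBR20   = 4 * 20.0 * (128 / 132)  # DP 2.0, 128b/132b  -> ~77.6 Gbit/s

rgb_8bit  = required_gbps(24)   # 8 bpc RGB:  ~50 Gbit/s
rgb_10bit = required_gbps(30)   # 10 bpc RGB: ~63 Gbit/s
```

Under these assumptions, 8-bit 4K 240Hz just squeezes into UHBR13.5 (hence "right on the limit": looser timings push it over), 10-bit needs UHBR20, and DP 1.4's ~26 Gbit/s can't do it at all without DSC.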