Is AMD finally targeting Nvidia? Radeon RX 7800 XT & RX 7700 XT launch details, FSR 3,...


Scorpus

Staff member
What just happened? AMD officially announced the Radeon RX 7800 XT and Radeon RX 7700 XT graphics cards. We have all the details, including specifications, launch date, pricing, and AMD's performance claims – plus additional information on FSR 3, Anti-Lag+, Hypr-RX, and more. So let's get started.

Pricing, Specs and Availability

Let's begin with the two new graphics cards, which will be available on September 6th. The RX 7800 XT will be priced at $500 US, while the RX 7700 XT is slightly less at $450 US. Both still lie in the middle of the current market in direct competition with Nvidia's GeForce RTX 4070 and RTX 4060 Ti.

These new cards are based on AMD's RDNA3 Navi 32 GPU die, which is chiplet based. It has one GCD in the middle, a smaller GCD than was used with Navi 31, but is still built on TSMC's N5 node. Flanking this are four MCDs using a TSMC N6 process. So, from a GPU design perspective, it's more similar to the RX 7900 XTX and 7900 XT than the RX 7600, which used a monolithic Navi 33 die.

The 7800 XT uses a fully unlocked Navi 32 die with 60 compute units and all four MCDs, offering a 256-bit memory bus and 64 MB of infinity cache paired with 16GB of GDDR6 memory at 19.5 Gbps. Meanwhile, the RX 7700 XT is cut down to 54 compute units, and it has three of the four MCDs, so a 192-bit memory bus and 48 MB of infinity cache. This allows for a 12 GB GDDR6 memory configuration, although it's also clocked slightly lower than the 7800 XT, at just 18 Gbps. Both cards have a similar game clock, 2171 MHz for the 7700 and 2124 MHz for the 7800, although the rated boost clock for the 7700 is somewhat higher. Typically, the game clock is a more realistic reflection of in-game clock speeds.

In addition, both models support new RDNA3 features like AV1 video encoding and DisplayPort 2.1. Board power ratings are mid-tier, as expected, with the 7800 XT listed at 263W while the 7700 XT runs at 245W. Given that AMD is pitting these models against the 200W RTX 4070 and 160W RTX 4060 Ti, this suggests inferior efficiency for Navi 32 compared to Nvidia's AD104 and AD106 silicon.

RX 7700 XT Performance Claims

Based on specifications and pricing alone, there are some concerns about the price of the RX 7700 XT. The 7700 XT is 10 percent cheaper than the 7800 XT, for a 10-percent reduction in compute units. That sounds reasonable. However, the memory system is significantly reduced. The 7700 XT, with its 192-bit bus and slower memory, ends up with 31 percent less memory bandwidth than the 7800 XT.
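For context, here's how that bandwidth figure falls out of the quoted specs. This is a quick back-of-the-envelope sketch (peak GDDR6 bandwidth is simply the bus width in bytes multiplied by the effective data rate), not anything AMD published directly:

def gddr6_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s for a GDDR6 memory subsystem."""
    return bus_width_bits / 8 * data_rate_gbps

rx7800xt = gddr6_bandwidth_gb_s(256, 19.5)  # 624.0 GB/s
rx7700xt = gddr6_bandwidth_gb_s(192, 18.0)  # 432.0 GB/s
deficit = 1 - rx7700xt / rx7800xt           # ~0.31, i.e. roughly 31% less bandwidth
print(f"7800 XT: {rx7800xt:.0f} GB/s | 7700 XT: {rx7700xt:.0f} GB/s | deficit: {deficit:.0%}")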

Realistically, the performance gap between these two models will be more than 10 percent. That's certainly what AMD is expecting based on this slide, where the 7800 XT is, on average, 22 percent faster than the 7700 XT, albeit across just four games.

If this is an accurate reflection of performance, the 7800 XT will be around 20 percent faster for just a 10-percent price premium, giving the 7800 XT a better cost per frame. This looks to be a very similar situation to the 7900 XT versus the 7900 XTX, where the XTX was only around 10 percent more expensive for 20 percent more performance at launch. We'll have to wait and see where the final benchmarks lie in our testing, but our initial assessment points toward the 7700 XT being dead on arrival, at least in AMD's line-up.
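To put the cost-per-frame argument in concrete terms, here's a rough sketch that takes the $450 and $500 MSRPs and AMD's claimed 22 percent gap at face value (the fps baseline is arbitrary and purely illustrative):

price_7700xt, price_7800xt = 450, 500
perf_7700xt = 100.0                  # arbitrary baseline fps
perf_7800xt = perf_7700xt * 1.22     # AMD's claimed average lead across four games

cpf_7700xt = price_7700xt / perf_7700xt   # $4.50 per frame
cpf_7800xt = price_7800xt / perf_7800xt   # ~$4.10 per frame, about 9% cheaper per frame
print(f"7700 XT: ${cpf_7700xt:.2f}/fps vs 7800 XT: ${cpf_7800xt:.2f}/fps")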

AMD's central performance claims for the 7700 XT compare it to Nvidia's RTX 4060 Ti 16GB. On average, across this 20-game sample (below), AMD has the 7700 XT 12 percent ahead of the 4060 Ti. However, this should be taken with a healthy dose of skepticism, given these are first-party benchmarks, and AMD's performance claims haven't always been accurate this generation. Some of these results are ray-traced games, and some use rasterization, so if we split those out separately, AMD is claiming a 17-percent lead in raster but a four-percent deficit in ray tracing.

At first glance, this looks reasonable. AMD priced this new card 10 percent below the 4060 Ti 16GB but 13 percent above the 4060 Ti 8GB, both of which we consider awful graphics cards with terrible prices. VRAM capacity also slots in the middle, offering 4GB more than the anemic 8GB 4060 Ti, but you're paying a premium for that privilege. Similarly, while it is cheaper than the 16GB model, Nvidia holds the VRAM advantage there.

AMD might struggle with the 7700 XT compared to some of their RDNA2 models, which are still available to purchase. The RX 6800, in particular, is currently priced around $430 to $450, and in our testing, the 6800 is about 16 percent faster than the 4060 Ti in rasterization. This would indicate similar performance to the 7700 XT at a comparable or lower price while also packing more VRAM, 16GB versus 12GB. So, that will be a very interesting comparison for our final review to see precisely where price-to-performance ratios lie, especially if the 7700 XT can't quite reach these performance figures.

Our initial feeling is that the 7700 XT is too expensive and should be a $400 GPU to fit with the rest of the RDNA3 line-up and older RDNA2 models while also competing strongly with its GeForce rivals. With that pricing, AMD could have claimed a faster GPU with more VRAM at the same price as the RTX 4060 Ti 8GB model. Instead, they have chosen not to press for any big advantage in the mid-range.

RX 7800 XT Performance Claims

The 7800 XT has been pitted against the RTX 4070 in AMD's performance slide (below), showing a four-percent gain on average across these 19 games. This breaks down into a nine-percent advantage in rasterization and a six-percent deficit in ray tracing. That positioning is a bit more favorable than the 7700 XT's, given AMD is offering this at a $100 lower price point than the 4070, not to forget the extra VRAM: 16GB versus 12GB for Nvidia's competitor.

It should also lead to a more significant performance gap compared to the RTX 4060 Ti 16GB at the same price, given that our testing put the 4070 at 28 percent faster than the 4060 Ti, and AMD is claiming a slight lead over the 4070. Obviously, a thorn in AMD's side in this price tier will be ray tracing performance, which AMD suggests is lower than the 4070, and they didn't even test some of the most intensive RT titles, such as Cyberpunk 2077.
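Chaining those two figures together gives a sense of the implied gap. This is just back-of-the-envelope math using the 28 percent from our 4070 testing and the 4 percent from AMD's slide, not a measured result:

lead_4070_over_4060ti = 1.28    # our testing: RTX 4070 ~28% faster than the 4060 Ti
lead_7800xt_over_4070 = 1.04    # AMD's claimed average lead over the RTX 4070

implied = lead_4070_over_4060ti * lead_7800xt_over_4070 - 1
print(f"Implied 7800 XT lead over the 4060 Ti 16GB: ~{implied:.0%}")  # roughly 33%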

Where this model could be shaky is when comparing it to the nearest-priced RDNA2 model – the RX 6800 XT – which is currently available for around $530. Our testing has the RTX 4070 and RX 6800 XT offering similar rasterization performance, while AMD claims a nine-percent raster lead over the 4070. This does point to a better cost per frame for the 7800 XT, but it hinges entirely on the gap between it and the 6800 XT. If the gap is near 10 percent, as suggested, that gives the newer 7800 XT a reasonable value advantage, given it's also slightly cheaper. However, if the cards end up closer to equal, it's only a slight generational uplift. We'll have to see how it all plays out in our comprehensive review.

At least on the graphics card launch side of things, what AMD is showing looks reasonable: not super amazing must-buys, not awfully inadequate models, but just okay. Again, the 7800 XT looks better from what AMD has shown than the 7700 XT, but with both GPUs, it's really going to come down to final performance, features, and value. There are many decent options in this price range, including the RX 6800 XT, RX 6800, RX 6700 XT, and the RTX 4070, which is probably Nvidia's best RTX 40 series GPU outside the 4090. One thing AMD has going for it with this launch is that the 7700 XT takes the crown for the cheapest current-generation GPU offering more than 8GB of VRAM, taking that title from the RTX 4060 Ti 16GB thanks to its $50 lower price tag and 12GB capacity.

AMD Provides More FSR 3 Details

During its Gamescom presentation, AMD also spent some time talking about new features for Radeon GPUs and games as a whole. One of the biggest ones is FSR 3, which is hotly anticipated after the company announced it way back in November of 2022. We're nearly 10 months on from that announcement, and despite revisiting FSR 3 at Gamescom, AMD is still not ready to release this feature and hasn't put a firm release date on it. The best we could get from AMD is it will roll out in "early fall," which could mean as early as September.

We also learned that contrary to recent rumors, AMD is not aiming to have Starfield as their first FSR 3 launch title. In fact, this presentation did not indicate that FSR 3 would be coming to Bethesda's game at all. Starfield is not listed on AMD's future game support slide, and Bethesda is not listed as a partner for FSR 3. AMD specifically told us that Forspoken and Immortals of Aveum would be the first games to receive FSR 3. That's a bit of a blow, given that Starfield is currently AMD's flagship sponsored title. However, FSR 3 is coming to Cyberpunk 2077, which is good to see given the imminent launch of that game's expansion, Phantom Liberty.

We didn't learn much more about FSR 3 than AMD had previously claimed. Their demonstrations show frame generation technology roughly doubling the frame rate, in line with what the company said last year. As with DLSS 3 frame generation, you don't get precisely double the FPS, as seen below, but the technology shows a significant leap in smoothness when enabled. AMD is branding its frame generation tech as "Fluid Motion Frames." When implemented in FSR 3, it uses frame interpolation and motion vectors to generate these additional frames (quite similar to DLSS 3, by the sounds of things).

Where FSR 3 will differ from DLSS 3 is in hardware support. FSR 3 will be available for most graphics cards. When we asked, AMD said support goes back to RDNA1-based GPUs and "a broad range of competitor solutions, including NVIDIA GeForce RTX 20, 30, and 40 Series graphics cards." However, it also recommended using FSR 3 on RDNA2 or RDNA3 GPUs, suggesting the tech is optimized for newer architectures and may suffer on those older cards. It's unclear whether FSR 3 will be blocked entirely from running on GPUs that aren't in this supported list, for example, the trusty old RX 580; it could be a situation where the architecture of older models doesn't include key functionality that FSR 3 requires.

We asked AMD whether FSR 3 uses FSR 2 for temporal upscaling or whether the upscaling component has been upgraded.

"FSR 3 includes the latest version of our temporal upscaling technology, which has been optimized to be fully integrated with the new frame generation technology. However, our main focus with AMD FSR 3 research and development has been on creating high-performance, high-quality frame generation technology that works across a broad range of products."

So it doesn't sound like the temporal upscaler has been upgraded, but it will use the latest version. AMD also said it has optimized FSR 3 to reconstruct game UIs, saying that "UI processing is an integral part of our solution to minimize any impact frame generation can have on it."

Alongside FSR 3, AMD will be integrating frame generation into their driver as a standalone Fluid Motion Frames feature, similar to what it did with Radeon Super Resolution, bringing FSR 1 into the driver. This will offer broad game compatibility and allow you to use a very rudimentary version of frame generation in any title on your Radeon GPU.

It all sounds good in theory, and we're sure many will comment about how this will put AMD ahead in the "generated frames" race, and plenty of fanboy wars will ensue. However, based on what we've seen with DLSS 3 frame generation, game integration and the use of motion vectors are crucial to generating visually decent frames. The frame interpolation component is often the weak link in frame generation, causing blurry, garbled messes every second frame. With DLSS 3, usually only parts of the screen have to rely on interpolation, with the rest being processed primarily using motion vector and game engine data (e.g., shift this entire frame slightly to the left for the next generated frame).
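To make the distinction concrete, here's a toy NumPy sketch, purely illustrative and not AMD's or Nvidia's actual algorithm, contrasting a plain two-frame blend (all a driver-only solution can really do) with a simplified motion-vector reprojection of the kind described above:

import numpy as np

def blend_interpolate(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Pure interpolation: average two rendered frames with no motion data.
    Anything that moved between the frames appears twice at half intensity,
    which is the blurry, ghosted look described above."""
    return ((frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2).astype(frame_a.dtype)

def motion_vector_shift(frame_a: np.ndarray, motion_px: tuple) -> np.ndarray:
    """Motion-vector-assisted generation, simplified to one global vector:
    reproject the previous frame halfway along the engine-supplied motion
    (e.g. 'shift this entire frame slightly to the left')."""
    dy, dx = motion_px[0] // 2, motion_px[1] // 2
    return np.roll(frame_a, shift=(dy, dx), axis=(0, 1))

# Toy 1080p frames: a bright square moves 16 pixels to the right between frames
frame_a = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_b = np.zeros_like(frame_a)
frame_a[500:600, 900:1000] = 255
frame_b[500:600, 916:1016] = 255

ghosted = blend_interpolate(frame_a, frame_b)        # two faint squares: ghosting
reprojected = motion_vector_shift(frame_a, (0, 16))  # one square, halfway along its path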

With AMD saying that Fluid Motion Frames in the driver is just the motion interpolation component – of course it is; the driver cannot access data such as motion vectors – we have pretty substantial question marks over the visual quality of this feature. Without that crucial motion data, is the entire frame just going to be a blurry, smeared mess blending two complete frames together? If it's anything like what we've seen from interpolation in DLSS 3 but now applied to the entire generated frame every single time, it could look awful, and we're not sure it's a feature that will be worth using. So, that one will require some pretty heavy scrutiny before we get excited about it.

New Anti-Lag Plus Feature for RDNA3

Latency is where it gets a bit complicated. You might have spotted on these FSR 3 slides something called "Anti-Lag+," a new feature being announced. It is separate from FSR 3's built-in latency-reducing technology. So, to be clear, when you enable FSR 3, it will automatically apply latency-reducing tech, but then Radeon owners can use Anti-Lag+ on top of this at the driver level. AMD's slides make it seem like Anti-Lag+ is part of FSR 3, like how Nvidia Reflex is part of DLSS 3 and is enabled when frame generation is toggled on. However, AMD clarified this isn't the case.

Anti-Lag+ is a new driver-level latency-reducing feature available exclusively for RDNA3 GPUs and presumably anything else moving forward. It will be available alongside the old Anti-Lag but promises some improvements. AMD explained it to us as follows:

"AMD Radeon Anti-Lag helps synchronize CPU and GPU processing during gameplay to reduce input latency. Anti-Lag's synchronization point, however, is placed at a fixed location in the rendering pipeline. AMD Radeon Anti-Lag+ further reduces latency through intelligent synchronization placement that considers the holistic rendering pipeline. As a result, Anti-Lag+ can further reduce latency where it matters most."

AMD didn't go into specifics, but this seems more akin to Nvidia Reflex than their previous Anti-Lag technology. Reflex intelligently adjusts to minimize latency in the pipeline, effectively providing the benefits of a dynamic frame cap to slightly alleviate any GPU bottleneck, a proven technique for improving latency. Nvidia's technology is integrated into games and is said to include optimizations on a per-title basis. We don't know if Anti-Lag+ works like this yet, and the implementation does differ in that it's driver-side instead of game-side. However, the way AMD has explained it makes it sound like it's trying to achieve something similar. Hopefully it works like Reflex, a great technology that delivers real benefits and one that, up to this point, AMD didn't have a competitor for.

It's also unclear whether Anti-Lag+ will work across all games or just a selection of titles. AMD confirmed that it isn't integrated into games but is driver profile-based. The company seemed to suggest only some titles will support Anti-Lag+. We'll have to see how that all works when it's integrated into the driver.

How Anti-Lag, Anti-Lag+, and FSR 3 intertwine is a bit confusing, especially compared to how DLSS 3 and Reflex operate on the Nvidia side. So here is the best explanation of how it all works:

AMD's FSR 3 frame generation technology has built-in latency-reducing technology that works across all GPUs. Gamers using an RX 7000 series GPU can go into the driver and enable Anti-Lag+ on top of FSR 3's latency-reduction tech in supported titles. Anti-Lag+ is not enabled by default in FSR 3 titles; it's an optional feature that users must turn on manually. Owners of older Radeon GPUs can apply Anti-Lag (the non-plus version) on top of FSR 3's latency-reducing tech instead, again an optional feature.

"The built-in latency reduction technology in AMD FSR 3 will provide valuable latency reduction when using FSR 3 frame generation, and our in-driver latency reduction solutions are a bonus that stacks on top to provide the best possible experience when using FSR 3," AMD explained.

As for why Anti-Lag+ is only supported on RDNA3, AMD said, "Anti-Lag+ [relies] on Intellectual Property that was introduced with AMD RDNA 3 and is not available on previous generations of hardware. At AMD, we always aim to support previous generations of hardware. […] While we are investigating the ability to do […] Anti-Lag+ in earlier hardware, we cannot commit that this is possible."

We would like to know exactly what intellectual property is specific to RDNA3 that is required for this feature, but we'll have to wait to find out.

For regular FSR 2 users, AMD is releasing a new native anti-aliasing mode, which allows you to use FSR 2 technology without any upscaling applied. This is effectively what Nvidia is doing with DLAA. So, it gives AMD parity with Nvidia in games that integrate this feature. This native AA mode will provide gamers with a higher quality anti-aliasing technique than many games' built-in TAA solution, at some cost to performance, while also featuring FSR's built-in sharpening. Many people have been requesting something like this, so AMD is now delivering it.

Hypr-RX

Lastly, we have Hypr-RX, AMD's one-click toggle to apply all its performance-enhancing features. First announced way back with RDNA3 last year, this feature is finally ready and will be available natively in some games and as a driver toggle. Essentially, it aims to make it easy for novice users to enable features like Radeon Boost, Anti-Lag, Super Resolution, and even Fluid Motion Frames by hitting one button. In-game implementations will use FSR when Hypr-RX is on, while the driver toggle will use Radeon Super Resolution. Gamers can then go in and further customize what features are enabled, so they could turn on Hypr-RX and then disable Boost or RSR if they don't like those features.

Hypr-RX is the first time that all of these features can be enabled together. Previously, you couldn't use Boost and Radeon Super Resolution simultaneously. AMD has worked around that limitation, the caveat being that, like Anti-Lag+, Hypr-RX is only supported on RDNA3 GPUs and newer. AMD offered the same technical reason for this as with Anti-Lag+, which seems pretty questionable.

We don't think Hypr-RX will be an essential feature for most enthusiast gamers, as they'll probably want to tweak each of AMD's features individually.

To recap: The RX 7800 XT and RX 7700 XT are coming on September 6th, priced at $500 and $450, respectively. AMD FSR 3 is nearly ready and will be debuting in early fall, starting with Forspoken and Immortals of Aveum. Anti-Lag+, Fluid Motion Frames, and Hypr-RX are also coming to AMD's Radeon driver shortly.


 
The MSRP of the 6800 XT is US$650; if you take inflation into account, it would be around US$750 today. It's dishonest to say that the 7800 XT doesn't offer a better price just because it isn't priced below a product that has clearly been temporarily discounted to clear inventory.

It seems to me to be the best value for money from AMD in recent years.

 
The 7700 XT is very weird. At $450 it doesn't make sense and only exists to upsell the 7800 XT. But this also implies good yields on Navi 32, which is why AMD would want to make the full die option (7800 XT) more appealing. That makes it awkward to slot another product between the $250 7600 and the $450 7700 XT. They can't make Navi 33 any larger, and they don't seem to want to make Navi 32 variants any smaller either, so what are they gonna sell in the $300~$350 range?
 
I find it quite awkward that, among all of what AMD announced today, Tim seems unimpressed at best, or nitpicking at worst.
AMD FSR 3 may be a huge blow to Nvidia's DLSS 3 RTX 40 series "exclusivity," which shows again how anti-consumer and anti-gamer Nvidia is toward its own buyers. This may also better explain why Nvidia rushed to announce DLSS 3.5 as available for all their RTX cards, and why they called it 3.5: to confuse gamers into thinking they have something available for all their RTX cards, just not DLSS 3.
It may be too late for Nvidia to fool gamers from now on, because AMD is making FSR 3 available for more Nvidia cards (RTX 20, 30, and 40 series) than Nvidia did with its own DLSS 3, which is available only for RTX 40 series cards.
Again, it's quite awkward for Tim to disregard Nvidia's DLSS dark pattern of artificial market segmentation, not saying a word about it instead of condemning it or at least acknowledging it. Keeping this loud silence about it, just to be more favorable toward Nvidia, is to the detriment of gamers and TechSpot readers who are not informed about it.
AMD making FSR 3 available for all Nvidia RTX cards could easily have been the headline of a more interesting news presentation. Instead, readers get awkward "silenzio stampa" silence when Nvidia is caught in its own wrongdoing.
But hey, we get an "unimpressed" subtitle like: "Bottom line: Reasonable, not super amazing must-buys, not awfully inadequate models, but just okay".
This is what many would call a wasted tech journalism opportunity, when it could easily have been:
"AMD is coming after Nvidia in a big way, offering FSR 3 for all RTX generations (20, 30, and 40 series), while Nvidia made DLSS 3 available only for its RTX 40 series cards."

Also, the 7800 XT looks promising; let's see if independent benchmarks confirm it.
On paper, both the 7700 XT and 7800 XT are better offerings than their Nvidia counterparts; let's see if benchmarks confirm it.
 
I find it quite awkward that, among all of what AMD announced today, Tim seems unimpressed at best, or nitpicking at worst.
AMD FSR 3 may be a huge blow to Nvidia's DLSS 3 RTX 40 series "exclusivity," which shows again how anti-consumer and anti-gamer Nvidia is toward its own buyers. This may also better explain why Nvidia rushed to announce DLSS 3.5 as available for all their RTX cards, and why they called it 3.5: to confuse gamers into thinking they have something available for all their RTX cards, just not DLSS 3.

AMD is making FSR 3 available for more Nvidia cards (RTX 20, 30, and 40 series) than Nvidia did with its DLSS 3, which is available only for RTX 40 series cards.
Again, it's quite awkward for Tim to disregard Nvidia's dark pattern of artificial market segmentation, not saying a word about it instead of condemning it or at least acknowledging it. Keeping this loud silence about it, just to be more favorable toward Nvidia, is to the detriment of gamers who are not informed about it.
AMD making FSR 3 available for all Nvidia RTX cards could easily have been the headline of a more interesting news presentation. Instead, readers get awkward "silenzio stampa" silence when Nvidia is caught in its own wrongdoing.

I noticed that too. If anything, they always seem to criticize or nitpick AMD more. To be fair, they do it to Nvidia too, but not to the same extent as they do to AMD.


 
Yeah yeah, same old story: ever so slightly cheaper GPUs with slightly more VRAM, but still worse RT performance and power draw, and, given that they have nothing to compete with DLSS 3.5, now tangibly worse image quality.

Good going AMD.
 
Wish they would target fixing their own broken *** drivers instead.

The only time I had problems with an ATI/AMD GPU was with the 5700 XT, and I've been using their hardware since the 3D Rage Pro from 1998... I've built a cr@pload of PCs since it's my job, and even my customers never had any problems... but it's easier to blame the product than ourselves. Most people don't use DDU or clean out the old drivers from the previous GPU when switching from NV to AMD, and that causes 80%+ of the "AMD drivers = bad" BS... because, you know, you would have had the same problems when switching to NV from AMD, and what would you do then? Blame NV for sh!t drivers too?
 
Yeah yeah, same old story: ever so slightly cheaper GPUs with slightly more VRAM, but still worse RT performance and power draw, and, given that they have nothing to compete with DLSS 3.5, now tangibly worse image quality.

Good going AMD.
Every user cares about more VRAM. However, RT is still very niche, power draw is not an issue at all (the RTX 3000 series proves that), and the "guess what the game should look like" DLSS 3.5 is just as useless as DLSS 3.0.

AMD is doing the right things again.
 
Image quality improvements are useless?
If Nvidia's RT/DLSS SR+RR are useless, just imagine how useless AMD's console "RT" + FSR 2 must be.
Define improvement. Many times DLSS/FSR produce worse quality. DLSS 3, on the other hand, increases latency, input lag, or both so much that it's basically useless.

All of those are useless right now. RT might be useful in a few years (remember, RTX 2000 cards launched 5 years ago), but we've been told that same **** for many years now. I have never been a fan of techniques that promise "better" image quality but many times fail to deliver it.
 
Define improvement. Many times DLSS/FSR produce worse quality. DLSS 3, on the other hand, increases latency, input lag, or both so much that it's basically useless.
Overall, in the IQ comparison test across 25 games, HUB gave DLSS Quality 17 points at 1440p, while native + TAA got 11. So go argue with them, not me; I'm going off the review.
FG is not useless even with a slight lag increase, at least not for people who use controllers to play third-person games. What is stupid is Nvidia thinking AI frames can be part of a generational performance increase, which they are not.
And how is the DLSS 3.5 denoiser useless? You forgot to explain that part... HUB seems to think otherwise too.
I think many in the HUB fanbase get mad when the guys are just talking facts.
 
Overall, in the IQ comparison test across 25 games, HUB gave DLSS Quality 17 points at 1440p, while native + TAA got 11. So go argue with them, not me; I'm going off the review.
FG is not useless even with a slight lag increase, at least not for people who use controllers to play third-person games. What is stupid is Nvidia thinking AI frames can be part of a generational performance increase, which they are not.
And how is the DLSS 3.5 denoiser useless? You forgot to explain that part... HUB seems to think otherwise too.
I think many in the HUB fanbase get mad when the guys are just talking facts.
Image quality is a subjective issue. I'm not a fan of "we won't produce faster cards, but instead lower image quality" type of crap. And never will be.

I don't play on consoles, and I have never played a turn-based game that ran too slowly with my current hardware (graphics-wise). For me, there is nothing to see.

DLSS 3.5 is the same "lower quality, more speed" kind of stuff I don't fully support. And never will. That simple. Unlike many, I keep my opinions and won't change them every time something new appears.
 
AMD is doing the right things again.
Right moves? Oh, if that's the case, we should see a huge swing in sales then! Bye bye Nvidia, hello AMD in the Steam charts!

I'm sure AMD's move to try and ban DLSS in sponsored games is a winning strategy as well.
Define improvement.
The denoiser replacement (Ray Reconstruction) is a very clear improvement. AMD barely caught up to feature parity with Nvidia through frame generation (and truthfully, it's not a very useful feature a lot of the time), but this denoiser replacement is actually quite a big step up for image quality.
 
Right moves? Oh, if that's the case, we should see a huge swing in sales then! Bye bye Nvidia, hello AMD in the Steam charts!

I'm sure AMD's move to try and ban DLSS in sponsored games is a winning strategy as well.
Since when do quality, better products, etc. guarantee better sales? Obviously you only started following the tech industry yesterday.

Give me one good reason why DLSS should NOT be banned from games. It's Nvidia-only tech that, in some versions, requires a specific Nvidia generation. Without DLSS, we have the better FSR, which works with basically every card. Just like G-Sync, which was a useless Nvidia locked-down feature that was ripped apart by Adaptive Sync (aka FreeSync).
The denoiser replacement (Ray Reconstruction) is a very clear improvement. AMD barely caught up to feature parity with Nvidia through frame generation (and truthfully, it's not a very useful feature a lot of the time), but this denoiser replacement is actually quite a big step up for image quality.
Still, it is "we don't want to put in more compute power, so we calculate less" type of stuff I won't ever support. Feel free to disagree.
 
I don't play on consoles, and I have never played a turn-based game that ran too slowly with my current hardware (graphics-wise). For me, there is nothing to see.
Then move along and let people enjoy what they want to enjoy, instead of whining all the time.

DLSS 3.5 is the same "lower quality, more speed" kind of stuff I don't fully support. And never will. That simple.

Did you watch any 3.5 coverage, or are you still going purely off your Nvidia hate? RR offers the same performance but better denoising.

Unlike many, I keep my opinions and won't change them every time something new appears.

You're like the Catholic Church in the Middle Ages. Explains a lot.

I'm not a fan of "we won't produce faster cards, but instead lower image quality" type of crap.
Again, you're the one who needs to get your facts straight. The comparison was done vs. native. Native + TAA wins only 9 out of 24 vs. DLSS Quality, and only two of those nine in a significant way (++). I'll trust HUB and my own judgment from owning a 6800 and a 3080, thank you very much.

 
Then move along.
Turn-based games never have any performance issues, so DLSS 3 tries to answer a problem that doesn't even exist. Simple.
Did you watch any 3.5 coverage, or are you still going purely off your Nvidia hate? RR offers the same performance but better denoising.
Just like with other DLSS stuff, it's supposed to give better quality based on guessing things, instead of actually giving more compute power. This way we can soon create games where AI just guesses everything, and that's better than ever.
 
Turn-based games never have any performance issues, so DLSS 3 tries to answer a problem that doesn't even exist. Simple.
Lower motion fluidity is not a problem in turn-based games? Since when? It's like saying a 60 fps video has no benefit over 24 fps. Latency is not a problem there, but fluidity is.

Just like with other DLSS stuff, it's supposed to give better quality based on guessing things, instead of actually giving more compute power. This way we can soon create games where AI just guesses everything, and that's better than ever.
"Guessing things" is how computers work.
It has nothing to do with compute power, but rather improved algorithms. A 4090 without the new denoiser will look worse than a 4060 with it, and no amount of maxing out game sliders will make it equal.

The denoiser replacement (Ray Reconstruction) is a very clear improvement. AMD barely caught up to feature parity with Nvidia through frame generation (and truthfully, it's not a very useful feature a lot of the time), but this denoiser replacement is actually quite a big step up for image quality.

Yeah, can't wait to see that RT denoiser work in Alan Wake 2. So glad; it's been the number one game on my waiting list since its announcement. Now it looks like it'll blow away anything in terms of graphics quality, like Control did years ago.
 
I can understand price inflation, etc., but I don't know what to make of AMD's naming inflation. One may recall the 5700 XT being set against the RTX 2070. Later we got the 6700 XT with sub-3070 performance. Now that AMD has moved to the Radeon 7000 series, the 7800 XT is marketed as the 4070 competitor. Not even a 7800, but a 7800 XT. Why? Is there something x800-like in these new Radeons? We were told that the 7700 XT and 7800 XT are both built from the same silicon. Why not unite these products under the same hood and call them the 7700 and 7700 XT? I personally could live with that naming, and the close pricing would make a bit more sense (like the 5700 and 5700 XT, if anyone recalls them). Maybe a 7700 for $400 would be better (Tim already noted this). And there would be a gen-to-gen performance improvement. Instead, it looks like the 7800 XT may even show a deficit compared to the 6800 XT. Why? The answer was already given: blame AMD for the naming.

Of course, it all stems from the naming of the higher-end cards. When the 7900 XT can't do the job against the 4080, it's hard for AMD to swallow that this generation of Radeons is too weak.
 
Lower motion fluidity is not a problem in turn-based games? Since when? It's like saying a 60 fps video has no benefit over 24 fps.
Not a problem. DLSS 3 requires an RTX 4000 card. The slowest RTX 4000 card is the 4060, which is more than capable of running any turn-based game.

Even if you do have a total trash card like a GT 210 that might benefit from DLSS 3, guess what, it does not support it. It's a totally useless feature that does not solve any problems, because the cards that could benefit from it are not supported. In short: you need the latest tech to benefit from a feature that could be usable with low-end hardware.

And like I said, improving performance in turn-based games is a very small problem. I cannot recall any graphics-related performance problems with them, and I have been playing them for decades. CPU issues have been much, much worse.
"Guessing things" is how computers work.
It has nothing to do with compute power, but rather improved algorithms. A 4090 without the new denoiser will look worse than a 4060 with it, and no amount of maxing out game sliders will make it equal.
CPUs guess but also arrive at the correct answer. GPU guessing is more like "what actually looks better," and that's a subjective issue.
I can understand price inflation, etc., but I don't know what to make of AMD's naming inflation. One may recall the 5700 XT being set against the RTX 2070. Later we got the 6700 XT with sub-3070 performance. Now that AMD has moved to the Radeon 7000 series, the 7800 XT is marketed as the 4070 competitor. Not even a 7800, but a 7800 XT. Why? Is there something x800-like in these new Radeons? We were told that the 7700 XT and 7800 XT are both built from the same silicon. Why not unite these products under the same hood and call them the 7700 and 7700 XT? I personally could live with that naming, and the close pricing would make a bit more sense (like the 5700 and 5700 XT, if anyone recalls them). Maybe a 7700 for $400 would be better (Tim already noted this). And there would be a gen-to-gen performance improvement. Instead, it looks like the 7800 XT may even show a deficit compared to the 6800 XT. Why? The answer was already given: blame AMD for the naming.
Then what would the 7800 be? AMD needs an x800 card, and since the 7900 XT is already cut down from the 7900 XTX, the only possibilities are to make another chip just for the x800 series OR rename the 7900 XT as the x800.

Also, Nvidia launched the 4080 first, so AMD must have a competitor for it. Most people are stupid, so marketing must match competitor model numbers in at least some way.

Like: because Intel has i3, i5, i7, and i9, AMD must also have 3, 5, 7, and 9.
 
You need the latest tech to benefit from a feature that could be usable with low-end hardware.
Incorrect again. FG needs a high-fps input to work well. In fact, that was precisely specified by AMD in their FSR 3 presentation: FSR 3 needs 60+ fps for the intended outcome, or else the frames are too far apart from each other and the hardware + software will not produce an accurate representation.
Plus, FSR 3 taps into async compute to simulate what Nvidia's dedicated OFA does, therefore old cards will never benefit from it, and newer ones will see diminishing returns in games that use async. That's the reason FSR 3 needs a 10 series card and a 20 series is recommended. Neither FG nor FSR 3 will run on a GT 210.
Even the 4060 is too slow for FG, IMO, unless you mean using it at 1080p/high to take 60 fps to 90. Nvidia putting the OFA for FG on the 4060 is pure marketing theater by David Leatherman.

To sum up: Get informed and then we'll talk. That'll be all for now.
 
Incorrect again. FG needs a high-fps input to work well. In fact, that was precisely specified by AMD in their FSR 3 presentation: FSR 3 needs 60+ fps for the intended outcome, or else the frames are too far apart from each other and the hardware + software will not produce an accurate representation.
Plus, FSR 3 taps into async compute to simulate what Nvidia's dedicated OFA does, therefore old cards will never benefit from it, and newer ones will see diminishing returns in games that use async.
Even the 4060 is too slow for FG, IMO, unless you mean using it at 1080p/high to take 60 fps to 90.

To sum up: Get informed and then we'll talk. That'll be all for now.
Not incorrect, but correct. I was talking about turn-based games, where GPU performance is rarely an issue. However, that's also the reason they can be played on low-end hardware. About the only scenario where frame generation would really benefit is playing a turn-based game on a potato GPU. But for that, you already need a fast enough GPU from AMD (for it to work properly) or from Nvidia (to have it supported at all).

The other "usable" scenario is taking already-high FPS and making it very high, while keeping the input lag and/or latency. What's the main point of having high FPS? To have less input lag and/or latency.

That "feature" makes absolutely zero sense.
 