Is AMD finally targeting Nvidia? Radeon RX 7800 XT & RX 7700 XT launch details, FSR 3,...

Not incorrect, but correct. I was talking about turn-based games, where GPU performance is rarely an issue.


Sub-30 fps on an RX 580 at 1080p does look like an issue to me.

What's the main point of having high FPS? To have less input lag and/or latency?
More frames equals higher motion fluidity, and AI-generated frames are still new frames that contribute to that. Turn-based games don't need lower latency, but they do benefit from a higher frame count. Like I said, though, you need a card with at least async compute to make FSR 3 work, so your "GT 240" point is simply invalid. And the more a game already uses async, the less return you'll see. Nvidia's FG solution is hardware-locked, but it doesn't produce lower returns when a game uses async (which has been standard since early DX12).
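To make the fluidity-vs-latency distinction concrete, here is a rough back-of-envelope model (a sketch with made-up numbers, not a description of how FSR 3 or DLSS 3 is actually implemented): generated frames multiply the frames the display sees, but input is only sampled on rendered frames, so the latency floor tracks the base frame rate.

```python
# Back-of-envelope model of interpolation-based frame generation.
# Hypothetical numbers only; real pipelines (FSR 3, DLSS 3) add pacing
# and queueing overhead that this ignores.

def presented_fps(rendered_fps, generated_per_rendered=1):
    """Display-side frame rate: each rendered frame is followed by
    N interpolated frames, so the screen shows (1 + N)x as many."""
    return rendered_fps * (1 + generated_per_rendered)

def input_latency_ms(rendered_fps):
    """Input is only sampled for rendered frames; interpolated frames reuse
    old input. A rough floor: one frame to render plus one frame held back
    so the interpolator has both endpoints."""
    rendered_frame_ms = 1000.0 / rendered_fps
    return 2 * rendered_frame_ms

print(presented_fps(60))      # 120 frames shown per second
print(input_latency_ms(60))   # ~33 ms; the generated frames don't reduce this
```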
 
Give me one good reason why DLSS should NOT be banned from games. It's Nvidia-only tech that, in some versions, requires a specific Nvidia generation. Without DLSS we would still have FSR, which works with basically every card. Just like G-Sync, which was a pointless Nvidia locked-down feature that was ripped apart by Adaptive-Sync (aka FreeSync).

Still, it's "we don't want to put in more compute power, so we calculate less" type of stuff that I won't ever support. Feel free to disagree.
This is just, as a whole, quite a painful comment. Do you follow places like Digital Foundry? Just wondering because they recently addressed DLSS/FSR/XeSS and the reasons they exist and are good for gaming in general.
 
It will be interesting to see if the Steam Deck can leverage FSR 3 for gains.
Same for the Nintendo Switch. However, given the technique's heavy reliance on two key factors (an already relatively short frame time and a light asynchronous shader load), neither the Switch nor the Steam Deck may have sufficient 'oomph' to really make the most of it.

This is the advantage that Intel's XeSS on Arc cards and Nvidia's DLSS on RTX cards have over shader-based upscaling/frame-generation algorithms -- the matrix/tensor cores can be used concurrently with the normal SIMD units and generally aren't used for anything else (aside from FP16 calculations, in Nvidia's case), so running these routines doesn't add to the SIMD load.
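A toy frame-time model of that point (illustrative only; the millisecond figures are invented assumptions, not measurements of any real GPU or game): if the upscaler shares the SIMD units, its cost adds to the frame time, whereas work on otherwise-idle matrix/tensor units can overlap with the rest of the frame.

```python
# Toy frame-time model contrasting upscaling that shares the SIMD units
# with upscaling that runs on otherwise-idle matrix/tensor units.
# All numbers are invented for illustration.

def shared_units_frame_ms(render_ms, upscale_ms):
    # Shader-based upscaling competes for the same SIMD units,
    # so its cost adds to the frame time (unless async compute can hide it).
    return render_ms + upscale_ms

def dedicated_units_frame_ms(render_ms, upscale_ms):
    # Upscaling on dedicated matrix/tensor units can overlap the SIMD work,
    # so the frame takes roughly as long as the slower of the two.
    return max(render_ms, upscale_ms)

render, upscale = 12.0, 1.5  # hypothetical milliseconds per frame
print(shared_units_frame_ms(render, upscale))     # 13.5 ms
print(dedicated_units_frame_ms(render, upscale))  # 12.0 ms
```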
 
The only time I had problems with an ATI/AMD GPU was with the 5700 XT, and I've been using their hardware since the 3D Rage Pro in 1998... I build a crapload of PCs, since it's my job, and even my customers never had any problems... But it's easier to blame the product than ourselves, like most people not using DDU or cleaning out the old drivers from the previous GPU when switching from NV to AMD; that causes 80%+ of the "AMD drivers = bad" nonsense... Because, you know, you would have had the same problems switching to NV from AMD, and what would you do then? Blame NV for bad drivers too?
I've got many modern GPUs, including 7970/290/290X/390/Nano/Fury non-X/RX 480/RX 580/Vega 64/7900 XT, alongside equivalent Nvidia cards (680/780 Ti/Titan/980 Ti/1080 Ti/3080 Ti), and the difference in serious problems is pretty stark across my W7/10 systems (i7 920/980X/4770K/5930K/6850K/7800X/10700KF/i9 10920X). I have no game that doesn't work under Nvidia; with AMD, CM's new engine is a crashfest, AF forcing simply doesn't work in older titles like Operation Flashpoint and Oblivion, and disabling vsync via the driver does not work in certain games like Shadow of Chernobyl. The reverse of all that is my experience with the Nvidia cards. The QC on the driver front is to the point where I will no longer buy AMD.

 
I've got many modern GPUs, including 7970/290/290X/390/Nano/Fury non-X/RX 480/RX 580/Vega 64/7900 XT, alongside equivalent Nvidia cards (680/780 Ti/Titan/980 Ti/1080 Ti/3080 Ti), and the difference in serious problems is pretty stark across my W7/10 systems (i7 920/980X/4770K/5930K/6850K/7800X/10700KF/i9 10920X). I have no game that doesn't work under Nvidia; with AMD, CM's new engine is a crashfest, AF forcing simply doesn't work in older titles like Operation Flashpoint and Oblivion, and disabling vsync via the driver does not work in certain games like Shadow of Chernobyl. The reverse of all that is my experience with the Nvidia cards. The QC on the driver front is to the point where I will no longer buy AMD.

AMD don't even report some of the high-impact issues. I had constant VRAM-leak-related crashes in RDR2, which I later learned were related to using SAM with Vulkan; it was never even mentioned in the open issues. Same with AMD's Adrenalin control center deleting all game profiles after a crash: never acknowledged in the open issue list. The dual-monitor power issue is gone from the list for RX 5000/6000, but not solved.
The Nvidia issues I had (DPC latency, G-Sync, Freestyle filter related) were always acknowledged and fixed within a week or two.
 
This is just, as a whole, quite a painful comment. Do you follow places like Digital Foundry? Just wondering because they recently addressed DLSS/FSR/XeSS and the reasons they exist and are good for gaming in general.
Yeah. From my own testing, 1440p FSR/DLSS just wrecks 1080p native in quality, while performance is similar. Same for 4K DLSS/FSR vs native 1440p. I can't imagine anyone using 1080p anymore, as even FSR 2.0 (the oldest of the modern upscalers, with the most issues) on the Balanced preset looks better and runs the same. Doing your upscaler/reconstruction-technique research is a must when buying a card these days.
 
Overall, in the image-quality comparison across 25 games, HUB gave DLSS Quality 17 points at 1440p, while native + TAA got 11. So go argue with them, not me; I'm going off the review.
FG is not useless even with the slight lag increase, at least not for people who use controllers to play third-person games. What is stupid is Nvidia treating AI frames as part of a generational performance increase, which they are not.
And how is the DLSS 3.5 denoiser useless? You forgot to explain that part... HUB seems to think otherwise too.
I think many in the HUB fanbase get mad when the guys are just stating facts.
Facts. There are only, what, like 3 titles with DLSS 3.5, while the FSR 3 open standard should work on all DX11/12 titles, on competing hardware, and on previous-gen hardware. This is great for the gaming industry. Frame generation has reached FreeSync status. Maybe this is why Nvidia needed to show off DLSS 3.5 in less than a handful of titles to fortify its position of dominance. Useful or not, frame-gen charts showing 3x performance on 4000-series cards are now a thing of the past.
 
Yeah. From my own testing, 1440p FSR/DLSS just wrecks 1080p native in quality, while performance is similar. Same for 4K DLSS/FSR vs native 1440p. I can't imagine anyone using 1080p anymore, as even FSR 2.0 (the oldest of the modern upscalers, with the most issues) on the Balanced preset looks better and runs the same. Doing your upscaler/reconstruction-technique research is a must when buying a card these days.
It was Digital Foundry's talking points around UE5 performance and the reasons why DLSS/FSR/XeSS exist.


Essentially, ray tracing is the future; getting much better lighting and all the rest of what encompasses the next generation of video game graphics means we're putting more pressure on GPUs than ever before.

UE5 is giving us higher-quality pixels, but the stress on the GPU means it can't produce a native amount of them, so upscaling is a requirement.

Also, the latest tech in DLSS 3.5 makes an actual, legitimate change to the quality of ray-tracing denoising, proving that having these dedicated upscalers is a net positive.
 
Every user cares about more VRAM. However, RT is still very niche, power draw is not an issue at all (the RTX 3000 series proves that), and "guess what the game should look like" DLSS 3.5 is just as useless as DLSS 3.0.

AMD is doing the right things again.
If power draw is not an issue because of the RTX 3000 series, then by the same logic RT and DLSS clearly matter. Look at the RTX 3000 series.

Man, can you stop being biased for a single post? Can you?
 
It was Digital Foundry's talking points around UE5 performance and the reasons why DLSS/FSR/XeSS exist.


Essentially, ray tracing is the future; getting much better lighting and all the rest of what encompasses the next generation of video game graphics means we're putting more pressure on GPUs than ever before.

UE5 is giving us higher-quality pixels, but the stress on the GPU means it can't produce a native amount of them, so upscaling is a requirement.

Also, the latest tech in DLSS 3.5 makes an actual, legitimate change to the quality of ray-tracing denoising, proving that having these dedicated upscalers is a net positive.
That's why we need AMD to keep competing with Nvidia, to keep them on their toes. Honest question: do you think Nvidia would have released DLSS 3.5 right now if it wasn't for FSR 3.0? Hence the timing of it all?
 
Facts. There are only, what, like 3 titles with DLSS 3.5, while the FSR 3 open standard should work on all DX11/12 titles, on competing hardware, and on previous-gen hardware. This is great for the gaming industry. Frame generation has reached FreeSync status. Maybe this is why Nvidia needed to show off DLSS 3.5 in less than a handful of titles to fortify its position of dominance. Useful or not, frame-gen charts showing 3x performance on 4000-series cards are now a thing of the past.
Facts: FSR 3 is frame generation and DLSS 3.5 is an RT denoiser. No idea why anyone would compare them. It's like saying camels are better than pencils.
FSR 3 is limited to using async compute and will be hit or miss until it gets dedicated hardware on die, like the OFA on RTX 40.
In two years' time it will end up like DLSS 2 vs FSR 2: FSR 3 will hit a wall without dedicated hardware/software for AI, while DLSS 3 will get better and more functional.
IMO that is the reason AMD is so shady with sponsorships and blocking DLSS/XeSS: FSR 2.2 is probably as good as a non-ML upscaler will ever get, while DLSS keeps getting better and more functional (a better denoiser running on tensor cores, no need to tap into FP32/INT32 resources to use it). The decision not to include AI hardware on RDNA 1/2 will hamper what RDNA 3 buyers get in the future.
 
Facts: FSR 3 is frame generation and DLSS 3.5 is an RT denoiser. No idea why anyone would compare them. It's like saying camels are better than pencils.
FSR 3 is limited to using async compute and will be hit or miss until it gets dedicated hardware on die, like the OFA on RTX 40.
It's called competition, which is a good thing for gamers, but this might trigger some brand loyalists.
 
It's called competition, which is a good thing for gamers, but this might trigger some brand loyalists.
Illusion of competition, you mean. It's the same competition as FSR 2 vs DLSS SR+RR: without dedicated ML hardware, AMD will always be limited in what they can do. Seeing dedicated FP16/FP8 hardware on AMD cards is inevitable; RDNA 3 already has it, but it won't be used for image reconstruction or frame generation because RDNA 1/2 don't have it. So AMD keeps buying time with shady sponsorships.
 

Sub-30 fps on an RX 580 at 1080p does look like an issue to me.
A seven-year-old card with a $229 MSRP running a AAA game does not sound too bad. Baldur's Gate 3 is more of a AAA game than a turn-based game. There are very few turn-based AAA games overall.
More frames equals higher motion fluidity, and AI-generated frames are still new frames that contribute to that. Turn-based games don't need lower latency, but they do benefit from a higher frame count. Like I said, though, you need a card with at least async compute to make FSR 3 work, so your "GT 240" point is simply invalid. And the more a game already uses async, the less return you'll see. Nvidia's FG solution is hardware-locked, but it doesn't produce lower returns when a game uses async (which has been standard since early DX12).
Motion fluidity can also be achieved with a higher real framerate. FYI, many turn-based games have an FPS limiter, since not much FPS is really needed and more FPS = more power consumption. Also, in turn-based games graphics are rarely the main "thing". Frame generation is one step forward and at least one step backwards.

My point is that frame generation would be useful for making weak cards faster, so they could be used to play games that are normally unplayable. But because this would mean fewer GPU sales, it won't happen, of course. That's why the whole idea makes no sense at all.
This is just, as a whole, quite a painful comment. Do you follow places like Digital Foundry? Just wondering because they recently addressed DLSS/FSR/XeSS and the reasons they exist and are good for gaming in general.
People here seem to comment that new GPUs are not better than the previous generation. That is because these technologies mean there is no need to develop better GPUs; just add some new software solutions and that's it.

So anyone who complains about the lack of GPU development should also despise those image-guessing technologies. Make your choice. I have already decided I'd rather have more processing power than "better" image guessers.
If power draw is not an issue because of the RTX 3000 series, then by the same logic RT and DLSS clearly matter. Look at the RTX 3000 series.

Man, can you stop being biased for a single post? Can you?
But why? If power consumption was really an issue, then nobody would have bought the RTX 3000 series. Also, no one would buy Intel's heater CPUs. But when AMD has higher power draw, then it's a serious issue. I call that reverse logic.

No idea why RT is an issue.
 
Illusion of competition, you mean. It's the same competition as FSR 2 vs DLSS SR+RR: without dedicated ML hardware, AMD will always be limited in what they can do. Seeing dedicated FP16/FP8 hardware on AMD cards is inevitable; RDNA 3 already has it, but it won't be used for image reconstruction or frame generation because RDNA 1/2 don't have it. So AMD keeps buying time with shady sponsorships.
Well, yes, there is definitely a lack of competition in this space. The fact that gamers got thrown a bone with DLSS 3.5 and FSR 3.0, when the ROI on AI is more than 10x, is a good thing IMO. There is also Intel with their XeSS improvements in the background. I believe this was a great week for gamers with these two technologies. As a 4090 owner I want Nvidia to keep improving DLSS. I honestly believe that if FSR 3 wasn't going to be announced this week, Nvidia would be delaying Ray Reconstruction with DLSS 3.5. Nvidia pulled the same thing with DLSS 2 when FSR dropped.
 
But why? If power consumption was really an issue, then nobody would have bought the RTX 3000 series. Also, no one would buy Intel's heater CPUs. But when AMD has higher power draw, then it's a serious issue. I call that reverse logic.

No idea why RT is an issue.
If RT wasn't good, then people wouldn't have bought RTX 3000 GPUs :cool:
 
Well, yes, there is definitely a lack of competition in this space. The fact that gamers got thrown a bone with DLSS 3.5 and FSR 3.0, when the ROI on AI is more than 10x, is a good thing IMO. There is also Intel with their XeSS improvements in the background. I believe this was a great week for gamers with these two technologies. As a 4090 owner I want Nvidia to keep improving DLSS. I honestly believe that if FSR 3 wasn't going to be announced this week, Nvidia would be delaying Ray Reconstruction with DLSS 3.5. Nvidia pulled the same thing with DLSS 2 when FSR dropped.
Yeah, Nvidia usually has something up their sleeve for every AMD event. That new RT denoiser looks sweeeeet, especially since Alan Wake 2 has been my most anticipated game for ages. It will probably be my first day-one PC game purchase in 7 or 8 years, as I prefer to buy them fixed and discounted.
 
Incorrect again. FG needs a high-fps input to work well. In fact, that was precisely specified by AMD in their FSR 3 presentation: FSR 3 needs 60+ fps for the intended outcome, or else the frames are too far apart and the hardware + software will not produce an accurate representation.
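Some quick arithmetic on why the 60+ fps recommendation matters (a sketch of the frame-spacing argument only; AMD's published guidance is just the fps figure, the rest is illustrative): the lower the base frame rate, the wider the motion gap the interpolated frame has to bridge.

```python
# Why a 60+ fps base matters for interpolation: the time gap between the two
# real frames grows as the base frame rate drops, and the interpolator has to
# invent all of the motion inside that gap. Illustrative arithmetic only.

for base_fps in (30, 45, 60, 90):
    gap_ms = 1000.0 / base_fps  # spacing between consecutive rendered frames
    print(f"{base_fps:>3} fps base -> {gap_ms:5.1f} ms between real frames")

# At 30 fps the generated frame has to bridge ~33 ms of motion, so fast motion
# and disocclusions produce far more visible artifacts than at 60 fps (~17 ms).
```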

Plus, FSR 3 taps into async compute to simulate what Nvidia's dedicated OFA does, so old cards will never benefit from it, and new ones will see diminishing returns in games that already use async. That's the reason FSR 3 needs a 10-series card at minimum, with a 20-series recommended. Neither FG nor FSR 3 will run on a GT 210.
Even the 4060 is too slow for FG, IMO, unless you mean using it at 1080p/high to take 60 fps to 90. Nvidia putting the OFA for FG on the 4060 is pure marketing theater by David Leatherman.

To sum up: Get informed and then we'll talk. That'll be all for now.

AMD's vendor-agnostic FSR 3 has made Nvidia's upscaling technologies irrelevant.

All the RTX 2000 (even GTX) owners already know this, as they already got suckered by Nvidia's marketing gimmicks on past purchases. Many here (and elsewhere) can't see through Nvidia's marketing, but EVGA and all the luminaries can see how badly Nvidia is trying to offload and sell non-gaming architecture to gamers, in order to profit Ada Lovelace's main purpose: enterprise.

RTX 40 can hardly beat RDNA 2... so all Nvidia can do is keep touting their proprietary gimmicks in the hope that people don't see through the lack of actual performance.


There is a reason Nvidia doesn't show off its RTX 40 technologies using an RTX 4060... it's as slow as a 2080 and can't muster any frames in RT either. It's always someone with an RTX 4090 at 4K touting how great RTX technologies are, but DLSS doesn't scale well at 1080p, nor does it work well in fast-paced 1440p (it's a blurry mess). It's only good at 4K and above, where upscaling quality comes into play. Below 4K, native res will always be better and more precise for people who buy GPUs to actually game on. (IMO, playing a single-player game with a controller is not actual hardcore PC gaming; how/why would performance even matter, let alone input latency, etc.?)

Two things:
- I have never met a person who has upgraded their GPU for a single-player game, only for multiplayer competitive types of games.
- Anyone playing a single-player game can now just use FSR 3 and get FREE frames... why would anyone ever have to buy an RTX GPU to use NV's (paid-for) FG?

The people who need frame generation the most are the ones with the weakest graphics cards. Thank you, AMD, for helping RTX 20 owners.
 
why would anyone ever have to buy an RTX GPU to use NV's (paid-for) FG?
Because they are faster in raster, much faster in RT, consume way less power and have more features. The 4060 vs the 7600, for example: the former supports DLSS, FSR, FG and FSR 3. Why would you buy the inferior 7600?
 
AMD's vendor-agnostic FSR 3 has made Nvidia's upscaling technologies irrelevant.

All the RTX 2000 (even GTX) owners already know this, as they already got suckered by Nvidia's marketing gimmicks on past purchases. Many here (and elsewhere) can't see through Nvidia's marketing, but EVGA and all the luminaries can see how badly Nvidia is trying to offload and sell non-gaming architecture to gamers, in order to profit Ada Lovelace's main purpose: enterprise.

RTX 40 can hardly beat RDNA 2... so all Nvidia can do is keep touting their proprietary gimmicks in the hope that people don't see through the lack of actual performance.


There is a reason Nvidia doesn't show off its RTX 40 technologies using an RTX 4060... it's as slow as a 2080 and can't muster any frames in RT either. It's always someone with an RTX 4090 at 4K touting how great RTX technologies are, but DLSS doesn't scale well at 1080p, nor does it work well in fast-paced 1440p (it's a blurry mess). It's only good at 4K and above, where upscaling quality comes into play. Below 4K, native res will always be better and more precise for people who buy GPUs to actually game on. (IMO, playing a single-player game with a controller is not actual hardcore PC gaming; how/why would performance even matter, let alone input latency, etc.?)

Two things:
- I have never met a person who has upgraded their GPU for a single-player game, only for multiplayer competitive types of games.
- Anyone playing a single-player game can now just use FSR 3 and get FREE frames... why would anyone ever have to buy an RTX GPU to use NV's (paid-for) FG?

The people who need frame generation the most are the ones with the weakest graphics cards. Thank you, AMD, for helping RTX 20 owners.
We'll see how this "irrelevant" thing goes for AMD without buying more sponsorships. They're paying studios for damage control instead of leveraging the hardware they put on RDNA 3, which doesn't seem like a good plan for the future. Read TechPowerUp's or Hardware Unboxed's reviews on DLSS 2 vs FSR 2; they'll tell you all about how "irrelevant" DLSS is.
 
Because they are faster in raster, much faster in RT, consume way less power and have more features. The 4060 vs the 7600, for example: the former supports DLSS, FSR, FG and FSR 3. Why would you buy the inferior 7600?

It went over your head...
Why would anyone who owns an RTX 2080, a GTX 1080 Ti, or an RTX 2060 need to buy a new video card to get frame generation, when AMD just gave their current card the ability to do frame generation... and upscaling?

There is no need to buy into Nvidia's paywall now. Especially when you've already bought in and can't use the RTX On gimmicks.

We'll see how this "irrelevant" thing goes for AMD without buying more sponsorships. They're paying studios for damage control instead of leveraging the hardware they put on RDNA 3, which doesn't seem like a good plan for the future. Read TechPowerUp's or Hardware Unboxed's reviews on DLSS 2 vs FSR 2; they'll tell you all about how "irrelevant" DLSS is.

All AMD did was shock some reality into average-Joe gamers, who only knew Nvidia's TWIMTBP marketing. You are feigning shock over an AMD exclusive... how does that hurt you, when your card can use FSR 3? (Explain.)

Nvidia had exclusive rights to Cyberpunk and locked the XSX out of having RT in the console version for 14 months, because Nvidia didn't want RTX 20 & 30 owners to feel scammed over the "RTX On" comments... (Imagine Cyberpunk releasing with RT on consoles from day one... what would the RTX 20 owner be buying then?)

Again, DLSS 3 is irrelevant because it only works on 4% of PC gamers' systems. FSR 3 works on nearly all of them.
 
It went over your head...
Why would anyone who owns an RTX 2080, a GTX 1080 Ti, or an RTX 2060 need to buy a new video card to get frame generation, when AMD just gave their current card the ability to do frame generation... and upscaling?

There is no need to buy into Nvidia's paywall now. Especially when you've already bought in and can't use the RTX On gimmicks.
What you are saying makes no sense whatsoever. Why would you NOT buy an Nvidia card, since they get ALL the features while AMD cards don't? Buying Nvidia is a no-brainer since their cards support both DLSS and FSR, both FG and FSR 3, etc.
 
It went over your head...
Why would anyone who owns an RTX 2080, a GTX 1080 Ti, or an RTX 2060 need to buy a new video card to get frame generation, when AMD just gave their current card the ability to do frame generation... and upscaling?

There is no need to buy into Nvidia's paywall now. Especially when you've already bought in and can't use the RTX On gimmicks.



All AMD did was shock some reality into average-Joe gamers, who only knew Nvidia's TWIMTBP marketing. You are feigning shock over an AMD exclusive... how does that hurt you, when your card can use FSR 3? (Explain.)

Nvidia had exclusive rights to Cyberpunk and locked the XSX out of having RT in the console version for 14 months, because Nvidia didn't want RTX 20 & 30 owners to feel scammed over the "RTX On" comments... (Imagine Cyberpunk releasing with RT on consoles from day one... what would the RTX 20 owner be buying then?)

Again, DLSS 3 is irrelevant because it only works on 4% of PC gamers' systems. FSR 3 works on nearly all of them.
Sorry, pal, no beef with you, but I heard this "DLSS is irrelevant" nonsense when FSR launched. Now see HUB's recent review of DLSS/FSR/XeSS vs native and figure out how that went. 4% is probably the 3060 Ti's market share alone.
 
Wish they would target fixing their own broke *** drivers instead
I've used Nvidia for years and never had an AMD card, but I know for a fact that Nvidia has some pretty bad drivers out there.

Nvidia has released some ghetto drivers over the years, and they still do. Those who claim AMD drivers are broken are either biased or just had a sour experience.

Nvidia has released several drivers where they burned out cards.
They have had drivers that cause high CPU usage.
They have had drivers where multi-monitor setups wouldn't work properly.
They have had drivers where SLI configurations wouldn't work properly.
They have had drivers where all shadows had green pixelation.
They have had drivers where video playback would fail.
They have had drivers that caused crashes.
...and the list goes on.

You can't claim AMD has bad/broken drivers without pointing your finger at Nvidia for having just as many driver issues.
 