Nvidia RTX 5080 might have a 24GB variant, RTX 5090 set to offer DisplayPort 2.1a

The high end has always been closer to $1,000. The 8800 GTX was $950 adjusted for inflation; the Ultra, adjusted, was about $1,100.

The $500 flagship era was a product of the Great Financial Crisis. Mix that with hyperinflation from injecting over $6 TRILLION into the economy, tariffs, and Nvidia's monopoly on the high end, and you can kiss those $500 GTX 580s goodbye.
The $1,000 trend started in 2018 when the 2080 Ti launched at $999. Officially. In reality it sold for around $1,200 for most of its shelf life. AMD followed in 2020 with the $999 6900 XT.

However, looking at the history, the 8800 GTX and 8800 Ultra were exceptions. Most flagships hovered in the $499-$649 range, with some deviations below or above that.

And adjusting for inflation is meaningless. You can't buy these cards new today, so why use today's dollar value? People who bought the 8800 Ultra back in the day did not pay $1,100 for it. They paid $829.

That's like saying, ten years from now, that 4090s used to cost $2,500 or so adjusted for inflation. It's a meaningless metric meant to make people accept current prices as somehow OK.
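For reference, the "adjusted for inflation" figures being thrown around come from a simple CPI ratio. A minimal sketch, with approximate US CPI-U annual averages hardcoded as assumptions:

```python
# CPI-based inflation adjustment. The index values are assumptions:
# approximate US CPI-U annual averages, hardcoded for illustration.
CPI = {2007: 207.3, 2024: 313.7}

def adjust(price, from_year, to_year=2024):
    """Scale a historical price by the ratio of CPI index values."""
    return price * CPI[to_year] / CPI[from_year]

print(f"8800 Ultra: $829 in 2007 ~= ${adjust(829, 2007):,.0f} in 2024 dollars")
# The exact result (~$1,250 here) shifts with the CPI vintage you pick,
# which is part of why these comparisons feel so slippery.
```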

I mean, even the double-sandwich 7950 GX2 used to cost only $599. These days you're lucky if you can get a midrange card for that, much less a flagship dual-GPU card.
Since I'm lazy, I asked ChatGPT to list all the prices from the last 20 years:

Here is the extended list of flagship gaming GPUs from **AMD (ATI)** and **NVIDIA** over the past **20 years**, along with their MSRP at launch:

### AMD (ATI):
1. **Radeon RX 7900 XTX**
- **Launch Year:** 2022
- **MSRP:** $999
2. **Radeon RX 6900 XT**
- **Launch Year:** 2020
- **MSRP:** $999
3. **Radeon RX 5700 XT**
- **Launch Year:** 2019
- **MSRP:** $399
4. **Radeon RX Vega 64**
- **Launch Year:** 2017
- **MSRP:** $499
5. **Radeon R9 Fury X**
- **Launch Year:** 2015
- **MSRP:** $649
6. **Radeon R9 290X**
- **Launch Year:** 2013
- **MSRP:** $549
7. **Radeon HD 7970**
- **Launch Year:** 2011
- **MSRP:** $549
8. **Radeon HD 6970**
- **Launch Year:** 2010
- **MSRP:** $369
9. **Radeon HD 5870**
- **Launch Year:** 2009
- **MSRP:** $379
10. **Radeon HD 4870 X2**
- **Launch Year:** 2008
- **MSRP:** $549
11. **Radeon HD 2900 XT**
- **Launch Year:** 2007
- **MSRP:** $399
12. **Radeon X1950 XTX**
- **Launch Year:** 2006
- **MSRP:** $449
13. **Radeon X1800 XT**
- **Launch Year:** 2005
- **MSRP:** $549
14. **Radeon X850 XT Platinum Edition**
- **Launch Year:** 2004
- **MSRP:** $549
15. **Radeon 9800 XT**
- **Launch Year:** 2003
- **MSRP:** $499

### NVIDIA:
1. **GeForce RTX 4090**
- **Launch Year:** 2022
- **MSRP:** $1,599
2. **GeForce RTX 3090**
- **Launch Year:** 2020
- **MSRP:** $1,499
3. **GeForce RTX 2080 Ti**
- **Launch Year:** 2018
- **MSRP:** $999
4. **GeForce GTX 1080 Ti**
- **Launch Year:** 2017
- **MSRP:** $699
5. **GeForce GTX 980 Ti**
- **Launch Year:** 2015
- **MSRP:** $649
6. **GeForce GTX 780 Ti**
- **Launch Year:** 2013
- **MSRP:** $699
7. **GeForce GTX 680**
- **Launch Year:** 2012
- **MSRP:** $499
8. **GeForce GTX 580**
- **Launch Year:** 2010
- **MSRP:** $499
9. **GeForce GTX 480**
- **Launch Year:** 2010
- **MSRP:** $499
10. **GeForce GTX 280**
- **Launch Year:** 2008
- **MSRP:** $649
11. **GeForce 8800 Ultra**
- **Launch Year:** 2007
- **MSRP:** $829
12. **GeForce 7800 GTX 512**
- **Launch Year:** 2005
- **MSRP:** $649
13. **GeForce 6800 Ultra**
- **Launch Year:** 2004
- **MSRP:** $499
14. **GeForce FX 5950 Ultra**
- **Launch Year:** 2003
- **MSRP:** $499

This list now covers the **last 20 years** of flagship GPUs from AMD and NVIDIA, including their launch prices.
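As a quick sanity check of the earlier "$499-$649" claim, the median launch MSRP can be computed directly from the list above (prices transcribed as given; treat them with the same caution as any ChatGPT output):

```python
import statistics

# Launch MSRPs transcribed from the list above, newest first.
nvidia = [1599, 1499, 999, 699, 649, 699, 499, 499, 499, 649, 829, 649, 499, 499]
amd = [999, 999, 399, 499, 649, 549, 549, 369, 379, 549, 399, 449, 549, 549, 499]

print("NVIDIA median:", statistics.median(nvidia))              # 649.0
print("AMD median:", statistics.median(amd))                    # 549
print("NVIDIA median pre-RTX:", statistics.median(nvidia[3:]))  # 649 (1080 Ti and older)
```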
 

What are you trying to say here? Advanced chip design gets more expensive with every new node, and TSMC has raised prices many times over the last 5-10 years. Also, high-end chips got way bigger, which makes them more costly to make.

Back when flagship GPUs were cheaper, they stayed relevant for less time because of huge gen-to-gen improvements, meaning you needed to upgrade much sooner.

Besides, most PC gamers use 1440p or less and won't need an enthusiast-class GPU made for 4K/UHD or higher; with upscaling, though, most mid-end GPUs will handle 4K anyway.


About 10 years ago the Fury X launched for 650-700 dollars. That's like 900 dollars today, and it was DOA with 4GB of VRAM, which is why the 980 Ti curbstomped it over the years. Also, the 980 Ti / GM200 is the best overclocker of all time. My 980 Ti was overclocked by about 40% and was destroying the Fury X completely.

Multi-GPU also died. Back in the day, people sometimes bought 2-4 GPUs. Today they don't; they buy one massive GPU instead. The enthusiast-class tier is for... yeah, enthusiasts.

So nah, the price of having the absolute best did not really change. Many people used multiple GPUs back then, including myself, buying 2-3 GPUs at 500+ dollars a pop.

And low- to mid-end gaming GPUs are cheaper than ever, and they last a lot longer too.
 
What are you trying to say here? Advanced chip design gets more expensive with every new node, and TSMC has raised prices many times over the last 5-10 years. Also, high-end chips got way bigger, which makes them more costly to make.
Odd how this argument only seems to apply to Nvidia. AMD seems able to produce advanced chips on new nodes for the same or even lower prices. High-end chips never got bigger, especially Nvidia's, which were always near the reticle limit and still are today.
Back when flagship GPUs were cheaper, they stayed relevant for less time because of huge gen-to-gen improvements, meaning you needed to upgrade much sooner.
Tell that to $699 1080 Ti owners.
About 10 years ago the Fury X launched for 650-700 dollars. That's like 900 dollars today, and it was DOA with 4GB of VRAM, which is why the 980 Ti curbstomped it over the years. Also, the 980 Ti / GM200 is the best overclocker of all time. My 980 Ti was overclocked by about 40% and was destroying the Fury X completely.
As if the 980 Ti itself had massive VRAM? It was 6GB. For the time, 4GB was enough.
And low- to mid-end gaming GPUs are cheaper than ever, and they last a lot longer too.
Not sure what you're smoking here. Nvidia wants over $400 for a 50-class GPU with a 128-bit bus that is branded as a 4060 Ti. And the measly 8GB ensures it will need replacing really soon.
Oh, and the 16GB version of this card carries a massive $100 markup, nowhere near what it costs Nvidia to add those extra 8GB.
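For a rough sense of scale, here's a back-of-envelope sketch of what the extra 8GB might cost in BOM terms. Every figure is a labeled assumption for illustration (GDDR6 contract pricing was widely reported in the low single digits per GB around 2023), not an actual Nvidia cost:

```python
# Back-of-envelope BOM delta for the extra 8GB. Every figure below is an
# assumption for illustration, not an actual Nvidia cost.
extra_gb = 8
assumed_dram_per_gb = 3.50   # USD/GB, hypothetical 2023-ish GDDR6 contract price
assumed_board_extra = 10.00  # USD, guessed clamshell PCB/assembly overhead

bom_delta = extra_gb * assumed_dram_per_gb + assumed_board_extra
print(f"Estimated extra BOM: ~${bom_delta:.0f} vs. a $100 retail markup")
# ~$38 under these assumptions; the remainder would be margin.
```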
 

AMD is not competing in the high end anymore, so you don't really know. The 7900 series is MCM, not a single chip.

The 1080 Ti has been garbage for years at this point and is considered low-end today. Why do people keep bringing this up? You own one? LMAO.

The 980 Ti had 50% more VRAM than Fury, and no, 4GB was not enough for all games on highest settings for several years after release; 6GB did it with ease, though. FYI, AMD released the 390 series with 8GB at the same time as Fury and praised the amount of VRAM there, all while the Fury line was limited to 4GB due to HBM. The 980 Ti smacked the Fury X silly, both at release and even more a few years later. The Fury X could not OC at all (which did not stop Lisa Su from calling it an overclocker's dream) while GM200 was the best overclocker of all time, with custom cards hitting 1500-1600 MHz or even higher. Massive OC headroom.



4060 Ti 16GB performs exactly the same as the 8GB version: https://www.techpowerup.com/review/nvidia-geforce-rtx-4060-ti-16-gb/

"No significant performance gains from 16 GB VRAM"

Even the 4060 Ti 8GB beats the Radeon 6800 16GB at 1440p in God of War Ragnarok, which is funny, while using a third of the power, with DLSS and Frame Gen as options on top.


Cards like the 3070 and 6700 XT are still good 4 years after release. Back when flagship GPUs were 500 dollars, they barely lasted 2 years before becoming obsolete or dying due to cheap production.

Upscaling is a lifesaver for longevity and DLSS wins hands down. Upscaling beats VRAM any day for longevity.

Again, people with high demands used more than a single GPU back then.

AMD is losing market share quarter after quarter and is around 10% now. They have nothing, really. Nvidia dominates all GPU markets with ease, and you can ramble all day long about VRAM; it does not change reality. Nvidia beats AMD while using less VRAM, and Nvidia has superior features as well: RT that actually works, etc.

I'll take a fast GPU with less VRAM over a slow GPU with more VRAM any day of the week. Luckily I don't have to choose, because I use a 4090 and have been since September 2022.
 
It amazes me that people are baffled every time a new generation comes out, with a higher price tag.

You know chip production gets more and more expensive, the more advanced it gets, right!?

You expect huge gains, and lower prices! LOL! Not going to happen, friends!
If you can't afford to buy, stop reading about high-end hardware, is my suggestion!
 
AMD is not competing in the high end anymore, so you don't really know. The 7900 series is MCM, not a single chip.
AMD is not competing at the high end NOW. That doesn't mean it will be like this forever. I don't remember such doom and gloom before the RDNA1 release, which had the 5700 XT as its fastest card. The 7900, despite being MCM, was still an above-average-size chip manufactured on 5nm, and yet somehow AMD managed to retain 6000-series prices at the high end.
The 1080 Ti has been garbage for years at this point and is considered low-end today. Why do people keep bringing this up? You own one? LMAO.
It has been 8 years. Today it's about 7-8% slower than the 7600 and 4060.
But it held up incredibly well for a long time and did not cost an arm and a leg at launch. Plus, with 11GB of VRAM, it had a longer lifespan than most cards. Despite not having Maxwell levels of OC headroom, it still managed a solid 25-35%, going from ~1500 MHz to 2000 MHz and beyond.
The 980 Ti had 50% more VRAM than Fury, and no, 4GB was not enough for all games on highest settings for several years after release; 6GB did it with ease, though.
4GB was not enough but somehow 6GB was? Don't make me laugh.
FYI, AMD released the 390 series with 8GB at the same time as Fury and praised the amount of VRAM there, all while the Fury line was limited to 4GB due to HBM. The 980 Ti smacked the Fury X silly, both at release and even more a few years later. The Fury X could not OC at all (which did not stop Lisa Su from calling it an overclocker's dream) while GM200 was the best overclocker of all time, with custom cards hitting 1500-1600 MHz or even higher. Massive OC headroom.
8GB was nice to have but overkill at the time. I'm not saying Maxwell was not a good architecture. No need to get all defensive.
4060 Ti 16GB performs exactly the same as the 8GB version: https://www.techpowerup.com/review/nvidia-geforce-rtx-4060-ti-16-gb/

"No significant performance gains from 16 GB VRAM"

Even the 4060 Ti 8GB beats the Radeon 6800 16GB at 1440p in God of War Ragnarok, which is funny, while using a third of the power, with DLSS and Frame Gen as options on top.
It does not.
The 8GB version has worse frametimes and 1% lows.
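For anyone unclear on what "1% lows" measure: it's typically the average FPS over the slowest 1% of frames in a capture, which is exactly where VRAM-starved cards fall apart while average FPS still looks fine. A minimal sketch with a hypothetical trace (definitions vary slightly between tools):

```python
import statistics

def one_percent_low_fps(frametimes_ms):
    """Average FPS over the slowest 1% of frames (definitions vary by tool)."""
    worst = sorted(frametimes_ms, reverse=True)   # longest frame times first
    n = max(1, len(worst) // 100)                 # the slowest 1% of frames
    return 1000.0 / statistics.mean(worst[:n])

# Hypothetical capture: steady 10 ms frames plus a few 40 ms stalls,
# the kind of hitching you get when textures spill out of VRAM.
trace = [10.0] * 990 + [40.0] * 10
print(f"average: {1000.0 / statistics.mean(trace):.0f} FPS")   # ~97 FPS
print(f"1% low:  {one_percent_low_fps(trace):.0f} FPS")        # 25 FPS
```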

I can also produce links that show the 4060 Ti 8GB losing to the 6800 16GB even at 1080p, and obviously by more at 1440p and 4K.

Using less power is not much of a win considering the 4060 Ti is manufactured on 4nm while the 6800 was on 7nm, not to mention the chip size itself. It would be odd if it did not consume less power.
Cards like the 3070 and 6700 XT are still good 4 years after release. Back when flagship GPUs were 500 dollars, they barely lasted 2 years before becoming obsolete or dying due to cheap production.
These also cost less than $500. The 6700 XT even has 12GB.
Upscaling is a lifesaver for longevity and DLSS wins hands down. Upscaling beats VRAM any day for longevity.
What should I prefer:
1. A soft, blurry image upscaled from 720p, coupled with low textures due to low VRAM.
2. A sharp, crisp native image with the highest textures or texture mods and plenty of VRAM.
I know what I'll choose. DLSS makes the image too soft, even with manual sharpening. DLAA is great and frame generation has its uses, but I have very little good to say about DLSS, having tested it in multiple games. I always end up turning it off. It does not look "better than native" to me like some people claim, and how could it? No matter how much you sugarcoat it, it's still an upscaler.
Again, people with high demands used more than a single GPU back then.
And the associated problems that came with it.
AMD is losing market share quarter after quarter and is around 10% now. They have nothing, really. Nvidia dominates all GPU markets with ease, and you can ramble all day long about VRAM; it does not change reality. Nvidia beats AMD while using less VRAM, and Nvidia has superior features as well: RT that actually works, etc.
AMD's RT does not work? So the 7900 having roughly 3090-level RT performance is now "not working"?
I'll take a fast GPU with less VRAM over a slow GPU with more VRAM any day of the week. Luckily I don't have to choose, because I use a 4090 and have been since September 2022.
Your purchase does not match your words. If you truly believed that, you would be using an 8GB card from the 4000 series with DLSS and running your glorious RT on that.
Like I said, AMD seems able to produce their cards at the same price, and their CPUs at even lower prices than before. Chip production does not get universally more expensive; just Nvidia's margins get bigger and bigger.

People seem to have bought nicely into the idea of ever-increasing prices, and you deserve whatever is coming your way.
 
Did not even bother to read your crap. You already lost.
You're another clueless Nvidia fanboy who can't back up their arguments. The moment someone refutes your clueless arguments with facts, you can't even read it. Truth really does hurt.

Oh I "lost" to a fanboy on the internet. So sad now. /s
 
I don't need to back up anything. I just need to look at AMD's miserable GPU market share to know Nvidia is the way to go. Nvidia is the king of GPUs for a reason. AMD is a CPU company; no wonder they can't compete when it comes to GPUs.

AMD GPU performance is miserable in pretty much all new games.




Be careful. Reality hits hard.
 
AMD drivers and support are horrible compared to Nvidia's. Features are worse all over. FSR is so bad it is barely worth using compared to DLSS and DLAA: a blur-fest with artifacts all over, and FSR doesn't work well at 1080p and 1440p, which are the most widely used resolutions among PC gamers.

I can say this for sure since I used a 6800 XT before getting a 4090. I have had zero issues in any game with my 4090, whereas on my AMD GPU tons of games had weird artifacts, crashes, etc.

Why do people think AMD is cheaper?


Here the Techspot guy says he loves Nvidia cards. Why do you think? Because they are simply better. AMD is the cheap brand for a reason: worse in all aspects, with a lower price tag to follow.

Even with AMD GPUs being cheaper, they are not selling. AMD really needs to dig deep next gen to gain market share again. Let's see.
 
AMD drivers and support are horrible compared to Nvidia's.
I see this said all the time, but no one seems able to explain to me how their drivers and support are horrible. Reviewers have AMD cards in their systems.
Linus and others have run AMD cards as their daily drivers as a test and found no major problems. I think Linus mentioned that Nvidia was better in VR, but not much else. AMD releases drivers supporting new games on time, their control panel has been far better than Nvidia's for a long time, etc.
Features are worse all over.
Some features are worse, but they were also introduced later.
FSR is so bad it is barely worth using compared to DLSS and DLAA: a blur-fest with artifacts all over, and FSR doesn't work well at 1080p and 1440p, which are the most widely used resolutions among PC gamers.
FSR is an upscaler; DLAA is anti-aliasing. Different things.
DLSS also suffers at 1080p and 1440p. Not to the same degree as FSR, but it does.
I can say this for sure since I used a 6800 XT before getting a 4090. I have had zero issues in any game with my 4090, whereas on my AMD GPU tons of games had weird artifacts, crashes, etc.
Open the Nvidia forums and you'll see plenty of people complaining about Nvidia drivers too.
No company is free from driver problems and never will be. I've had cards from both; currently on Nvidia. I've had small problems with both, but nothing major.
Worse in all aspects, with a lower price tag to follow.
All aspects? Your own sentence contradicts that statement, because if it's cheaper, then that's one aspect where it's better. If you had said worse in marketing, initial pricing or upscaler quality, I would agree with you.
Even with AMD GPUs being cheaper, they are not selling. AMD really needs to dig deep next gen to gain market share again. Let's see.
AMD not selling has less to do with price and more to do with their non-existent or weird marketing, plus not having some of their features on the same level as Nvidia's.
Trust is built over many generations. AMD needs to execute well on all aspects for two or maybe three generations before its market share starts growing, like they did with Zen.
 
AMD users complain in game forums all the time.
Example: https://steamcommunity.com/app/594650/discussions/8/6462188749584990994/

"It took Crytek almost 2 months to fix the rain on AMD last time, so I believe we will be waiting a longggggggg time for this one to get fixed."

DLAA is DLSS running at native res, at 100% scaling. Sharpening and AA are built in, which is why DLAA beats native res with ease, and the reason DLSS can beat native as well in many aspects, all while boosting performance for free by 75% on average.
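To make the relationship concrete: DLSS quality modes are just scale factors applied to the output resolution, and DLAA is the 100% case. A small sketch using the commonly published factors:

```python
# DLSS presets as render-scale factors (the commonly published values);
# DLAA is simply the 100% case of the same pipeline.
MODES = {"DLAA": 1.0, "Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

def input_resolution(out_w, out_h, mode):
    """Internal render resolution for a given output size and DLSS mode."""
    scale = MODES[mode]
    return round(out_w * scale), round(out_h * scale)

for mode in MODES:
    print(f"{mode:12s} -> {input_resolution(3840, 2160, mode)}")
# DLAA renders at the full 3840x2160; Performance renders at 1920x1080
# and reconstructs, which is where the speedup comes from.
```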

DLSS works far better at low resolutions than FSR does. FSR mostly works "fine" for 4K/UHD, but DLSS still wins there as well.

Also, DLSS is in like 600+ games now, and the DLL file can easily be swapped out. With FSR you are stuck with the version the developers put in.
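The swap itself is just a file replacement. A minimal sketch with hypothetical paths (some games keep the DLL in a subfolder, and anti-cheat or file-integrity checks can undo or block the swap):

```python
import shutil
from pathlib import Path

# Hypothetical paths: point these at a real install and a newer DLSS DLL.
game_dir = Path(r"C:\Games\SomeGame")            # assumed install location
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")   # newer DLL you provide

target = game_dir / "nvngx_dlss.dll"             # the file DLSS games ship with
shutil.copy2(target, target.with_suffix(".bak")) # keep the original for rollback
shutil.copy2(new_dll, target)                    # drop in the newer version
print("Swapped. Restore the .bak file to roll back.")
```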

I don't hate AMD - I own an AMD CPU - and I would love AMD to compete in GPUs again; it would be better for all of us. But I have tried tons of AMD cards over the last 20 years and it is very obvious that Nvidia generally works better, especially once you leave the most popular games that get benchmarked in reviews. For emulation, which I do a lot, AMD GPUs are often really, really bad.

Back when Nvidia went RTX, AMD simply fell off the wagon. They have been playing catch-up on features ever since RTX came out, but sadly can't match Nvidia due to much lower research and software development funds.

Nvidia just keeps gaining market share, and AMD should really go aggressive with pricing on the Radeon 8000 series or it won't matter anyway. If Radeon 8000 fails, I could see AMD leaving the mid-end market as well. They won't keep wasting money chasing markets with thin margins; if I remember correctly, AMD is down almost 75% on gaming GPU sales YoY.

That is a huge drop. The 7000 series has really been a disaster for AMD.

The Radeon 6000 series did better vs Nvidia, mostly because Nvidia used a cheap Samsung node. However, the RTX 3000 series sold waaaay better, and Nvidia even had much higher margins due to using Samsung instead of TSMC.

Nvidia going back to TSMC with the 4000 series is when AMD was left at the station. Huge efficiency gains were made and clock speeds improved massively.

The difference between the 3090 and 4090 is massive, one of the biggest gen-to-gen leaps ever seen.
 
DLAA is DLSS running at native res, at 100% scaling. Sharpening and AA are built in, which is why DLAA beats native res with ease, and the reason DLSS can beat native as well in many aspects, all while boosting performance for free by 75% on average.
DLAA should be compared to FSR Native AA then. Most people don't even know such a thing exists - no marketing from AMD again. Nothing is really "free", though. That performance has to come from somewhere, and it comes from an extremely low input resolution. AI is good, but it's not without its faults or artifacts.
DLSS works far better at low resolutions than FSR does. FSR mostly works "fine" for 4K/UHD, but DLSS still wins there as well.
Yes I agree on that.
Also, DLSS is in like 600+ games now, and the DLL file can easily be swapped out. With FSR you are stuck with the version the developers put in.
FSR 3.1 changed that. Now it's possible to swap DLL files like people have been doing with DLSS for years. Of course, this only helps newer games that ship with 3.1 support.
The upscaler and Frame Generation are now decoupled too, so it's possible to mix and match.
For emulation, which I do a lot, AMD GPUs are often really, really bad.
On Linux, AMD has been better than Nvidia too. There are niche use cases where one is better than the other; I mentioned VR earlier, where Nvidia is better.
 

I know FSR Native AA and have used it, but support is very limited, which is sad to see.
I like AMD's CAS sharpening too, though it's mostly just sharpening and lacks the AA aspect.

I use DLAA in all DLSS games personally. Minor perf decrease, around 5%, but visuals are vastly better. It's pretty much like 4x/8x MSAA, just with a small perf hit and sharpening on top, which can be adjusted.
 