Nvidia might discontinue the RTX 4080 in favor of a 20GB RTX 4080 Super

midian182

Staff member
Rumor mill: More rumors have arrived relating to Nvidia's alleged Super versions of its RTX 4000 series. The latest claim is that Team Green is not only planning an RTX 4080 Super, but it will also discontinue the current RTX 4080 to make way for the new variant.

The new Super allegation comes from Chinese tech site IT Home. As with all rumors, a healthy dose of salt is recommended with this one.

It's claimed that retailers in China are stocking up on the RTX 4080 ahead of it being replaced by the Super version. Manufacturers are also said to be increasing shipments, but many still expect to see the card's price go up and its availability decline if it is killed off by Nvidia, especially if the RTX 4080 Super proves to be a lot more expensive than the current $1,199 (MSRP) vanilla version.

We've also heard some rumors about what sorts of specs the RTX 4080 Super will be packing. Benchlife claims that board partners revealed the card will use the AD102 GPU and come with 20GB of memory. That's a big jump over the current AD103-powered RTX 4080's 16GB and would be a welcome upgrade when more games are demanding increasing amounts of VRAM. It's also likely that the Super variant will see the 256-bit memory bus increase to 320-bit.
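For a rough sense of what that rumored wider bus would mean on paper, here's a back-of-the-envelope sketch (our numbers, not Nvidia's: the 22.4 Gbps GDDR6X data rate matches the current RTX 4080 and is simply assumed to carry over to the Super):

```python
# Theoretical GDDR6X bandwidth scales linearly with bus width at a fixed data rate.
# The 22.4 Gbps figure is the current RTX 4080's memory speed; the 320-bit
# configuration is the rumored 4080 Super spec, not a confirmed one.

def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = (bus width in bytes) * per-pin data rate."""
    return (bus_width_bits / 8) * data_rate_gbps

current = bandwidth_gbps(256, 22.4)  # RTX 4080 today: ~716.8 GB/s
rumored = bandwidth_gbps(320, 22.4)  # rumored 320-bit Super: ~896.0 GB/s

print(f"RTX 4080 (256-bit): {current:.1f} GB/s")
print(f"Rumored Super (320-bit): {rumored:.1f} GB/s (+{rumored / current - 1:.0%})")
```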

Those specs would represent an impressive performance uplift for the RTX 4080 Super compared to the RTX 4080, but not everyone thinks this will be the case. Prolific and usually accurate leaker kopite7kimi believes the differences between the two cards will mimic those of the RTX 2080 and RTX 2080 Super, meaning the new variant might come with the full AD103 GPU, introducing more SMs and CUDA cores along with increased boost clocks, thereby offering only a slight improvement.

Last week, we heard from two leakers that Nvidia is planning an RTX 4080 Super, RTX 4070 Super, and RTX 4070 Ti Super. That last one sounds unlikely, but Nvidia isn't afraid to release multiple versions of the same card and use names likely to confuse everyday consumers.

To reiterate, these are all still rumors, but Nvidia isn't expected to release the next-gen RTX 5000 series until 2025, so a refresh does look likely.


 
The only good MAYBE is that we might see the plain 4080's price drop below $1,000. That means another big MAYBE: in Europe we might see a drop below €1,100-€1,500. Still not a good selling point, of course, but better! Did I say MAYBE?
 
The only good MAYBE is that we might see the plain 4080's price drop below $1,000. That means another big MAYBE: in Europe we might see a drop below €1,100-€1,500. Still not a good selling point, of course, but better! Did I say MAYBE?
I doubt it; the 4080 is selling so horribly that they'll probably discontinue production of it altogether. They'll release a 4080 Super with a 4090 MSRP and release a 4070 Ti Super for about $100 less than a 4080. There are credible rumors going around right now of nVidia telling board partners that they aren't getting any more 4090s for the foreseeable future. I hate the 4090, not because it's a bad card but because it's priced poorly.
 
If the current 4080 does get discontinued the price might drop $100 or so, but I wouldn't expect them to ever go under $1k.

The gap between the 4080 and 4090, as it stands, is upwards of a 30% difference in performance, depending on the application. I'm sure the Super model will cut that gap in half and come into the fray at $1,399.
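To put rough numbers on that guess (all hypothetical except the 4090's $1,599 launch MSRP; performance is normalized to the 4080):

```python
# Speculative price-per-performance math using the gap and price guessed above.
cards = {
    "RTX 4080":       {"price": 1199, "perf": 1.00},
    "RTX 4080 Super": {"price": 1399, "perf": 1.15},  # assumes half the ~30% gap is closed
    "RTX 4090":       {"price": 1599, "perf": 1.30},
}

for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['perf']:.0f} per unit of 4080-level performance")
```

By that math, a $1,399 Super would barely move the dollars-per-frame needle at all.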

Hopefully, though it's very doubtful, any Super additions to Nvidia's currently available models go unpurchased and Nvidia loses its arse. I also hope folks stop paying these outlandish prices and force the industry to ease up on the over-the-barrel pricing it has been doing the past 2-3 years.
 
The only reason to discontinue the 4080 is if the 4080 Super is using the AD103. Since AD103 is already almost fully utilized in the 4080, there might not be enough separation to keep both. But I'm not sure that a few more cores and a few more GB of VRAM are enough to fix the 4080's value problem, or to address the large gap to the 4090.
 
The 4070 TI Super makes sense (bad name aside).

Nvidia can make it faster than the 7900XT, give it 16 GB, and charge $800.

This fixes the VRAM issue of the 4070 TI and steals the higher end bang for your buck crown from AMD without changing their product pricing much. (4080 Super takes the $1200 price point, the 4080 drops to $1000, and the 4070 TI to $700).

Value improves enough to keep sales coming for another year before the RTX 5000s.
 
The 4070 TI Super makes sense (bad name aside).

Nvidia can make it faster than the 7900XT, give it 16 GB, and charge $800.

This fixes the VRAM issue of the 4070 TI and steals the higher end bang for your buck crown from AMD without changing their product pricing much. (4080 Super takes the $1200 price point, the 4080 drops to $1000, and the 4070 TI to $700).

Value improves enough to keep sales coming for another year before the RTX 5000s.
The 4070 Ti is already faster than the 7900XTX at max settings in 4K (RT + raster).
The 7900XT is only about 5% faster than the 4070 Ti in raster.

Nvidia launching this is likely a Jensen ego issue. He wants that 5% raster win too. Same for the 16GB of VRAM.
 
That's a big jump over the current AD103-powered RTX 4080's 16GB and would be a welcome upgrade when more games are demanding increasing amounts of VRAM.

Per a recent Techspot article, https://www.techspot.com/article/2670-vram-use-games/, very few games are using over 12GB of VRAM, even at 4K resolutions and all the bells and whistles turned on. The AD102 die and higher bandwidth memory bus are going to be the primary reasons for better performance. I doubt the extra 4G is going to buy you much at all in terms of more FPS.

If Nvidia were smart, and I have serious doubts about that, they would fire-sale the current 4080 for about $900 and bring the Super in at $1100-1200. AMD could be in a pickle if Nvidia prices the Super models right.


 
Per a recent Techspot article, https://www.techspot.com/article/2670-vram-use-games/, very few games are using over 12GB of VRAM, even at 4K resolutions and all the bells and whistles turned on. The AD102 die and higher bandwidth memory bus are going to be the primary reasons for better performance. I doubt the extra 4G is going to buy you much at all in terms of more FPS.

If Nvidia were smart, and I have serious doubts about that, they would fire-sale the current 4080 for about $900 and bring the Super in at $1100-1200. AMD could be in a pickle if Nvidia prices the Super models right.
People don't want massive amounts of VRAM because games are using it; people want massive amounts of VRAM on their graphics cards because they don't want a $1,000+ piece of tech to be obsolete before the end of its generation. I expect to pay about $500/yr for my PC, with $1,000 of that being the graphics card. I want to get at least 4 years out of that $1,000 graphics card. I bought a 6700XT in March to replace my dead 1070ti, fully expecting to replace it in about a year. I'm waiting for the RDNA refresh to buy whatever AMD releases as the 7950XT/XTX because I'm tired of being bullied by nVidia after being a customer of theirs for nearly 20 years.
 
Per a recent Techspot article, https://www.techspot.com/article/2670-vram-use-games/, very few games are using over 12GB of VRAM, even at 4K resolutions and all the bells and whistles turned on. The AD102 die and higher bandwidth memory bus are going to be the primary reasons for better performance. I doubt the extra 4G is going to buy you much at all in terms of more FPS.
True, BUT that is today.

With 1) consoles having a combo RAM/VRAM of 16GB, 2) new VRAM-intensive RT being created, and 3) ports to PC often being subpar, the likelihood that more games will hit 12.5GB in the near future is high.

Sure, few games will likely run differently with 14 GB versus 16 GB, but 14 isn't an option. So it's either 12 GB, knowing that at least some games will be gimped, or 16 GB, knowing that it's plenty until the 6000 series.
 
True, BUT that is today.

With 1) consoles having a combo RAM/VRAM of 16GB, 2) new VRAM-intensive RT being created, and 3) ports to PC often being subpar, the likelihood that more games will hit 12.5GB in the near future is high.

Sure, few games will likely run differently with 14 GB versus 16 GB, but 14 isn't an option. So it's either 12 GB, knowing that at least some games will be gimped, or 16 GB, knowing that it's plenty until the 6000 series.
I highly doubt that many games will break the 16G barrier within the next 3-4 years. They are just now going over 12G, and only in a handful of games. The question is, do you want to spend the extra money for 20G, which (for gaming) you may never use, or get a GPU with 16G that costs $100-150 less? Heck, why not 24G or 32G? Sometimes I think "future proofing" is more wasteful, financially speaking, than just doing upgrades when you need them.
 
People don't want massive amounts of VRAM because games are using it; people want massive amounts of VRAM on their graphics cards because they don't want a $1,000+ piece of tech to be obsolete before the end of its generation. I expect to pay about $500/yr for my PC, with $1,000 of that being the graphics card. I want to get at least 4 years out of that $1,000 graphics card. I bought a 6700XT in March to replace my dead 1070ti, fully expecting to replace it in about a year. I'm waiting for the RDNA refresh to buy whatever AMD releases as the 7950XT/XTX because I'm tired of being bullied by nVidia after being a customer of theirs for nearly 20 years.
My point is a 16G GPU isn't going to be obsolete in 3-4 years. If for no other reason than developers will eventually wake up to the fact that people don't want to spend $1000+ on GPUs just to play a game. I've also read that there are other technologies that are being looked at like texture compression that will reduce VRAM usage (maybe).

The storage industry went through this several years ago. When hard drives were cheap, companies were buying 100s of terabytes of storage. But, very quickly they learned that there was a cost to this, and storage companies started developing technologies for de-duplication and compression that greatly reduced the storage footprint in the datacenter.

In other words, gaming companies that continue to push high VRAM usage will be selling to a subset of the total market and I'm certain that is not what they want. If the 4080 Super was only the 4080 w/4G more of VRAM, I'd pass and wait for the 16G model to drop in price.
 
My point is a 16G GPU isn't going to be obsolete in 3-4 years. If for no other reason than developers will eventually wake up to the fact that people don't want to spend $1000+ on GPUs just to play a game. I've also read that there are other technologies that are being looked at like texture compression that will reduce VRAM usage (maybe).

The storage industry went through this several years ago. When hard drives were cheap, companies were buying 100s of terabytes of storage. But, very quickly they learned that there was a cost to this, and storage companies started developing technologies for de-duplication and compression that greatly reduced the storage footprint in the datacenter.

In other words, gaming companies that continue to push high VRAM usage will be selling to a subset of the total market and I'm certain that is not what they want. If the 4080 Super was only the 4080 w/4G more of VRAM, I'd pass and wait for the 16G model to drop in price.
The 4070 is already obsolete by those standards and it costs $600. Oh, but it has DLSS, so I guess that adds $100 to the price of that card. Not even all games support or will support DLSS. Further, going on die size (and memory bus width), the 4090 is really a 4080, the 4080 is really a 4070, the 4070 is really a 4060, and the 4060 is really a 4050.

While nVidia certainly does have the high end (the 4080 isn't selling), AMD is beating nVidia pretty badly dollar-for-dollar at just about every price point. Not in sales, mind you, but in performance. Everyone points to ray tracing and DLSS immediately, but there are plenty of games that don't use ray tracing, and there are games that use ray tracing where you get no benefit from it. I also vowed to never buy another nVidia card after I saw that they're releasing a new version of DLSS with every generation of graphics card, so unless you plan to upgrade to a 50 series card as soon as it is released, DLSS should just be "nice to have" and not a selling point. I was also pretty salty about my 1070ti failing, especially when I looked into it and saw that 10 series cards have an issue with the memory controller causing them to fail.
 
If it stays on AD103, the 512 additional shaders would give a 5-6% boost, and 4 additional GB of VRAM would be virtually pointless in 99% of titles unless it increases the memory bus to 320-bit, which would end up being the biggest improvement for the GPU. The 4080, I believe, can occasionally be held back by its 256-bit bus restricting the bandwidth. The improvements would put the 4080 Super in a better position against the 7900 XTX.
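Quick sanity check on that 5-6% figure, using the commonly reported core counts (treat them as assumptions, since nothing here is confirmed):

```python
# Full AD103 vs. the cut-down AD103 in today's RTX 4080.
rtx_4080_cores = 9728     # CUDA cores enabled on the current RTX 4080
full_ad103_cores = 10240  # CUDA cores on a fully enabled AD103

extra = full_ad103_cores - rtx_4080_cores
print(f"Extra shaders: {extra} ({extra / rtx_4080_cores:.1%} more)")  # 512 (~5.3%)
```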

By the way, this makes more sense as I'm fairly sure that Nvidia will release a 4080 Ti, but they're going to want at least $1400 for it.
 
I highly doubt that many games will break the 16G barrier within the next 3-4 years. They are just now going over 12G, and only in a handful of games. The question is, do you want to spend the extra money for 20G, which (for gaming) you may never use, or get a GPU with 16G that costs $100-150 less? Heck, why not 24G or 32G? Sometimes I think "future proofing" is more wasteful, financially speaking, than just doing upgrades when you need them.
You should actually read my post.

I explicitly make the point of 12 GB vs. 16 GB and that 16 GB will be good for years.
Then you reply by talking about not needing more than 16 GB and why pay for 20 GB when 16 will be good for years.

I'm not sure who you are arguing with, but it isn't me.
 
The 4070 is already obsolete by those standards and it costs $600. Oh, but it has DLSS, so I guess that adds $100 to the price of that card. Not even all games support or will support DLSS. Further, going on die size (and memory bus width), the 4090 is really a 4080, the 4080 is really a 4070, the 4070 is really a 4060, and the 4060 is really a 4050.

While nVidia certainly does have the high end (the 4080 isn't selling), AMD is beating nVidia pretty badly dollar-for-dollar at just about every price point. Not in sales, mind you, but in performance. Everyone points to ray tracing and DLSS immediately, but there are plenty of games that don't use ray tracing, and there are games that use ray tracing where you get no benefit from it. I also vowed to never buy another nVidia card after I saw that they're releasing a new version of DLSS with every generation of graphics card, so unless you plan to upgrade to a 50 series card as soon as it is released, DLSS should just be "nice to have" and not a selling point. I was also pretty salty about my 1070ti failing, especially when I looked into it and saw that 10 series cards have an issue with the memory controller causing them to fail.
The 4070 isn't obsolete because it only has 12G of VRAM. 12G is enough to play most games today at 1080 or 1440. I would say the performance is not limited by its VRAM, and this is proven by the fact that the 4060Ti, with 16G, does not outperform the 4070 or even come close. This is consistent with the last statement in my post: if the 4080 Super is just adding more VRAM, it's not going to give you any real performance improvements, just like the 4060Ti didn't outperform the 4070. The 4070 has a faster and wider memory bus along with more cores. Whether the 4070 is a good value or the right card to buy is a different discussion, but as of today, 12G will play just fine in most games. Per Techspot, the 4070 gets 175 fps at 1080 and 126 at 1440. It even does a respectable 69 fps at 4K, so I don't understand how you come to any conclusion that the GPU is obsolete.

As for AMD vs. Nvidia, the 4070 Ti is within 5% of the 7900XT at the same price, and when you turn on RT it's a lot closer, if not faster in some games. I'd hardly call that getting beaten badly, especially when AMD had to drop prices to compete. That extra 8G of VRAM didn't do much for the 7900XT, did it? Same for the XTX. Compared to the 4080, they are pretty much on par across the board, sans RT. The 4080 sells for just under $1,100 ($1,089), whereas the 7900XTX sells for just under $1,000 ($975 or so). So, about a $100 difference for the same non-RT performance. Again, the extra 8G of VRAM didn't seem to do much to boost performance.

I also don't know where you got the notion that Nvidia is releasing new DLSS with every new GPU generation. DLSS 3.5 will work with 30 and 20 series RTX GPUs so that statement is patently false. Will you now go back to Nvidia since they don't require a new GPU for DLSS?

https://www.techspot.com/review/2663-nvidia-geforce-rtx-4070/
 
You should actually read my post.

I explicitly make the point of 12 GB vs. 16 GB and that 16 GB will be good for years.
Then you reply by talking about not needing more than 16 GB and why pay for 20 GB when 16 will be good for years.

I'm not sure who you are arguing with, but it isn't me.
Not trying to argue, but pointing out that 12G will be fine for a couple more years if you play at 1440 or 1080. 16G is a good option and trying to sell the Super model based strictly on the increased VRAM is a bit of a fool's game.
 
The 4070 isn't obsolete because it only has 12G of VRAM. 12G is enough to play most games today at 1080 or 1440. I would say the performance is not limited by its VRAM, and this is proven by the fact that the 4060Ti, with 16G, does not outperform the 4070 or even come close. This is consistent with the last statement in my post: if the 4080 Super is just adding more VRAM, it's not going to give you any real performance improvements, just like the 4060Ti didn't outperform the 4070. The 4070 has a faster and wider memory bus along with more cores. Whether the 4070 is a good value or the right card to buy is a different discussion, but as of today, 12G will play just fine in most games. Per Techspot, the 4070 gets 175 fps at 1080 and 126 at 1440. It even does a respectable 69 fps at 4K, so I don't understand how you come to any conclusion that the GPU is obsolete.

As for AMD vs. Nvidia, the 4070 Ti is within 5% of the 7900XT at the same price, and when you turn on RT it's a lot closer, if not faster in some games. I'd hardly call that getting beaten badly, especially when AMD had to drop prices to compete. That extra 8G of VRAM didn't do much for the 7900XT, did it? Same for the XTX. Compared to the 4080, they are pretty much on par across the board, sans RT. The 4080 sells for just under $1,100 ($1,089), whereas the 7900XTX sells for just under $1,000 ($975 or so). So, about a $100 difference for the same non-RT performance. Again, the extra 8G of VRAM didn't seem to do much to boost performance.

I also don't know where you got the notion that Nvidia is releasing new DLSS with every new GPU generation. DLSS 3.5 will work with 30 and 20 series RTX GPUs so that statement is patently false. Will you now go back to Nvidia since they don't require a new GPU for DLSS?

https://www.techspot.com/review/2663-nvidia-geforce-rtx-4070/
I'm so sick of the "they had to drop the price to compete" argument. AMD has two cards, the XTX and the XT. The XT is meant to sell the XTX, and its price is just there to make the XTX look good. The XT cards have a cut-down die that is not suitable for the full performance they wanted out of the 7900XTX. Then they build a price drop into it too. AMD was never dumb enough to think the 7900XT was actually a $900 card, but if people were dumb enough to pay it, so be it.

But here's the thing: the 7700XT has 12 gigs of VRAM on it for $450. I, again, think the 7700XT has a price drop built into it, but it's actually selling pretty well at $450, so they haven't had to drop the price. I think that AMD is waiting to move all the 6700XT units before they move the price of the card closer to $400, but I don't think it'll go below $370.

But here is the thing: 12 gig cards work great for 1080p and 1440p. However, people paying $700+ for a graphics card don't want to play games at 1080. I've been gaming at 4K60 since 2017. Granted, I have to adjust some of those settings to get playable frame rates, but I'm not going to pay $700 for a graphics card that can barely do 4K better than a 6700XT once it hits a memory bottleneck. And forget ray tracing. Sure, I can use DLSS, but until when? DLSS stops working in new AAA titles once the 5000 series hits? The 4070 is, at best, a 2-year card, and the only reason it is a 2-year card is that nVidia put 12 gigs of VRAM on it as a form of planned obsolescence.

I'm okay with replacing the 6700XT after only a year; it was always just a stopgap card. However, when I go to spend real money on a card, like the 7950XTX, I expect to get several years out of it. Something that infuriates me about nVidia is that people aren't going to get several years out of the 4070 or 4080. They'll release new proprietary tech that only works on later generations, and those owners will be left behind. 4090 owners will be able to just buy 5090s, but there is no value in the other cards. The very tech that makes them work will stop working once nVidia releases new cards with new AI tech and new proprietary software that needs specialized hardware on the GPU to function.

AMD's solutions aren't as good, but at least they work on everything. You're giving up quality for compatibility. Looking at nVidia's track record for compatibility, that's a sacrifice I'm willing to make.
 
I'm so sick of the "they had to drop the price to compete" argument. AMD has two cards, the XTX and the XT. The XT is meant to sell the XTX, and its price is just there to make the XTX look good. The XT cards have a cut-down die that is not suitable for the full performance they wanted out of the 7900XTX. Then they build a price drop into it too. AMD was never dumb enough to think the 7900XT was actually a $900 card, but if people were dumb enough to pay it, so be it.

But here's the thing: the 7700XT has 12 gigs of VRAM on it for $450. I, again, think the 7700XT has a price drop built into it, but it's actually selling pretty well at $450, so they haven't had to drop the price. I think that AMD is waiting to move all the 6700XT units before they move the price of the card closer to $400, but I don't think it'll go below $370.

But here is the thing: 12 gig cards work great for 1080p and 1440p. However, people paying $700+ for a graphics card don't want to play games at 1080. I've been gaming at 4K60 since 2017. Granted, I have to adjust some of those settings to get playable frame rates, but I'm not going to pay $700 for a graphics card that can barely do 4K better than a 6700XT once it hits a memory bottleneck. And forget ray tracing. Sure, I can use DLSS, but until when? DLSS stops working in new AAA titles once the 5000 series hits? The 4070 is, at best, a 2-year card, and the only reason it is a 2-year card is that nVidia put 12 gigs of VRAM on it as a form of planned obsolescence.

I'm okay with replacing the 6700XT after only a year; it was always just a stopgap card. However, when I go to spend real money on a card, like the 7950XTX, I expect to get several years out of it. Something that infuriates me about nVidia is that people aren't going to get several years out of the 4070 or 4080. They'll release new proprietary tech that only works on later generations, and those owners will be left behind. 4090 owners will be able to just buy 5090s, but there is no value in the other cards. The very tech that makes them work will stop working once nVidia releases new cards with new AI tech and new proprietary software that needs specialized hardware on the GPU to function.

AMD's solutions aren't as good, but at least they work on everything. You're giving up quality for compatibility. Looking at nVidia's track record for compatibility, that's a sacrifice I'm willing to make.
I agree; I find it hard to believe that DLSS will be software with long-term support that you'll still be seeing as we go further down the road.

DLSS comes out, but only supported by the 2xxx series and up.
DLSS 3 comes out, but only supported by the 4xxx series.
DLSS 3.5 comes out, supported by all RTX models.

What happens when the 5xxx series comes out? Will there be another iteration of DLSS that only they support? When these cards come out, will developers still be trying to use DLSS, or will they have moved on to only using DLSS 3?

Where does it stop?
 
I'm so sick of the "they had to drop the price to compete" argument. AMD has two cards, the XTX and the XT. The XT is meant to sell the XTX, and its price is just there to make the XTX look good. The XT cards have a cut-down die that is not suitable for the full performance they wanted out of the 7900XTX. Then they build a price drop into it too. AMD was never dumb enough to think the 7900XT was actually a $900 card, but if people were dumb enough to pay it, so be it.

But here's the thing: the 7700XT has 12 gigs of VRAM on it for $450. I, again, think the 7700XT has a price drop built into it, but it's actually selling pretty well at $450, so they haven't had to drop the price. I think that AMD is waiting to move all the 6700XT units before they move the price of the card closer to $400, but I don't think it'll go below $370.

But here is the thing: 12 gig cards work great for 1080p and 1440p. However, people paying $700+ for a graphics card don't want to play games at 1080. I've been gaming at 4K60 since 2017. Granted, I have to adjust some of those settings to get playable frame rates, but I'm not going to pay $700 for a graphics card that can barely do 4K better than a 6700XT once it hits a memory bottleneck. And forget ray tracing. Sure, I can use DLSS, but until when? DLSS stops working in new AAA titles once the 5000 series hits? The 4070 is, at best, a 2-year card, and the only reason it is a 2-year card is that nVidia put 12 gigs of VRAM on it as a form of planned obsolescence.

I'm okay with replacing the 6700XT after only a year; it was always just a stopgap card. However, when I go to spend real money on a card, like the 7950XTX, I expect to get several years out of it. Something that infuriates me about nVidia is that people aren't going to get several years out of the 4070 or 4080. They'll release new proprietary tech that only works on later generations, and those owners will be left behind. 4090 owners will be able to just buy 5090s, but there is no value in the other cards. The very tech that makes them work will stop working once nVidia releases new cards with new AI tech and new proprietary software that needs specialized hardware on the GPU to function.

AMD's solutions aren't as good, but at least they work on everything. You're giving up quality for compatibility. Looking at nVidia's track record for compatibility, that's a sacrifice I'm willing to make.
Well, you can be "sick of it," but it doesn't change the fact that it's true. The supposition that the XT is meant to sell the XTX is a ridiculous notion. No one in product development is going to spend the money to develop a product like the XT not expecting it to sell. Do you even realize how much it costs to bring such a product to market? Why would anyone do that if they expected the product to never sell? The XT was meant to give users an option to get a little less performance for a little less money, and they had to reduce the price, within 3 months of launch no less, because it wasn't selling. If it was meant to sell XTX models, then why drop the price? Oh yeah, they had to drop the price of the XTX as well, because it wasn't selling at $1,000. If they had a price drop "built in" as you claim, isn't that gouging customers right out of the gate? And you want to do business with that company?

Regarding gaming resolutions, Steam disagrees with you. 61% of gamers today, on Steam, play at 1080. Another 16% play at 1440 and 4K represents less than 4% of all Steam gamers. So, while people may want to play at 4K, not very many people are.

As for longevity of a GPU, given that the 1650 and 1060 are the number 2 and 3 GPUs used by Steam Players, I'd say you have a good shot at getting 3-5 years out of a 4070 or 4080. 12G will be fine at 1080 and 1440 for at least a couple of more years and 16G will be more than fine. Very likely it won't be VRAM that forces people to upgrade but rather the actual performance of the card which is more about core count and memory performance than quantity of VRAM. And I don't know why you would spend $300+ for a GPU that you would only keep for a year. That seems wasteful (monetarily speaking) to me.

You keep implying that Nvidia will not support DLSS going forward, but that flies in the face of what's actually happening with DLSS 3.5. No one should ever buy a GPU based on what "might" happen in the future. AMD might stop supporting FSR on older cards or non-AMD cards when their new series comes out. A lot of things could happen, but I would never buy anyone's GPU based on unconfirmed speculation. Technology advances and it's not always possible to support certain things on older hardware. I would not want my new products to be handicapped by forcing backwards compatibility with every release, otherwise, your competitors will eat your lunch.

I don't know what you're referring to as far as compatibility, but every Nvidia GPU I've had was compatible with the games I play. I think their track record is fine.
 
Well, you can be "sick of it," but it doesn't change the fact that it's true. The supposition that the XT is meant to sell the XTX is a ridiculous notion. No one in product development is going to spend the money to develop a product like the XT not expecting it to sell. Do you even realize how much it costs to bring such a product to market?
What don't people understand? Part of the die is literally defective. They didn't design a separate 7900XT; a portion of the silicon is literally defective, and it's cheaper to sell it as a lesser model than it is to just scrap the whole chip because some of it isn't working. They've been doing this for decades as a way to recoup costs but also market higher-end models. The 4080 Super is just going to be a defective AD102 die (i.e., a 4090) with 20 gigs of RAM.
You keep implying that Nvidia will not support DLSS going forward, but that flies in the face of what's actually happening with DLSS 3.5. No one should ever buy a GPU based on what "might" happen in the future. AMD might stop supporting FSR on older cards or non-AMD cards when their new series comes out. A lot of things could happen, but I would never buy anyone's GPU based on unconfirmed speculation. Technology advances and it's not always possible to support certain things on older hardware. I would not want my new products to be handicapped by forcing backwards compatibility with every release, otherwise, your competitors will eat your lunch.

I don't know what you're referring to as far as compatibility, but every Nvidia GPU I've had was compatible with the games I play. I think their track record is fine.
nVidia has dropped DLSS support for the 20 series; they have a history of doing this crap to consumers. Well, they didn't really "drop" it, more like they made a new version of DLSS that the 20 series doesn't have the instruction sets to run. The 30 series cannot run the full DLSS 3.5. The 30 series likely won't be able to run DLSS 4.0 when the 50 series is released. So you can say "well, AMD might do this in the future," but AMD has a history of creating open standards and making them free for everyone, while nVidia has a history of creating proprietary tech and only supporting it for 1 or 2 generations. If you look at G-Sync, it's nothing more than a marketing term now and is actually just FreeSync with an nVidia badge on it.

As for longevity of a GPU, given that the 1650 and 1060 are the number 2 and 3 GPUs used by Steam Players, I'd say you have a good shot at getting 3-5 years out of a 4070 or 4080. 12G will be fine at 1080 and 1440 for at least a couple of more years and 16G will be more than fine. Very likely it won't be VRAM that forces people to upgrade but rather the actual performance of the card which is more about core count and memory performance than quantity of VRAM. And I don't know why you would spend $300+ for a GPU that you would only keep for a year. That seems wasteful (monetarily speaking) to me.
Because my 1070ti died and we were in the middle of a GPU shortage. The 7000 series wasn't out yet and I wasn't buying a 4090. I also explained previously, when I build a PC I expect to pay about $500/yr for it, and I like to keep my PCs for at least 4 years, which gives me a $2,000 budget. Currently, I'm on 6 years with this PC and I'm eager for an upgrade. A 6700XT for $320 was both the bare minimum I would settle for while also being about as much as I wanted to spend on a GPU I plan on replacing within a year. $320 is more than I wanted to spend, but it's close enough to my $250/yr GPU budget that I tolerated it. At least I didn't pay $1,000 for a 3070 in the middle of the GPU shortage. I'll either sell it for ~$250 or so or just keep it for diagnostic purposes; haven't decided yet. And yes, those were my options when I made the purchase: an eBay 3070 for $1,000 or an open-box 6700XT on Newegg for $320. Considering over 90% of what I do now is on Linux, and nVidia's Linux drivers are trash, the AMD card for $320 seemed like a no-brainer.
 
Alright, let's guess it will cost $1,300. Then it's not the best card, yet it still costs a lot. The 4090 will be a better deal.
 
Didn't they do the same thing with the RTX 20 SUPER series of cards?

Didn't the non SUPER variants even get price drops? Not liquidation price drops, but enough to make some consider buying them even in the face of the updates.
 
What don't people understand? Part of the die is literally defective. They didn't design a separate 7900XT; a portion of the silicon is literally defective, and it's cheaper to sell it as a lesser model than it is to just scrap the whole chip because some of it isn't working. They've been doing this for decades as a way to recoup costs but also market higher-end models. The 4080 Super is just going to be a defective AD102 die (i.e., a 4090) with 20 gigs of RAM.
It may very well use defective parts, but it is also "designed" for that. Buses are different, clock speeds are different, and packaging has to be created separately. All of this costs money, and if they didn't expect to sell them, they wouldn't build them in the first place. And you just confirmed what I'm saying, that it's a way to recoup money!! Thank you.
nVidia has dropped DLSS support for the 20 series; they have a history of doing this crap to consumers. Well, they didn't really "drop" it, more like they made a new version of DLSS that the 20 series doesn't have the instruction sets to run. The 30 series cannot run the full DLSS 3.5. The 30 series likely won't be able to run DLSS 4.0 when the 50 series is released. So you can say "well, AMD might do this in the future," but AMD has a history of creating open standards and making them free for everyone, while nVidia has a history of creating proprietary tech and only supporting it for 1 or 2 generations. If you look at G-Sync, it's nothing more than a marketing term now and is actually just FreeSync with an nVidia badge on it.
Uh, DLSS 3.5 supports the 20 series, just not frame gen, because those cards don't have the hardware to do it. Are you suggesting that Nvidia or any GPU manufacturer should not include new functionality in new models in order to maintain backward compatibility with older models?

G-Sync is not FreeSync. FS uses VESA Adaptive Sync, which is a free protocol. Nvidia developed a proprietary system in order to work with their GPUs. I have no issue with this. I want vendors to use whatever they can to make their products work the best they can.
Because my 1070ti died and we were in the middle of a GPU shortage. The 7000 series wasn't out yet and I wasn't buying a 4090. I also explained previously, when I build a PC I expect to pay about $500/yr for it, and I like to keep my PCs for at least 4 years, which gives me a $2,000 budget. Currently, I'm on 6 years with this PC and I'm eager for an upgrade. A 6700XT for $320 was both the bare minimum I would settle for while also being about as much as I wanted to spend on a GPU I plan on replacing within a year. $320 is more than I wanted to spend, but it's close enough to my $250/yr GPU budget that I tolerated it. At least I didn't pay $1,000 for a 3070 in the middle of the GPU shortage. I'll either sell it for ~$250 or so or just keep it for diagnostic purposes; haven't decided yet. And yes, those were my options when I made the purchase: an eBay 3070 for $1,000 or an open-box 6700XT on Newegg for $320. Considering over 90% of what I do now is on Linux, and nVidia's Linux drivers are trash, the AMD card for $320 seemed like a no-brainer.
Fair enough, I'll give you the Linux drivers point.
 