Nvidia hints that the RTX 3000 series and RTX 4000 cards will co-exist

midian182

Forward-looking: With Nvidia’s next-gen RTX 4000 series set to arrive this year, will its launch spell the end of the current Ampere consumer line? Possibly not, according to one exec. It appears that the company could continue to produce some RTX 3000 cards alongside Lovelace.

PCMag reports that during a Morgan Stanley investment meeting on Monday, Nvidia CFO Colette Kress hinted at the possibility of both current and next-gen cards being produced and sold alongside each other.

“Even during this period of COVID and supply constraints, it’s been interesting because it’s given us the opportunity for gaming to continue to sell both the current generation as well as the Turing generation,” Kress said. “So we’ve been doing that to provide more and more supply to our gamers in that. And we may see something like that continue in the future.”

As Kress points out, Nvidia kept on making the RTX 2060 after the launch of Ampere, and it released a 12GB version in December that turned out to be expensive, hard to find, and seemingly aimed more at miners than gamers—it certainly didn’t do much to address the graphics card shortages.

Back in September 2018, when the first Turing products launched, Kress confirmed that Nvidia’s 10-series Pascal cards would co-exist alongside the new architecture. At the time, many felt this was necessary due to the high launch prices of the RTX 2080 Ti/2080/2070; little did we know just how expensive graphics cards would become.

Kress has previously said, on two occasions, that Nvidia expects graphics card supplies to improve in the second half of the year. She repeated the claim on Monday, saying availability should start improving in the third quarter of 2022. “We will continue to work on supply. I think we’ll be in a good supply situation in the second half,” she said. Kress added that there will be "some items" related to the next-gen GPUs at Nvidia’s GTC event next month.

We’re already seeing better availability for graphics cards, along with falling prices, in Europe and on eBay. With the RTX 3090 Ti rumored to land later this month, it seems the GPU crisis is finally starting to ease.


 
Cool, I have been in EVGA's GPU queue for 18 months at this stage and still no card.

I will probably get the email for my card in like 2-3 years.
 
Cool, I have been in EVGA's GPU queue for 18 months at this stage and still no card.

I will probably get the email for my card in like 2-3 years.

I actually got my email last week. I think I put my name in 2 years ago, but by then I had been able to pick one up from Microcenter.
 
These companies need to increase the span between models and product generations to well over 2 years. If not, then they need to agree to iterative steps again and not make successive generations "twice as powerful" as the last.

As such, I'd also like Nvidia to go back to the previous naming scheme; I was hoping for a series of RTX 3100s or 3660s or some such.
 
If you have a 3080, 3080Ti or 3090, chances are, you can comfortably skip the 4000 generation.
Don't know about that. The 3090 was for a specific kind of crowd even at MSRP: those with plenty of cash who simply want the best.
The least likely people to skip a generation, imo.
Not to mention, if you've bought one of these cards you're likely after 4K gaming, which they'll do unless you want ray tracing and ultra settings at a locked 60fps or better. If the 4000 series is as good as rumoured, you'll want to upgrade.

These companies need to increase the span between models and product generations to well over 2 years. If not, then they need to agree to iterative steps again and not make successive generations "twice as powerful" as the last.

As such, I'd also like Nvidia to go back to the previous naming scheme; I was hoping for a series of RTX 3100s or 3660s or some such.
Why oh why would you want to artificially cap progress? Massive performance gains used to be the norm generation on generation for both CPUs and GPUs. Intel got away with small increases per generation because they had no competition. Try that as Nvidia, and AMD has you beat the very next generation, and with Intel now in the graphics market and looking to make a decent entry, you can bet they'll outperform you soon enough as well.

So who wins? NVIDIA? No.
The customer being milked for tiny performance increases? No.

It's not like the x86 market, either, where you can rely on the licensing system to lock other companies out. If someone else wants to enter the market, like Intel is doing atm, they can. If one of the mobile phone GPU makers thinks they can make a profit by entering the PC market, no one is stopping them.

If you're worried about prices, those are already a mess. The most likely way to get a good deal is to buy a used card from someone upgrading who's forced to sell at quite a loss. Or just wait 2 generations and, if you don't need the performance, move down a series (get an RTX 5060 if you're on a 3070).
 
Don't know about that. The 3090 was for a specific kind of crowd even at MSRP: those with plenty of cash who simply want the best.
The least likely people to skip a generation, imo.
Not to mention, if you've bought one of these cards you're likely after 4K gaming, which they'll do unless you want ray tracing and ultra settings at a locked 60fps or better. If the 4000 series is as good as rumoured, you'll want to upgrade.


Why oh why would you want to artificially cap progress? Massive performance gains used to be the norm generation on generation for both CPUs and GPUs. Intel got away with small increases per generation because they had no competition. Try that as Nvidia, and AMD has you beat the very next generation, and with Intel now in the graphics market and looking to make a decent entry, you can bet they'll outperform you soon enough as well.

So who wins? NVIDIA? No.
The customer being milked for tiny performance increases? No.

It's not like the x86 market, either, where you can rely on the licensing system to lock other companies out. If someone else wants to enter the market, like Intel is doing atm, they can. If one of the mobile phone GPU makers thinks they can make a profit by entering the PC market, no one is stopping them.

If you're worried about prices, those are already a mess. The most likely way to get a good deal is to buy a used card from someone upgrading who's forced to sell at quite a loss. Or just wait 2 generations and, if you don't need the performance, move down a series (get an RTX 5060 if you're on a 3070).
I have to go with the other guy: they DO bring these cards out too fast, with higher and higher prices for what? A handful of extra frames?

What's actually needed is game optimization. Look at the Switch and PS4: what they've been able to crank out with what are pretty much fancy cellphone and laptop CPUs and low-end GPUs is insane, and it's because those games are tuned to precision. It's tough to read about a 4000 series card when my PS4 Pro is running GT7 at a locked 60fps.

The Switch shows that devs can run their games on a toaster if they try hard enough.

The PS4 shows you can get cutting-edge graphics if the work is put in (Uncharted 4, Gran Turismo 7).

So why in TF do I need to keep buying these ridiculous cards to brute-force my way through a game that a now-ancient console can handle fine?

I digress tho, rant ended.
Yay capitalism n all that. Hope I can snag one of these new cards asap, that's the right cheer, isn't it?
 
Hmm, I'm on an RTX 2080 so the 3000 series isn't really worth bothering with. But if I were building a new system from scratch and the 3000 series were discounted, then it's probably a good call.

Also, I can imagine this could be to challenge whatever Intel releases. If they release anything.
 
Makes sense, since production of 5nm GPUs most likely won't be able to meet demand.
The only question is which SKUs will be kept in production on 8nm. Most likely the 3050 / 3060 (Ti) / 3070.
 
This is for one reason only: Lovelace cards will be a lot dearer, and they'll need Ampere as the "affordable" option. The 4090 will be a $2K card for a start. The 4070 will be a $900 card IMO, but it will easily beat the 3080.
 
If you have a 3080, 3080Ti or 3090, chances are, you can comfortably skip the 4000 generation.
Seriously, I have a 3070 Ti running 1440p and it's more than enough. The next generation of CPUs and GPUs is overkill. 98% of computer users do not need them.
 
This is for one reason only: Lovelace cards will be a lot dearer, and they'll need Ampere as the "affordable" option. The 4090 will be a $2K card for a start. The 4070 will be a $900 card IMO, but it will easily beat the 3080.

Watching how NGridia has operated over the last 2 years, I think the 4070 will most likely sell for $1500+.

MSRP is just a gimmick for them and the joke is on the gamers, as usual.
 
The RTX 3080 is vastly more powerful than the RTX 2080. Way more than a "handful". Is an RTX 2080 good enough at 115 fps avg.? SURE! Until you factor in:

I own a 3440x1440 monitor. That's 4.95 million pixels, while 2560x1440 has just under 3.7 million. That's nearly a 35% increase.
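A quick sanity check on that pixel math (a throwaway Python sketch, not part of the original post; the resolutions are just the ones mentioned above):

```python
# Pixel-count comparison: 3440x1440 ultrawide vs 2560x1440 QHD
ultrawide = 3440 * 1440   # 4,953,600 pixels (~4.95 million)
qhd = 2560 * 1440         # 3,686,400 pixels (~3.7 million)
increase = (ultrawide / qhd - 1) * 100
print(f"{ultrawide:,} vs {qhd:,} pixels -> {increase:.1f}% more")
# prints: 4,953,600 vs 3,686,400 pixels -> 34.4% more
```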

Then factor in a 120Hz refresh rate and suddenly an RTX 3080 doesn't look overpowered at all, especially in demanding (poorly optimized?) games like CP2077 and Red Dead. I doubt this card would even get me to 120Hz at high(ish) settings in many AAA games; AC Valhalla, for example.

I'll consider my GPU as more than enough when it can deliver the full capabilities of my monitor. And I'm not even attempting 4K!

 
The RTX 3080 is vastly more powerful than the RTX 2080. Way more than a "handful". Is an RTX 2080 good enough at 115 fps avg.? SURE! Until you factor in:

I own a 3440x1440 monitor. That's 4.95 million pixels, while 2560x1440 has just under 3.7 million. That's nearly a 35% increase.

Then factor in a 120Hz refresh rate and suddenly an RTX 3080 doesn't look overpowered at all, especially in demanding (poorly optimized?) games like CP2077 and Red Dead. I doubt this card would even get me to 120Hz at high(ish) settings in many AAA games; AC Valhalla, for example.

I'll consider my GPU as more than enough when it can deliver the full capabilities of my monitor. And I'm not even attempting 4K!

Well, my 1080 Ti and 2080 Ti are still doing fine @3440x1440 for me, but I've been a PC gamer since about 1989, so I'm not as delicate about high framerates.
 
I have to go with the other guy: they DO bring these cards out too fast, with higher and higher prices for what? A handful of extra frames?

What's actually needed is game optimization. Look at the Switch and PS4: what they've been able to crank out with what are pretty much fancy cellphone and laptop CPUs and low-end GPUs is insane, and it's because those games are tuned to precision. It's tough to read about a 4000 series card when my PS4 Pro is running GT7 at a locked 60fps.

The Switch shows that devs can run their games on a toaster if they try hard enough.

The PS4 shows you can get cutting-edge graphics if the work is put in (Uncharted 4, Gran Turismo 7).

So why in TF do I need to keep buying these ridiculous cards to brute-force my way through a game that a now-ancient console can handle fine?

I digress tho, rant ended.
Yay capitalism n all that. Hope I can snag one of these new cards asap, that's the right cheer, isn't it?

For PS4-like performance you just need a GTX 1060. If Switch-like graphics are enough for you, then a GTX 750 Ti or a Ryzen with an iGPU is enough. So I agree, you don't need to buy a GPU that costs as much as a new PlayStation and Xbox combined every year. No wonder the 1060 is still the most-used GPU on Steam.
 
I think it's likely NVIDIA would bring out the 4090, or whatever it's going to be called, well before a 4060, so the 3060 would likely still be sold. I don't think this is unusual.
 