Nvidia announces GeForce RTX 40 series GPUs based on Ada Lovelace architecture

Let's be real for a minute. By the time games come out with native DLSS 3.0 support, you will be in the market for a new GPU. Why invest in cutting-edge RT hardware now when the games that support this type of performance are years away from release? The money you save by not being on the cutting edge will let you upgrade your GPU more often.

You're gonna want a new GPU in 3 to 4 years anyway, with the advances in performance/watt, RT and AI acceleration, and cache chiplets.
I'm in the market for a new GPU *right now*

I still use my old GTX 1050 Ti from 2017. I'm interested in getting an RTX 4060 or 4060 Ti early next year.
 
The 12 GB 4080 is a joke. They just renamed the 4070 to 4080 to sell it at a higher price; otherwise, putting a 4070 at $900 would be ridiculous. Yet the 12 GB 4080 is still ridiculous; an 80-class card with a 192-bit bus must be a joke.
Those who want to take a look at the raw performance can check this graph from Nvidia themselves: https://images.nvidia.com/aem-dam/S.../geforce-rtx-40-series-gaming-performance.png
The first three games on the left have no DLSS 3 support, so they should be comparable with the last gen, and it can be seen that only the 4090 shines and the rest are pretty bad.
 
Am I understanding correctly that DLSS now has frame interpolation like TVs have been doing for years, making movies look like soap operas?
If those TVs were using a DNN to do the frame interpolation, then sure, it's the same thing.
I wonder what the actual improvement on pure rasterization without DLSS 3 or SER will be.
Hard to pin down exactly, given the various architectural changes, but one can at least identify the theoretical maximum throughputs:

[Attached image: noloveforlovelace.png, a table of theoretical maximum throughputs for the 40-series versus 30-series cards]

I've assumed that Nvidia have stuck to the same SM/TMU structure as Ampere, as it's a structure they've been using for years now. ROPs are decoupled from the SMs, so I can't guesstimate those, unfortunately. But in terms of sheer FP32 performance, the new cards are very good -- notably better than their predecessors.

I'm a little concerned about the local memory bandwidth, though. Micron only do 21 or 24 Gbps GDDR6X at the moment, so the new cards will be one or the other. The 4090 is clearly going to be fine, but the two 4080s have considerably less bandwidth than any of the 3080s.

If Nvidia have massively ramped up the amount of L2 cache in the new chips, then it shouldn't be that much of a problem (see AMD and the Infinity Cache), but if it's only a small increase, then it's a worry. However, given the transistor count of the AD102 (76b versus 28.3b for the GA102), I suspect that the new 40 series will be packing a serious amount of cache.
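
For anyone wanting to reproduce the headline figures, here's a minimal sketch of the usual back-of-envelope math. The shader counts, boost clocks, and memory data rates below are the launch-announced specs, so treat them as assumptions until independent reviews confirm them:

```python
# Back-of-envelope peak FP32 throughput and memory bandwidth.
# Specs are the launch-announced figures (assumed, not independently verified):
#   (FP32 shaders, boost clock in GHz, bus width in bits, memory data rate in Gbps)
cards = {
    "RTX 4090":      (16384, 2.52, 384, 21.0),
    "RTX 4080 16GB": (9728,  2.51, 256, 22.4),
    "RTX 4080 12GB": (7680,  2.61, 192, 21.0),
    "RTX 3080 10GB": (8704,  1.71, 320, 19.0),
}

for name, (shaders, clock_ghz, bus_bits, gbps) in cards.items():
    tflops = 2 * shaders * clock_ghz / 1000   # 2 FLOPs per shader per clock (FMA)
    bandwidth = bus_bits / 8 * gbps           # bits -> bytes, times data rate -> GB/s
    print(f"{name}: ~{tflops:.1f} TFLOPS FP32, ~{bandwidth:.0f} GB/s")
```

Running that shows the same pattern as the table above: a big FP32 jump across the board, but the two 4080s sitting well below the 3080 on raw memory bandwidth.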
 
I'm in the market for a new GPU *right now*

I still use my old GTX 1050 Ti from 2017. I'm interested in getting an RTX 4060 or 4060 Ti early next year.

He was saying that using RT as a criterion for your next GPU is laughable, unless you are buying a $1k GPU to go back and play old games with RT. Because new games don't use RT... the industry doesn't care about RT.

When console game developers start using RT in their games, then the industry will have moved to RT. Until then, buying a GPU solely for RT and 4 games is absurd and laughable. Most people don't want RT, they want faster cards.
 
A quick estimation is that the new cores and clocks are 25% faster core-for-core than the 30 series, based on the 4090's leaked Time Spy bench. That really is a decent jump. The 4090 is 90% faster than the 3090, though, only because it has 50% more cores. So using that same estimation, the 4080 16GB should be right at 40% faster than the vanilla 3080 and the 4080 12GB should be about 10% faster. This is not taking RT or Tensor cores into the equation, just rasterization performance. This is also an estimation, but based on Nvidia's own current and future games chart, the 4080 12GB performs below the 3090 Ti and the 16GB seems to be right in line with the 3090 Ti in terms of raw performance, which makes that estimation seem fairly accurate. We will not know of course until real benchmarks are done for games that are not hand-selected by Nvidia.
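
To make the arithmetic behind that estimate explicit, here's a minimal sketch of the scaling logic. The ~25% per-core figure and the rough core ratios are just the numbers from the paragraph above, not measured results:

```python
# Sketch of the scaling estimate: uplift ~= (core count ratio) x (per-core gain) - 1.
# The 1.25 per-core factor comes from the leaked 4090 Time Spy run; the core ratios
# are the rough percentages quoted above, so this is purely back-of-envelope.
per_core_gain = 1.25  # ~25% faster core-for-core than the 30 series (assumed)

core_ratios = {
    "4090 vs 3090":      1.50,  # ~50% more cores
    "4080 16GB vs 3080": 1.12,  # ~12% more cores
    "4080 12GB vs 3080": 0.88,  # ~12% fewer cores
}

for pair, ratio in core_ratios.items():
    uplift = ratio * per_core_gain - 1
    print(f"{pair}: ~{uplift:.0%} faster (raw rasterization, no RT/DLSS)")
```

That spits out roughly +88%, +40%, and +10%, which lines up with the estimates above.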

So you are paying a $200 premium for the 4080 12GB because it's 10% faster than the nearly 3-year-old 3080, but also has better DLSS and RT, which is really what you are paying for. Sure, it uses about 50 fewer watts to achieve that as well. But in terms of raw performance, you are paying a lot more for a generational jump that is well below average (because you are really getting a rebranded 4070).
For the real 4080, Nvidia is asking you to pay $500 more for a ~40% faster card. The generational jump is pretty big there, but the price increase is enormous.
If you have to have the best and have no problems dropping $1600, that seems like a deal over the 3090 launch price considering you truly are getting nearly 2X the performance for only a $100 price increase, but that is more than most people want to spend on their entire gaming computer.

Verdict: Wait for AMD.
 
He was saying that using RT as a criterion for your next GPU is laughable, unless you are buying a $1k GPU to go back and play old games with RT. Because new games don't use RT... the industry doesn't care about RT.

When console game developers start using RT in their games, then the industry will have moved to RT. Until then, buying a GPU solely for RT and 4 games is absurd and laughable. Most people don't want RT, they want faster cards.
What do you mean? Spider-Man Remastered looks beautiful with ray tracing. I bet Miles Morales will be, too. Those are just the games that come to mind.

I consider RT performance for future-proofing. I don't think RT will just go away, now that consoles have it and developers are starting to use it more.
 
I don't think RT will just go away, now that consoles have it and developers are starting to use it more.
Ray-tracing is the future for sure. Games take too long to make currently and a lot of time is spent making environments look good with fake lighting.

There's a really good video out there from Digital Foundry, who went to the devs who made Metro Exodus, and they showed the time it takes to light a church by hand vs. using ray tracing, and it's crazy how much faster it is. We're talking 40 minutes to an hour vs. 5 minutes for RT.
 
Ray-tracing is the future for sure. Games take too long to make currently and a lot of time is spent making environments look good with fake lighting.

There's a really good video out there from Digital Foundry, who went to the devs who made Metro Exodus, and they showed the time it takes to light a church by hand vs. using ray tracing, and it's crazy how much faster it is. We're talking 40 minutes to an hour vs. 5 minutes for RT.
RT is definitely the future, and Metro Exodus Enhanced showed us that we are not that far off, but... we're not quite there yet either. For one thing, the game had to run at a pretty low resolution to achieve 60 fps on consoles, and the console versions did not run with all the bells and whistles of the PC version. The 3090 and 3090 Ti are about the only cards currently available that could run that game at 4K native resolution and 60 fps, and let's not forget that the geometry for MEE was based on a game originally released on PS4.

So for PS5/XSX, RT is really more of a novelty than a serious game enhancement; few games will feature RT at all, and fewer still will feature RT for complete global illumination like MEE did. RTX 40 and perhaps RDNA 3 will be the first generation of GPUs where devs can perhaps get serious about implementing RT, but considering most AAA games are developed with consoles in mind and ported to PC these days, RT implementation is still going to be sparse.

RTX cards already carried a hefty premium for RT, and purchasing a GPU even now for its RT performance seems like a waste. Maybe in two years there will be enough RT games to justify paying more for RT, but by then there will at least be an RTX 40 refresh or perhaps even RTX 50, and hopefully a return to sanity with pricing.
 
It is really funny and sad at the same time how people get excited about something that degrades visual quality and call it revolutionary. Still, the latest and greatest for $2000 can only achieve 25 fps in Cyberbug 2077 with RT. Passss
Makes me think of "The Emperor's New Clothes" story.
 
What do you mean? Spider-Man Remastered looks beautiful with ray tracing. I bet Miles Morales will be, too. Those are just the games that come to mind.

I consider RT performance for future-proofing. I don't think RT will just go away, now that consoles have it and developers are starting to use it more.
You think just like the 2080 & 3080 owners..

People don't buy a $1k GPU to play Spider-Man or for RT. Gamers upgrade their GPU for higher frame rates, not lower ones.

If you're buying an RTX 4080+ for future RT games, that makes you irrational. Only an obsessed fanboi would pay to go back and play Cybertrash all over again.
 
The 2X performance over the 3080 Ti is almost definitely taking RT/DLSS 3.0 into consideration, not raw gaming performance. So you can expect to see that kind of performance boost only in limited scenarios. I expect the 4080 16GB will in reality be 25-40% better than the 3080 10GB, and the 12GB version probably 15-25% faster in raw performance. All the bells and whistles are nice, but Nvidia is relying heavily here on enhancements from the new Tensor cores and RT cores. I can say this because we have a pretty good understanding already that the raw performance of the 4090 is 90% faster than the 3090, but it also has >50% more cores and faster clocks. The 4080 16GB has only <12% more cores than the 3080, and the 12GB version actually has <12% fewer cores than the 3080. The 4080 just is not going to be 2X the performance of the 3080 Ti, or even the 3080 for that matter, if the 4090 required 50% more cores to get 90% better raw performance. It is nice to see cards that should finally be able to handle real-time RT and make it a practical enhancement, but it's unrealistic to think you'll get a return on that $1200 GPU's feature set for quite some time.

Edit: Actually, it's very evident from Nvidia's webpage that DLSS 3.0 and RT "Overdrive" are exactly the reason they are claiming 2X-4X faster than the 3080 Ti. This was even done in DLSS performance mode, likely to maximize the performance uplift percentage.

But there is one more problem I have with your post. How many "it's just $100-$200 more for X more performance" should we be okay with? The 1080 Ti, the best gaming GPU you could buy at the time was only $600-$700. The 4090 here is starting at $1600 now and the 4080 (not Ti) is now $1200 (or starting at $900 if you want to pretend the 12GB version is not really a 4070). I know there has been quite a bit of inflation since then, but should a high-end gaming GPU really be more than $700-800? Apparently Nvidia thinks so, you now have to shell out at least $900 to get into the "high-end" club.
I agree there is a discussion to be had regarding the price of GPUs. But, like anything, it comes down to value. First, we've had some pretty nasty inflation over the past 18 months. That doesn't help keep costs down. Second, my understanding is that Nvidia cards are pricey to make and profits are lower. Looking at the 1080 Ti, a great card, its MSRP was $699, the same as the regular 3080. Given that the 3080 is 75-80% faster (in benchmarks), I'd say that makes the 3080 a pretty good value, if you assume that the 1080 Ti was priced appropriately.

Now consider the 40 series. You have a new GPU that costs $899. That's about a 28% price increase. Will you get 28% more performance out of it? I guess we will have to wait and see. But if you get 50% or more extra performance at that cost, that would seem like a reasonable value. Again, assuming that the 3080 was priced appropriately.
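
As a rough illustration of that value comparison, here's a quick sketch; the performance ratios are just the placeholder figures from this discussion, not benchmark data:

```python
# Quick value check: relative performance gained per relative dollar spent.
# A result > 1.0 means more performance per dollar than the card it replaces.
# The 40-series performance ratios are guesses from this thread, not benchmark results.
def value_ratio(perf_gain, old_price, new_price):
    return perf_gain / (new_price / old_price)

# 3080 vs 1080 Ti: ~1.75x the performance at the same $699 MSRP
print(f"3080 vs 1080 Ti:              {value_ratio(1.75, 699, 699):.2f}")
# 4080 12GB vs 3080: matching value at $899 needs roughly 1.29x the performance
print(f"4080 12GB vs 3080 (at 1.29x): {value_ratio(1.29, 699, 899):.2f}")
print(f"4080 12GB vs 3080 (at 1.50x): {value_ratio(1.50, 699, 899):.2f}")
```

In other words, at $899 the card has to be about 29% faster just to break even on value, and 50% faster only gets it to a modest improvement.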

I think a lot of people are upset because they are used to getting more performance for the same price-point as the previous generation. Maybe that's not possible any more.

I agree that the benchmarks leveraged DLSS and ray tracing. That's one of the areas where Nvidia tends to prevail over AMD. And, higher resolutions (4K) as well. The question is at what point do higher FPS become a moot point? Do you really need 300, 400 or 500 fps in games? Maybe, but I'm thinking there is diminishing value there. Maybe there is a 40 series that will deliver 100-200 fps in most games, with DLSS and ray-tracing turned on at 4K. The question is what should that cost?

For me, the Nvidia announcement didn't really impress enough to wait for a 40 series card. I'll likely go 3080 w/12G or wait to see what AMD brings out next week. I'd love to find a card for $500-600 w/3080Ti performance, but I'm not hopeful for that happening short of buying a used card.
 
You think just like the 2080 & 3080 owners..

People don't buy a $1k GPU to play Spider-Man or for RT. Gamers upgrade their GPU for higher frame rates, not lower ones.

If you're buying an RTX 4080+ for future RT games, that makes you irrational. Only an obsessed fanboi would pay to go back and play Cybertrash all over again.
If you actually bothered to read my previous comment, I'm interested in getting an RTX 4060 or 4060 Ti, not a 2080 or 3080.
 
If you actually bothered to read my previous comment, I'm interested in getting an RTX 4060 or 4060 Ti, not a 2080 or 3080.
Right^... and you sound just like all the other RTX owners (including myself), looking forward to Jensen's lies... my 2080 hasn't gotten any better with ray tracing, nor has ray tracing meant anything to gamers since I bought my card.

My last 5 gaming GPUs have been EVGA, and now it is kind of obvious why they left nVidia.


There will be no 4060... because the now $599 3080 is just as fast.
 
Fulljack, I just want to clarify what I was saying. I was not saying that RT is completely useless, in fact I do think it is the future. But the key word is "future". Future-proofing is something that everyone does to some extent in this hobby but I think there are wise examples of future-proofing and unwise examples. As it stands in Sept 2022, I would not be future-proofing on the RT feature.

The way I think about this, I would take a non-RT focused GPU and pocket the savings now and upgrade to a more RT-capable GPU in 2 to 4 years when it matters more in games. Miles Morales may benefit from RT but that is one game. RT adoption in games will not increase substantially until more RT hardware is available at cheaper price tiers for consumers.

Finally, looking at the console games, many games are still not native to next-gen consoles but are instead cross-gen. RT will not become important until games are exclusive to next-gen consoles which do support RT.

TLDR = RT will be important but not this or next year.
 
Right^... and you sound just like all the other RTX owners (including myself), looking forward to Jensen's lies... my 2080 hasn't gotten any better with ray tracing, nor has ray tracing meant anything to gamers since I bought my card.

My last 5 gaming GPUs have been EVGA, and now it is kind of obvious why they left nVidia.


There will be no 4060... because the now $599 3080 is just as fast.
Where are you seeing $599 3080s? All I see are 10G versions running $750-799 and 12G versions running $799 and up. It seems like prices are actually increasing, not decreasing as everyone thought.
 
Where are you seeing $599 3080s? All I see are 10G versions running $750-799 and 12G versions running $799 and up. It seems like prices are actually increasing, not decreasing as everyone thought.

It's a likely scenario that'll continue to play out as Nvidia holds back Ampere inventory to keep prices higher on the 3000 cards. I think it's a ploy for them to try and force people's hands into making a decision:

1) buy the new 4000 series for a bit higher price over what Ampere is priced at
OR
2) pick up a new 3000 series that's still overpriced so they can continue to clear out inventory.

If the reviews of the 4000 series are not what they're cracked up to be, people may just say "F it" and go back to Ampere. If this is the case, the inventory that is out there will begin to diminish and pricing on them will start to creep back up since demand will outweigh inventory because Ampere production is EOL.

Hopefully AMD will come crashing in with solid performance and better pricing and shove this whole Nvidia pricing fiasco right up their rears. I'd strongly suggest anyone that didn't go with Ampere or RDNA 2 wait for both Nvidia and AMD to release before deciding what to buy. Don't just jump on Nvidia's junk as soon as it launches; maybe, just maybe, we can stick it to Nvidia.
 
Right^... and you sound just like all the other RTX owners (including myself), looking forward to Jensen's lies... my 2080 hasn't gotten any better with ray tracing, nor has ray tracing meant anything to gamers since I bought my card.

My last 5 gaming GPUs have been EVGA, and now it is kind of obvious why they left nVidia.


There will be no 4060... because the now $599 3080 is just as fast.
Yes, there'll be no RTX 4060 for now. Both Nvidia and AMD release mid- and low-tier cards at a later date, so I don't understand your point. Will you eat your shoes if Nvidia releases an RTX 4060 later down the road?

I'm choosing an xx60 card because it'll have a lower TDP and also hopefully won't break the bank. If the RTX 4060 Ti MSRP is still $399 like its predecessor's, then I could still swallow that.

I'm looking forward to RT. I don't care if you don't use it; I'm just interested in playing games with it. Sure, there's a performance hit, but there's DLSS or FSR for that.
 
It's a likely scenario that'll continue to play out as Nvidia holds back Ampere inventory to keep prices higher on the 3000 cards. I think it's a ploy for them to try and force people's hands into making a decision:

1) buy the new 4000 series for a bit higher price over what Ampere is priced at
OR
2) pick up a new 3000 series that's still overpriced so they can continue to clear out inventory.

If the reviews of the 4000 series are not what they're cracked up to be, people may just say "F it" and go back to Ampere. If this is the case, the inventory that is out there will begin to diminish and pricing on them will start to creep back up since demand will outweigh inventory because Ampere production is EOL.

Hopefully AMD will come crashing in with solid performance and better pricing and shove this whole Nvidia pricing fiasco right up their rears. I'd strongly suggest anyone that didn't go with Ampere or RDNA 2 wait for both Nvidia and AMD to release before deciding what to buy. Don't just jump on Nvidia's junk as soon as it launches; maybe, just maybe, we can stick it to Nvidia.
I think you're spot on, at least this is how I see it as well. I did buy a 3070 for my grandson as we are building a PC for him as a birthday present. It's more than good enough for what he needs and he will have an upgrade path when things start to settle down, if they ever do.

If you're looking to build a decent mid-range gaming machine, you can do that today with existing GPU and CPU options out there. It's when you start to get into the high-end that things get murky and certainly now isn't the time to be buying any high-end CPU or GPU.
 
Yes, there'll be no RTX 4060 for now. Both Nvidia and AMD release mid- and low-tier cards at a later date, so I don't understand your point. Will you eat your shoes if Nvidia releases an RTX 4060 later down the road?

I'm choosing an xx60 card because it'll have a lower TDP and also hopefully won't break the bank. If the RTX 4060 Ti MSRP is still $399 like its predecessor's, then I could still swallow that.

I'm looking forward to RT. I don't care if you don't use it; I'm just interested in playing games with it. Sure, there's a performance hit, but there's DLSS or FSR for that.

You don't seem to understand... an RTX 4060 is not more powerful than a 3080... which nVidia is selling for $599 right now and can't get rid of.

You will not be buying an RTX 40 card; nVidia is going to force glitz-buyers onto cheap, outdated RTX 30 cards.

AND yes, buying a $400 card for Ray tracing might be one of the most hilarious things we've read in a while..
 
You don't seem to understand... an RTX 4060 is not more powerful than a 3080... which nVidia is selling for $599 right now and can't get rid of.

You will not be buying an RTX 40 card; nVidia is going to force glitz-buyers onto cheap, outdated RTX 30 cards.

AND yes, buying a $400 card for Ray tracing might be one of the most hilarious things we've read in a while..
I do not see where Nvidia or any AIB is selling 3080s for $599. I see 3070s in that price range but not 80s.

Based on recent reports, I don't see the 30 series getting dumped at cheap prices. I believe this is one of the issues EVGA had: selling 3080s and 3090s at big discounts would result in losses that could be significant.

I think Nvidia is happy to keep 30 series prices pretty close to where they are. 3080s in the $700-800 range, 3090s $900-1000, 4080/12 $1200 and so on. At least until we get a good look at the new Radeon cards in head-to-head benchmarks.

If AMD can bring similar performance at a lower price point, it may force Nvidia's hand. If they just have parity or slightly higher perf at the same price, I doubt we will see much movement on Nvidia prices.
 
You don't seem to understand... an RTX 4060 is not more powerful than a 3080... which nVidia is selling for $599 right now and can't get rid of.

You will not be buying an RTX 40 card; nVidia is going to force glitz-buyers onto cheap, outdated RTX 30 cards.

AND yes, buying a $400 card for Ray tracing might be one of the most hilarious things we've read in a while..
I'm not in the US, though. While I hope I could get an RTX 3080 that cheap, it's just not realistic for me to rely on used-market prices, as they don't fluctuate as heavily as in the US.

If I buy an RTX 4060 or 4060 Ti sometime in 2023, will you eat your shoes?
 
I'm not in the US, though. While I hope I could get an RTX 3080 that cheap, it's just not realistic for me to rely on used-market prices, as they don't fluctuate as heavily as in the US.

If I buy an RTX 4060 or 4060 Ti sometime in 2023, will you eat your shoes?

What are you talking about? Nobody is talking about used prices....

AND nobody cares what u buy, just hilarious you are saying "in the future"...
 
I do not see where Nvidia or any AIB is selling 3080s for $599. I see 3070s in that price range but not 80s.

Based on recent reports, I don't see the 30 series getting dumped at cheap prices. I believe this is one of the issues EVGA had: selling 3080s and 3090s at big discounts would result in losses that could be significant.

I think Nvidia is happy to keep 30 series prices pretty close to where they are. 3080s in the $700-800 range, 3090s $900-1000, 4080/12 $1200 and so on. At least until we get a good look at the new Radeon cards in head-to-head benchmarks.

If AMD can bring similar performance at a lower price point, it may force Nvidia's hand. If they just have parity or slightly higher perf at the same price, I doubt we will see much movement on Nvidia prices.
"recent reports..."

My Newegg account is 19 years old; my most recent check was just 40 seconds ago.

BTW, the 3090 is selling close to the 6950 XT because AMD beats it in many games...

In 40 days the Radeon 6000 series will get a price drop... and thus all RTX 30 cards will too.
 