Nvidia announces GeForce RTX 40 series GPUs based on Ada Lovelace architecture

What a crap marketing ploy. Give us an apples-to-apples performance uplift comparison, not a DLSS 2 vs DLSS 3 demo.

What happened to all the talk of second-generation ray tracing or path tracing that was supposed to be so much better it would justify the increase in price and power?
 
My understanding is that there are more cores as well.


No kidding, it seems like it may already be happening. I'm looking at a 3080 and prices are holding pretty stable. If I can't get a 3080 12G or Ti for $600-650 USD, I'll just get the 4080. For $100 over current prices you'd get 2x the performance.

Be mindful of the way the graphs are presented, and read the fine print below them. For the performance graphs of the 4090 and 4080s vs the 3090 Ti and 3080 Ti, the fine print says the 40-series cards are running DLSS in performance mode. It also doesn't specify whether the games tested are using the new DLSS 3.0 or a 2.x version, so we don't know how much of the performance increase comes from the difference between the two DLSS versions (and Ampere will not support the new DLSS 3.x).

The 3080 Ti and 3090 Ti are not using DLSS; otherwise that would have to be specified in the fine print on the graphs.

Don't let Nvidia's marketing sweet-talk you into thinking you're magically getting 2-4x the performance over the 30x0-series cards.
 
Well, let's see. Four years ago I paid 930 euros for an RTX 2080, and a few months later the Super variant arrived at the same price. Then came the mining spree, and everything in the 30 series was inflated to prices from hell. I waited and waited; prices dropped, but everything was still overpriced without a significant performance difference. Add the electricity bill, and the dream of an upgrade just wasn't possible. Then the 40-series hype arrived, and I started thinking maybe now is the time.

Now I'm thinking: upgrade? It would take more than 2500 euros to do a build with the new series, including a new generation of CPU, RAM, motherboard, and PSU, and that's probably an understatement, plus the cost of electricity. Do I really need all this new **** just to play DCS, War Thunder, and Apex Legends? Even allowing for VR a couple of years from now, when it will be good enough to enjoy the sims, the result is: not now, no way in hell. I don't need it for what I'm playing, so let's wait a couple of years more. I don't need to be a fool at these prices; games, VR, and all the rest are still far from using that much power for a real difference in performance, and I'm not paying for no difference. I'll wait for AMD, Intel, new VR, and price drops. I have the time and the patience not to overpay any of them, especially when I see companies scalping my wallet before the scalpers do. No thanks, Nvidia, or any other company that thinks I'm a fool. Get your heads straight and drop the price to what these cards are really worth! Don't know if I make any sense, but sense is the way!
 
I don't see these prices being sustainable with the influx of inventory from miners or just distributors trying to dump inventory. Everyone has a price and whenever the mining cards hit that price it will impact the 40 series.

I don't think we've seen the end of the long term effects of this last crypto boom and the damage nVidia did to themselves with it. $900 for what should be a 4070? Just because they call it a 4080 doesn't stop it from effectively being what a 4070 should be. And, frankly, $1200 for a 4080? Especially since we are entering a recession.
 
Well, to be fair, the original 3080 only had 10G and was listed at $699. The 12G version had a much-inflated MSRP, closer to $1,100-1,200. Current pricing of the 12G variant is sub-$799. So for $100-200 more you're getting 2x the performance. That's not horrible. Whether we should be paying $699 or $899 for a GPU is certainly a topic for discussion.

I guess we will wait for AMD to see where they land. Nvidia could be having a 40-series sale even before they're available.
The 2X performance over the 3080 Ti is almost certainly taking RT/DLSS 3.0 into consideration, not raw gaming performance, so you can expect that kind of performance boost only in limited scenarios. I expect the 4080 16GB will in reality be 25-40% better than the 3080 10GB, and the 12GB version probably 15-25% faster in raw performance. All the bells and whistles are nice, but Nvidia is relying heavily here on enhancements from the new tensor cores and RT cores. I can say this because we already have a pretty good understanding that the raw performance of the 4090 is about 90% faster than the 3090, but it also has >50% more cores and faster clocks. The 4080 16GB has only <12% more cores than the 3080, and the 12GB version actually has <12% fewer cores than the 3080. The 4080 is just not going to be 2X the performance of the 3080 Ti, or even the 3080 for that matter, if the 4090 required 50% more cores to get 90% better raw performance. It's nice to see cards that should finally be able to handle real-time RT and make it a practical enhancement, but it's unrealistic to think you'll get a return on that $1200 GPU's feature set for quite some time.
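A rough back-of-the-envelope version of that argument in Python, using the approximate figures quoted above (the ratios are assumptions pulled from this post, not official specs or measurements):

# Sanity check of the "cores vs. raw performance" reasoning above.
# All ratios are the rough figures quoted in this post, not measured data.
ratio_4090_cores = 1.50       # assumed ~50% more cores than the 3090
ratio_4090_perf = 1.90        # assumed ~90% faster raw performance

# Implied per-core (architecture + clocks) gain in this crude model
per_core_gain = ratio_4090_perf / ratio_4090_cores   # ~1.27

ratio_4080_16gb_cores = 1.12  # assumed upper bound: <12% more cores than the 3080
est_uplift = ratio_4080_16gb_cores * per_core_gain
print(f"Estimated 4080 16GB raw uplift over 3080: ~{(est_uplift - 1) * 100:.0f}%")
# Prints roughly 42% -- nowhere near 2x without DLSS 3 / RT doing the heavy lifting.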

Edit: Actually, it's very evident from Nvidia's webpage that DLSS 3.0 and RT "Overdrive" are exactly the reason they are claiming 2X-4X faster than the 3080 Ti. This was even done in DLSS performance mode, likely to maximize the performance uplift percentage.

But there is one more problem I have with your post: how many "it's just $100-$200 more for X more performance" steps should we be okay with? The 1080 Ti, the best gaming GPU you could buy at the time, was only $600-$700. The 4090 here starts at $1600, and the 4080 (not Ti) is now $1200 (or starting at $900 if you want to pretend the 12GB version isn't really a 4070). I know there has been quite a bit of inflation since then, but should a high-end gaming GPU really be more than $700-800? Apparently Nvidia thinks so; you now have to shell out at least $900 to get into the "high-end" club.
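As a rough illustration of the inflation point (assuming a ballpark 20-25% cumulative inflation between the 1080 Ti's 2017 launch and 2022, which is an assumption here, not an official CPI figure):

# Quick inflation adjustment for the 1080 Ti comparison above.
# The multipliers are ballpark assumptions, not official CPI data.
gtx_1080_ti_msrp = 699   # approximate launch MSRP in USD
for multiplier in (1.20, 1.25):
    print(f"x{multiplier:.2f}: ~${gtx_1080_ti_msrp * multiplier:.0f}")
# Prints roughly $839 and $874 -- still well short of the $1200 4080 16GB,
# let alone the $1600 4090.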
 
I'm taking my 3070 Ti out for a deep cleaning so that it will stay with me for the next 5 years, until I can switch to AMD. These prices are hilarious.
I did not have a bad opinion of NVIDIA until this year, but after the Hardware Unboxed scandal, EVGA, and now this, I will definitely switch to AMD for my next GPU, and I will recommend my friends do the same.
 

Well, exactly. The 3070 is $499; when the 4070 comes out, based on these prices you will be lucky if it's $700.

Look at the 3080 at $699, the 3080 12GB at $799, and the 3080 Ti at $1199.

And now they want $899 for the 4080 12GB and $1199 for the 4080 16GB.

Prices are really going up this time, while the economy is bad and energy costs are rising.
 
These are priced to sell off Ampere stock.
Actually, that makes a lot of sense of this insanity. No way would I pay what they are asking for these. I expect to see deep discounts on these cards next year, kind of like that laughable 3090 Ti that is now $400-$500 less than the launch 3090.
 
Do not underestimate the stupidity of the loyal rabid fanbois.

Remember, fans of the Kartrashians were donating money to one of them so she could officially be a billionaire.
 
Am I understanding correctly that DLSS now has frame interpolation like TVs have been doing for years, making movies look like soap operas?

Does that mean I can instead connect my PC to a TV, lock a game at 24 FPS, and turn on the TV's soap opera effect for free, to get 120 "FPS" from a 1060 or 580 at 4K?

Wow, technology.

TVs add crazy delay when you enable that option. This supposedly doesn't add that extra delay.
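For some rough intuition on why TV-style interpolation at low base frame rates feels so laggy (simple frame-time arithmetic only; real TV processing pipelines typically add further delay on top of this):

# Interpolation has to hold back the current real frame until the next real
# frame arrives, so it adds at least one base-rate frame time of latency.
for base_fps in (24, 60, 120):
    frame_time_ms = 1000 / base_fps
    print(f"{base_fps} fps base -> at least ~{frame_time_ms:.1f} ms of added latency")
# 24 fps base: ~41.7 ms extra, very noticeable with mouse input.
# 120 fps base: ~8.3 ms extra, much easier to hide.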
 
It's not just the price, the marketing focus on DLSS performance over raw rasterisation bothers me. The exclusivity of DLSS 3 to 4xxx is also an eyebrow raiser, as is the rebucketing of the chips into the product tiers. I'll definitely be waiting to see how these compare with RDNA 3 when the reviews and stock come in.

After a 6-year wait, I'm definitely in the market for a new GPU, but is it just me, or does something seem very off about this announcement?
 
I really want to move to an all-AMD build, but Nvidia doesn't make it easy with how good their graphics cards are.

I know RT doesn't improve graphics that much compared to well-built global illumination and cast shadows using modern rasterization techniques, but it's still an improvement regardless.

Ampere is already 1.5-2x faster than RDNA2 in RT performance, and now this? AMD needs to step up their game; hopefully RDNA3 gives us a robust improvement in the RT department.

That way, consumers win from how fierce the competition is.
Let's be real for a minute. By the time games come out with native DLSS 3.0 support, you will be in the market for a new GPU. Why invest in cutting-edge RT hardware now when the games that support this type of performance are years away from release? The money you save by not being on the cutting edge will let you upgrade your GPU more often.

You are going to want a new GPU in 3 to 4 years anyway, with the advances coming in performance per watt, RT, AI, and cache chiplets.
 
I'm in the same boat and yes, something is off with the announcement.
For starters, Nvidia didn't release any details: they don't explain in any of the benchmarks what they're comparing against, what the test system was, or how performance stacks up against any competition. Every benchmark shown had every feature turned on, ray tracing and DLSS included. They didn't want to show performance for the 99% of games out there that don't support either.

When AMD show off their stuff in the same fashion, they call out the competition directly and even show games that run slightly worse on their own hardware; it adds a bit of honesty and builds trust. Nvidia's whole presentation of the new GPUs' performance felt untrustworthy purely from the way they displayed the performance metrics, like they were deliberately hiding something.

They're also being downright dirty with the naming and pricing of the 40 series: the 12GB model is effectively a 4070, and all three of them are overpriced. Whether that's to sell 30-series stock, or hoping enough early adopters make up some shortfall on an accountant's balance sheet, we'll never know.

Here's where my money would be, though: these things will drop in price early next year, and AMD's GPUs won't be quite as good at ray tracing but will make up for it with non-ray-traced performance that's at least on par, at a lower price.

Intel won't really be competing with these two for another generation or two. Hopefully they get their act together in 2026/2027, but Intel have the most to gain here: they produce their own chips, so they can price lower or make more margin depending on how their performance compares to the competition.
 
Typical Apple move. Complete focus on a fancy new feature that is not available on older models to ensure their obsolescence. Oh, and while we're at it, let's make sure the base model is so crippled compared to the next step up that you definitely don't want to go with the base, and of course the top-end model is such a leap over the one below that you really, really want to see if you can somehow find the money in the budget to just go all out. Oh yeah, as an Apple product user, I know all about that scheme. Somebody needs to make a Rule 34 of Jensen and Cook showing their newfound love for each other.

But it's OK. I'm heavily into Microsoft Flight Simulator 2020 and this could be the magic bullet for a constant 60+ fps, something my current 3090/12900K system cannot deliver. I have 2700 euros stashed away for this purpose. Here's my wallet, Nvidia; I'll bend over and take it. Assuming DLSS 3 actually works without introducing wonky artifacts or making the cockpit instruments blurry. The 14-day free return policy in Europe rocks, so I'm not afraid to test it out. At least we get something for our taxes.

That said, what's in it for the average gamer? They said basically nothing about rasterization, and that's usually a BAD sign. What can the 4090 deliver in non-DLSS 3 games that will justify the price and power? I suspect that for mainstream gamers on a budget during a time of inflation and rising energy costs, the 40 series will be a hard pass, especially if AMD can come up with something offering similar performance in non-DLSS 3 titles at a much more reasonable price. Time will tell.
 
Unlike many here, I will wait until after AMD releases their GPUs AND proper unbiased reviews are posted.

Meanwhile, let's enjoy the posts from the loyal ones who, even without proper reviews, are already praising the GPUs and lining up to purchase them.
Absolutely this. Let's see the performance in actual games with release drivers at realistic 1080p, 1440p, and 4K medium- and high-detail settings, and compare to the competition (AMD).

I wonder what the actual improvement on pure rasterization without DLSS 3 or SER will be.
 

You are right, the 4080 12GB should be called a 4070 because it's not even the same die as the 16GB, AND it's not a cut-down die either. So even though they don't share the same memory config or die, and the core counts differ by enough for a decent performance delta, they're still calling them both a 4080.

This is Nvidia just pushing tier pricing up again in an attempt to keep the ridiculous crypto-boom profits going and keep investors happy. Watch: the 4050 may crest the $300 price point at MSRP, while I've still yet to see a 3050 for sale at MSRP.

If you haven't seen the JayzTwoCents video about Jensen openly admitting to price fixing by intentionally withholding RTX 30-series stock right now, i.e. manipulating the market, I would highly suggest you watch it. JTC and GamersNexus both have excellent videos on EVGA's departure, which are enlightening and a bit sickening, honestly.

Prices on the 3080 through 3090 Ti were intentionally reduced to where they have currently fallen so the "4080" 12GB could slot in alongside the severely reduced 3000-series cards, leaving the actual 4080 and 4090 to sit atop the pricing and performance stack. AIB partner cards likely won't be able to hit the prices Nvidia is advertising for the FE, because Nvidia even gouges its board partners on the price of the dies. And they don't even tell the AIB partners the price of the silicon until it is officially announced! How messed up is that?!

Have you noticed that pricing on the 3070 Ti and below hasn't really even fallen below MSRP yet? Why would they reduce pricing on those cards when there's no new product to replace them and they're intentionally reducing available stock to keep prices high?

If AMD comes in as competitive in pure rasterization performance as the RX 6000 series was against the RTX 3000 series, with pricing that isn't ****ing delusional, and better efficiency again, I believe they will have a winning formula to suck some more market share away from Nvidia.
 
AMD will be cacking themselves silly over this pricing. The 7800XT is looking better all the time, and AMD can manufacture RDNA3 far cheaper than Nvidia can make Lovelace, so don't expect ludicrous pricing from them. Nvidia also has to price Lovelace so high because of the massive Ampere glut, both new stock and cards from miners.
 