My understanding is that there are more cores as well.
No kidding, it seems like it may already be happening. I'm looking at a 3080 and prices are holding pretty stable. If I can't get a 3080 12G or Ti for $600-650 USD, I'll just get the 4080. For $100 over current prices you'll get 2x performance.
The 2x performance over the 3080 Ti is almost certainly taking RT/DLSS 3.0 into consideration, not raw gaming performance, so you can expect to see that kind of boost only in limited scenarios. I expect the 4080 16GB will in reality be 25-40% faster than the 3080 10GB, and the 12GB version probably 15-25% faster in raw performance. All the bells and whistles are nice, but Nvidia is relying heavily here on enhancements from the new tensor cores and RT cores. I can say this because we already have a pretty good understanding that the raw performance of the 4090 is about 90% faster than the 3090, but it also has >50% more cores and faster clocks. The 4080 16GB has only <12% more cores than the 3080, and the 12GB version actually has <12% fewer cores than the 3080. The 4080 is just not going to be 2x the performance of the 3080 Ti, or even the 3080 for that matter, if the 4090 required 50% more cores to get 90% better raw performance. It is nice to see cards that should finally be able to handle real-time RT and make it a practical enhancement, but it's unrealistic to think you'll get a return on that $1200 GPU's feature set for quite some time.

Well, to be fair, the original 3080 only had 10G and was listed at $699. The 12G version had a much inflated MSRP, closer to $1,100-1,200. Current pricing of the 12G variant is sub-$799 right now, so for $100-200 more you're getting 2x performance. That's not horrible. Whether we should be paying $699 or $899 for a GPU is certainly a topic for discussion.
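To put rough numbers on the core-count argument above, here's a quick back-of-envelope sketch in Python. The core counts are the published CUDA core specs; the assumption that raw performance scales linearly with core count is naive and only for illustration:

```python
# Published CUDA core counts (illustrative; clocks, memory bandwidth,
# and architectural changes all matter too).
cores = {
    "RTX 3080 10GB": 8704,
    "RTX 3090": 10496,
    "RTX 4080 12GB": 7680,
    "RTX 4080 16GB": 9728,
    "RTX 4090": 16384,
}

def core_delta(new: str, old: str) -> float:
    """Percent change in core count going from `old` to `new`."""
    return (cores[new] / cores[old] - 1) * 100

# Known reference point: ~56% more cores -> ~90% more raw performance.
print(f"4090 vs 3090:      {core_delta('RTX 4090', 'RTX 3090'):+.0f}% cores")
# The 4080s have nowhere near that core advantage over the 3080.
print(f"4080 16GB vs 3080: {core_delta('RTX 4080 16GB', 'RTX 3080 10GB'):+.0f}% cores")
print(f"4080 12GB vs 3080: {core_delta('RTX 4080 12GB', 'RTX 3080 10GB'):+.0f}% cores")
```

If ~56% more cores bought roughly 90% more raw performance, a +12% or -12% core delta can't plausibly deliver 2x without DLSS 3 doing the heavy lifting.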
I guess we will wait for AMD to see where they land. Nvidia could be having a 40 series sale even before they're available.
I'm taking my 3070 Ti out for a deep cleaning so that it stays with me for the next 5 years before I change to AMD. These prices are hilarious.
I did not have any bad opinion of NVIDIA until this year, but after the Hardware Unboxed scandal, EVGA, and now this, I will definitely change to AMD for my next GPU, and I will recommend that my friends change as well.
Actually, that makes a lot of sense of this insanity. No way would I pay what they are asking for these. I expect to see deep discounts on these cards next year, kind of like that laughable 3090 Ti that is now $400-$500 less than the launch 3090. These are priced to sell off Ampere stock.
Do not underestimate the stupidity of the loyal rabid fanbois.

I don't see these prices being sustainable with the influx of inventory from miners, or just distributors trying to dump inventory. Everyone has a price, and whenever the mining cards hit that price it will impact the 40 series.
I don't think we've seen the end of the long-term effects of this last crypto boom and the damage Nvidia did to themselves with it. $900 for what should be a 4070? Just because they call it a 4080 doesn't stop it from effectively being what a 4070 should be. And, frankly, $1200 for a 4080? Especially since we are entering a recession.
Am I understanding correctly that DLSS now has frame interpolation like TVs have been doing for years, making movies look like soap operas?
Does that mean I can instead connect my PC to a TV, lock a game at 24 FPS, and turn on the TV's soap opera effect for free, to get 120 "FPS" from a 1060 or 580 at 4K?
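In spirit, yes: both synthesize frames between rendered ones. Here's a minimal, hypothetical sketch of the naive blend that cheap TV smoothing approximates (not anyone's actual implementation; real TV MEMC estimates motion, and DLSS 3 goes further by using game engine motion vectors plus Ada's optical flow accelerator, which is why it isn't the same "free" effect):

```python
import numpy as np

def naive_interpolate(frame_a: np.ndarray, frame_b: np.ndarray,
                      t: float = 0.5) -> np.ndarray:
    """Linearly blend two frames to fake an in-between frame.
    Ignores motion entirely, so fast-moving objects ghost badly."""
    blended = (1 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
    return blended.astype(frame_a.dtype)

# Two dummy 4K RGB frames standing in for consecutive rendered frames.
prev_frame = np.zeros((2160, 3840, 3), dtype=np.uint8)
next_frame = np.full((2160, 3840, 3), 200, dtype=np.uint8)

# One synthesized frame per rendered pair turns 24 FPS into ~48 "FPS".
# The catch: you must hold back the next real frame to blend with,
# which is why interpolation adds input latency on TVs and GPUs alike.
in_between = naive_interpolate(prev_frame, next_frame)
```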
Let's be real for a minute. By the time games come out with native DLSS 3.0 support, you will be in the market for a new GPU. Why invest in cutting-edge RT hardware now when the games that support this type of performance are years away from release? The money you save from not being on the cutting edge will let you upgrade your GPU more often.

I really want to move to an all-AMD build, but Nvidia doesn't make it easy with how good their graphics cards are.
I know RT doesn't improve graphics that much compared to well-built global illumination and cast shadows using modern rasterization techniques, but it's still an improvement regardless.
Ampere is already 1.5 to 2x faster than RDNA2 in RT performance, and now this? AMD needs to step up their game; hopefully RDNA3 gives us a robust improvement in the RT department.
That way, consumers win from how fierce the competition is.
After a 6-year wait, I'm definitely in the market for a new GPU, but is it just me, or does something seem very off about this announcement?
Absolutely this. Let's see the performance in actual games with release drivers at realistic 1080p, 1440p, and 4K medium-detail and high-detail settings, and compare to the competition (AMD).

Unlike many here, I will wait until after AMD releases their GPUs AND proper unbiased reviews are posted.
Meanwhile, let's enjoy the posts from the loyal ones who, even without proper reviews, are already praising the GPU and already in line to purchase these cards.
I'm in the same boat and yes, something is off with the announcement.
For starters, Nvidia didn't release any details; they don't explain in any of the benchmarks what they're comparing against, what the test system was, or how performance stacks up against any competition. Every benchmark shown had all the features turned on: ray tracing, DLSS. They didn't want to show performance for the 99% of games out there that don't support either.
When AMD shows off their stuff in the same fashion, they call out the competition head-on and even show games that run slightly worse on their own hardware; it adds a bit of honesty and builds trust. Nvidia's whole presentation of the new GPUs' performance felt a bit untrustworthy purely from the way they displayed the metrics, like they were deliberately hiding something.
They're also being downright dirty with the naming and pricing of the 40 series: the 12GB model is effectively a 4070, and all three of them are overpriced. Whether that's to sell 30 series stock, or a hope that enough early adopters make up some shortfall on an accountant's balance sheet, we'll never know.
Here's where my money would be, though: these things will drop in price early next year, and AMD's GPUs won't be quite as good at ray tracing but will make up for it with non-ray-traced performance that's at least on par, for a lower price.
Intel won't really be competing with these two for another generation or two. Hopefully they get their act together in 2026/2027, but Intel has the most to gain here: they produce their own chips, so they can price lower or make more money depending on how their performance compares with the competition.