Nvidia unveils the GeForce RTX 3080 Ti: RTX 3090-like performance at $1,200; RTX 3070...

midian182

What just happened? More than six months after rumors of their existence began, Nvidia has revealed the RTX 3080 Ti and RTX 3070 Ti at its Computex keynote. Much of what we'd already heard about the former, more powerful card proved accurate, apart from the slightly higher $1,199 MSRP. The 3070 Ti, meanwhile, is priced from $599 and arrives on June 10.

Talk of an RTX 3080 Ti has been around since Ampere was unveiled. Once thought to feature 20GB of GDDR6X, Nvidia confirmed the card comes with 12GB—half that of the RTX 3090. Its GA102-225 GPU features 10,240 CUDA cores across 80 Streaming Multiprocessors (SMs), 320 third-generation Tensor cores, and 80 second-generation RT cores.

Nvidia threw out some comparison figures during its event: the RTX 3080 Ti boasts 1.5 times the performance of the GeForce RTX 2080 Ti and double that of the GTX 1080 Ti. It's essentially an RTX 3090 with less VRAM—that, and the price, are the only significant differences.

The RTX 3080 Ti has 12GB of GDDR6X memory at 19 Gbps and uses a 384-bit bus interface, giving a theoretical total bandwidth of 912 GB/s—only slightly behind the RTX 3090's 936.2 GB/s.
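That bandwidth figure falls straight out of the memory's per-pin data rate and the bus width; a minimal sketch of the arithmetic (the function name is illustrative, not an official formula name):

```python
def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s: per-pin data rate times bus width,
    divided by 8 to convert bits to bytes."""
    return data_rate_gbps * bus_width_bits / 8

# RTX 3080 Ti: 19 Gbps GDDR6X on a 384-bit bus
print(memory_bandwidth_gbs(19, 384))   # 912.0 GB/s
# RTX 3090: 19.5 Gbps GDDR6X on the same 384-bit bus
print(memory_bandwidth_gbs(19.5, 384)) # 936.0 GB/s
```

The same arithmetic gives the RTX 3070 Ti's figure of roughly 608 GB/s from 19 Gbps GDDR6X on a 256-bit bus.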

Elsewhere, the card has a 350W TDP, an expected 1440 MHz base clock with a 1665 MHz boost, 320 TMUs, and 112 ROPs. As with the rest of the consumer Ampere line, it's built on Samsung's 8N process technology. Going off looks alone, you might struggle to tell the difference between the RTX 3080 and RTX 3080 Ti Founders Editions. It has a single 12-pin Micro-Fit connector—Nvidia includes an adapter for eight-pin cables—one HDMI 2.1 port, and three DisplayPort 1.4a connectors.

The RTX 3080 Ti launches on June 3 for $1,199. Expect custom models to arrive soon after. What we can also expect, sadly, is for it to suffer the same availability and scalping issues as every other graphics card right now.

Nvidia also unveiled the RTX 3070 Ti during the keynote, though it was a bit light on specs. We do know the card will feature 8GB of GDDR6X memory (the standard RTX 3070 has 8GB of GDDR6), 6,144 CUDA cores, and the full GA104 GPU found in the vanilla version, utilizing all 48 SMs. There's also a 256-bit bus interface (608.3 GB/s bandwidth), and the base clock is expected to be 1575 MHz with a 1770 MHz boost.

Nvidia says the RTX 3070 Ti will offer a 50% performance boost compared to the RTX 2070 Super and a 20% improvement over the RTX 3070.

The RTX 3070 Ti will be available on June 10, starting at $599. The same availability/price caveats as the RTX 3080 Ti apply. You can watch Nvidia's entire keynote right here.


 
I wonder how they will price future cards once this chip shortage is over. Will they act like the shortage is still happening when it's in fact over and keep the higher prices, or will they go back to how it once was? I would soon be in the market for a GPU, RTX 30/RX 6000, if this wasn't happening, but I'll gladly wait as long as I need for prices to go down since I don't suffer from needing the latest item.
 
I wonder how they will price future cards once this chip shortage is over. Will they act like the shortage is still happening when it's in fact over and keep the higher prices, or will they go back to how it once was? I would soon be in the market for a GPU, RTX 30/RX 6000, if this wasn't happening, but I'll gladly wait as long as I need for prices to go down since I don't suffer from needing the latest item.
It is really hard to predict because of the multitude of factors (shortage, crypto, recurrence of COVID-19 lockdowns, etc.). The biggest unknown is when Intel will enter the market and how strong its entry will be. A third company, with fabs of its own, could reshape the current market.
 
Well, I think Nvidia decided the gloves are off. $1,200. If you only want to game and nothing else, then sure, the 3080 Ti looks decent. But most of Ampere's awesomeness is not in gaming but in compute, where the difference versus the previous two generations is just murderous. If you want to work in 3D, then a 3080 Ti for nearly the same money as a 3090 is a bad investment.

I'll explain why. On a 3090 I can render a solid 15 4K frames/still renders and keep all of them in VRAM. With 12GB you can never do that. So if time is money, it's easier to do scene/model/mesh corrections with 15 previews than with 4-5. And if you're into high-res animation, it's the 3090 or go home.

If it were $1,000 vs $1,500 it would be a different discussion, but it isn't.
 
So, a 60-70% price increase for a 10% performance uplift... bargain!

Nothing but legalised black market profiteering.


 
I'm impressed by the generational gain. The 2080 Ti was a beast, no doubt, so a 50% performance improvement on that is really cool, and the same goes for the 3070 Ti vs the 2070 Super. I got a 2070 Super in July 2019 and had no intention of upgrading; the only game I've struggled with so far is MS Flight Sim 2020, which we all know is a titan of requirements.
 
One more card that we can only get at reasonable prices in a major OEM pre-built.

If Dell updates its line-up with the 3080 Ti, that might be incentive enough to buy one just for that.
 
I'm impressed by the generational gain. The 2080 Ti was a beast, no doubt, so a 50% performance improvement on that is really cool, and the same goes for the 3070 Ti vs the 2070 Super. I got a 2070 Super in July 2019 and had no intention of upgrading; the only game I've struggled with so far is MS Flight Sim 2020, which we all know is a titan of requirements.

With MSFS 2020 you're up against CPU bottlenecks as well. Same with DCS World. I went from a 2080 Ti to a 3090 and I certainly didn't see a 50% uplift, even at 4K where CPU bottlenecking is less of an issue. I basically sit at the same FPS now (I shoot for 60 locked, or 40 locked with V-sync on a 120 Hz monitor to sync up with TrackIR head tracking, which runs at 120 Hz), but with a few more graphical options turned on/increased.
 
Speaking only from the gaming POV and nothing against those who've made the jump to 3-series but...

They won't compare it to a vanilla 3080, only to 1- and 2-series cards, and then only at 4K. That's because it isn't significantly better than a 3080 for the price premium... not that gaming 'value' needs to be the top priority for most buyers. They seem to be trying to convince 1080 Ti gamers and up that the 3-series is where it's at, but I really think it's worth waiting another generation if your current hardware isn't smoking. As for the 4K comparisons, I've never had any illusions about being able to game at 4K with my 1080 Ti, and I'll be holding onto it until they can get supply problems under control, thanks.

Also, still seems really hard to tell the difference between RTX on and off in most scenes?
 
The difference between the 3080 Ti and 3090 makes it worth just spending more for the 3090.

Thank God I got mine.

Have you got a 3090? You haven’t mentioned.

I think I’ll swap mine for a 3080 Ti FE if the gaming performance is basically the same. The 3090 thermals are out of control on my card compared to the FE coolers and I don’t use it for mining or professional applications.
 
I'm impressed on the generational gain, the 2080 Ti was a beast no doubt so 50% performance improvement on that is really cool, and same for the 3070 Ti vs the 2070 Super. I got a 2070 Super in July 2019 and I didn't have any intention of upgrading - the only game I've struggled with so far is MS Flight Sim 2020 which we all know is a titan of requirements.
It's a lot less impressive when you consider the RTX 3080 offers 97% of this performance and launched with a $500 lower MSRP six months ago.

Ampere is impressive, but the 3080 Ti is a total ripoff.
 
The difference between the 3080 Ti and 3090 makes it worth just spending more for the 3090.

Thank God I got mine.
Yes, this card makes no sense. For a little more you can get the full 3090, and for people buying cards this expensive, that difference is nothing.
 
Speaking only from the gaming POV and nothing against those who've made the jump to 3-series but...

They won't compare it to a vanilla 3080, only to 1- and 2-series cards, and then only at 4K. That's because it isn't significantly better than a 3080 for the price premium... not that gaming 'value' needs to be the top priority for most buyers. They seem to be trying to convince 1080 Ti gamers and up that the 3-series is where it's at, but I really think it's worth waiting another generation if your current hardware isn't smoking. As for the 4K comparisons, I've never had any illusions about being able to game at 4K with my 1080 Ti, and I'll be holding onto it until they can get supply problems under control, thanks.

Also, still seems really hard to tell the difference between RTX on and off in most scenes?
I'm completely with you on this, my 1080Ti still happily plays anything I throw at it.

Ray-tracing is the future for sure, after seeing some of the behind-the-scenes of the latest version of Metro: Exodus over on Digital Foundry it's impressive what it brings to the table.

I agree it's not immediately noticeable because of how well we've managed to fake it all these years, but not only is ray tracing far more accurate, it also transforms the speed at which developers can light their games. In the video they show a scene lit the traditional way, which took over 47 minutes; with ray tracing enabled, the same scene took just under two minutes.

That's a huge amount of time saved for developers whilst also making their games look better.

I'm still plenty happy with my 1080 Ti and happy to wait until ray tracing matures a bit more before pulling the trigger on a new GPU. Plus, you know, scalpers, lack of stock, and the general crap situation the market's in right now make upgrading pointless at this very moment.
 
Nvidia is a joke of the gaming industry...

RTX 30 is such a flop, and the more Nvidia keeps trying to force Ampere's architecture down our throats, the more backlash they will see. Ampere is NOT a gaming architecture; it is left-over enterprise/prosumer architecture that Nvidia is trying to pass off as gaming tech by PAYING game devs to utilize proprietary hardware in a few AA games to get noticed.

So sad...
 
I'm going to take a slightly different take here:

The high price, relative to the 3090, does not leave a lot of meat on the bone for scalpers. It may help to cut scalping. A bit. Maybe.

That said, I certainly ain't buying a 3080 ti at that price.
 
The fact that they did not increase the RAM on the 3070 Ti the way they did on the 3080 Ti is a joke. $600 for a high-end card with 8GB is just silly now. I'd rather go the AMD way, with FSR around the corner, where for about the same money I get 12GB of RAM.
 