Watch Nvidia's CES 2022 keynote right here at 8am PT / 11am ET: RTX 3090 Ti incoming?

midian182

Why it matters: Tech fans have been eagerly anticipating this day ever since Nvidia, AMD, and Intel announced they would hold their respective CES 2022 virtual events within a few hours of one another. Team Green's starts at 8am PT / 11am ET, and you can watch the whole thing right here. We're expecting some exciting reveals, including the long-rumored RTX 3090 Ti.

Given the flood of reports that claim an RTX 3090 Ti will arrive in January, along with alleged photographs of the card/packaging, the announcement of a new Ampere flagship seems pretty much inevitable.

The RTX 3090 Ti is expected to feature a whopping 450W TDP—100W more than the vanilla RTX 3090—along with a GA102-250 GPU with 10,752 CUDA cores (256 more than the RTX 3090), 24GB of GDDR6X VRAM, 84 SM clusters, 84 RT cores, 336 Tensor cores, and 128 ROPs. We've also heard that it will come with 21 Gbps memory, allowing for a total theoretical bandwidth of over 1 TB/s, and that it will use 2GB modules instead of the 1GB modules on the current model.
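That bandwidth claim is easy to sanity-check. Here's a back-of-envelope sketch; note the rumors only quote the 21 Gbps per-pin speed, so the 384-bit bus width below is an assumption carried over from the RTX 3090 (which fits the 24GB capacity with twelve 2GB, 32-bit modules).

```python
# Back-of-envelope check of the rumored memory bandwidth.
# Assumption: the RTX 3090 Ti keeps the RTX 3090's 384-bit bus;
# the leaks only quote the 21 Gbps per-pin data rate.

pin_speed_gbps = 21        # rumored GDDR6X data rate per pin
bus_width_bits = 384       # assumed, same as the RTX 3090

total_gbps = pin_speed_gbps * bus_width_bits  # gigabits/s across the bus
total_gb_per_s = total_gbps / 8               # convert bits to bytes

print(f"Theoretical bandwidth: {total_gb_per_s:.0f} GB/s")
# -> Theoretical bandwidth: 1008 GB/s
```

Under those assumptions the math works out to 1,008 GB/s, which squares with the "over 1 TB/s" figure in the leaks.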

While the RTX 3090 Ti sounds impressive, the chances of it being in plentiful stock and at a price that isn’t much higher than an already substantial MSRP are slim to none. Nvidia will be hoping those issues are less apparent at the other end of the Ampere hierarchy, where we’re expecting to see a desktop version of the RTX 3050 announced.

The entry-level RTX 3050 is rumored to feature the GA106-150 GPU and may rival Intel’s upcoming Arc Alchemist A380.

We could see some new laptop GPUs at the event—a mobile RTX 3070 Ti has been rumored—and possibly some refreshed versions of current cards, such as an RTX 3080 12GB and an RTX 3070 Ti 16GB.

Away from graphics cards, Nvidia says it will be showcasing “the latest breakthroughs in accelerated computing” during the conference, which covers design, simulation, and autonomous driving, as well as games. Come back later today to see how many predictions we got right.


 
In this storm of shortages, I don't even want to imagine the price of the 3090 Ti, and later this year, the RTX 4000 series... I'm waiting to see the 3D V-Cache lineup; if there's a 5600X with 3D V-Cache, then that will definitely be my upgrade path from my 3600X.
 
If you still can't run 4K at high detail, then there's still no point, and nothing I've seen or heard suggests any iteration of the current gen can do that. For that matter, a 2070 can do anything you want at 1440p, so if you're lucky enough to have one, you're not missing a thing. I'm still kicking myself for not jumping on a $1,200 machine with a 3070 I could've had in fall of 2020... I'd be set for the next five years.
 

They can already run 4K at max settings; I've been doing it for a couple of years now, first with a 1080 Ti and now a 3080 Ti. I get 60 or more fps depending on the game.
 
How much of a performance gain over the 3090? Can anyone guess?
Marginal is my uneducated guess. As in, you can measure it, but you won't be able to actually feel it.

Considering the extra 100W, I highly suspect they're brute-forcing extra performance by boosting higher. The real question is, how much better is it compared to a 3090 OCed to 450W, and how much further can you push the 3090 Ti?

At the end of the day, it feels a bit pointless at this moment in time. I don't know why Nvidia has decided to release this. I guess it's whale-milking season again? If you have nothing else to spend your money on, sure - knock yourself out and have fun. If you're the type who's prone to buyer's remorse, though, I wouldn't take the bait. Lovelace and RDNA3 are rumoured to bring big performance gains to the table this year. Given the current situation, anyone who can live with their current GPU until the end of the year (possibly as soon as Q3, I think I read?) should try to do so.

I'm rocking a 3090 I got at launch before prices exploded, so I have zero complaints. To be honest, though, I could have kept my 2080 Ti and been fine for the most part. It's only Microsoft Flight Simulator and DCS World that I need all that raw performance for, and you can just lower settings to compensate. I basically got it because I had the money, didn't know what else to spend it on, and all the 3080s were gone and being scalped at 3090 prices, so why the heck not? On the positive side, my nephew is super happy that I sold him the 2080 Ti for less than what a 3060 costs in my country at this time. I don't think anyone would complain about such a deal. It even still had a year of warranty left at the time.
 
If you still can't run 4K at high detail, then there's still no point, and nothing I've seen or heard suggests any iteration of the current gen can do that. For that matter, a 2070 can do anything you want at 1440p, so if you're lucky enough to have one, you're not missing a thing. I'm still kicking myself for not jumping on a $1,200 machine with a 3070 I could've had in fall of 2020... I'd be set for the next five years.
I'm not sure what you mean, but I've been playing in 4K since 2014. Since the Vega 56, there was no issue with 4K gaming for the majority of titles... My Vega was retired in late 2020 for a 6800, and now I literally get 60+ fps in every single title I've tried, even on high settings in 4K... the majority of titles are already in the 100-140 fps range, and esports games are typically in the 200+ fps category, hitting CPU limitations before GPU limitations.
 