Alleged picture of the Galax GeForce RTX 3090 Ti Boomstar suggests it will be a space...

nanoguy

In context: Nvidia’s much-awaited RTX 3090 Ti has been like a mirage ever since the company revealed its existence. Originally announced at CES in January this year, Team Green’s “monster GPU” was supposed to cement its hold on the performance crown. Instead, the new card has been delayed to the point where what little hype had built up around it has largely died down.

The most recent rumors point to a March 29 launch date for the RTX 3090 Ti, with sales expected to start on the same day. However, those same murmurs suggest the MSRP will hover around $1,999, which is $500 above the suggested price of the regular RTX 3090. Retail GPU pricing has been improving as of late, so there’s hope that at least some gamers will be able to afford one.

Thanks to popular leaker @wxnod (via Videocardz), we now know what the Galax RTX 3090 Ti Boomstar will look like. The upcoming card is the Chinese-exclusive version, and it looks like it will occupy three slots, which isn’t all that surprising given the cooling requirements of a top-end GPU these days. If the leaked picture is any indication, it will use the 16-pin 12VHPWR power connector that’s part of the PCIe 5.0 standard and is capable of providing up to a whopping 600 watts of power.
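As a back-of-the-envelope sketch of where the connector's 600-watt figure comes from: the 12VHPWR design carries power over six +12 V pins, each commonly cited as rated for roughly 9.2 A (the per-pin figure is an assumption here, not something stated in the article), which the spec then derates to 600 W sustained.

```python
# Rough check of the 12VHPWR connector's 600 W rating.
# Assumptions (not from the article): 6 x +12 V power pins,
# each rated for roughly 9.2 A under the PCIe 5.0 / ATX 3.0 specs.
PINS = 6
AMPS_PER_PIN = 9.2
VOLTS = 12.0

raw_capacity = PINS * AMPS_PER_PIN * VOLTS  # raw electrical headroom
print(f"Raw pin capacity: {raw_capacity:.0f} W")
# The spec conservatively rates the connector at 600 W sustained.
```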

AIB variants of the RTX 3090 Ti are said to require up to 450 watts in order to feed an overclocked GA102 GPU with no less than 10,752 CUDA cores and 24 gigabytes of 21 Gbps GDDR6X memory. The Founders Edition might still feature Nvidia’s proprietary 12-pin connector, and that would explain the long delay in bringing the new card to market.
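For context on what that 21 Gbps memory buys: assuming the RTX 3090 Ti keeps the full GA102 die's 384-bit memory bus (the bus width is an assumption; the article only gives the per-pin speed and 24 GB capacity), peak bandwidth works out to about 1 TB/s.

```python
# Peak memory bandwidth from the rumored specs.
# The 384-bit bus width is an assumption based on the full GA102 die;
# the article only states 21 Gbps GDDR6X and 24 GB capacity.
gbps_per_pin = 21
bus_width_bits = 384

bandwidth_gb_s = gbps_per_pin * bus_width_bits / 8  # bits -> bytes
print(f"Peak bandwidth: {bandwidth_gb_s:.0f} GB/s")  # 1008 GB/s
```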

In the meantime, AMD is preparing a response in the form of the Radeon RX 6950 XT for a late April launch. And according to @greymon55, the company will bring back the Midnight Black cooler design for the reference version of the new card.


 
IMO, it will be interesting to see reviews of these things when they come out. I've got a couple of different computers with power supplies capable of powering one, and I'm planning on rebuilding those systems, but I don't think I'm going to like the price, or the power consumption figures if they're actually as high as these rumors claim.

EDIT: If those are actual power consumption figures, I have to wonder why Nvidia is releasing these things, and once they're released, how well they'll go over with the enthusiast market, or the miner market, for that matter.
 
450W - WOW and no thanks!

I really think you do not understand - you will be able to have all the improvements switched on and get 5 more fps at 4K.
This is massive - admittedly, not all the bells and whistles can make for a subjectively better viewing experience than, say, an RTX 3080.
But you would know that it's there - if only by the sound, heat, bank balance, posts on the 'gram and pcmasterrace, and the power bill, like mentioned.
A woman does not buy a $2,000 dress not to wear it on a cold show day.
 
At least 5 lbs worth of aluminum... Say goodbye to your PCIe slot.

I moved away from standard mid/full tower cases simply due to how heavy high-end GPUs are these days. My 980 Ti AMP! Omega weighed right around 3.5 lbs, and after a couple of years mounted, the PCI-E slot was almost damaged - the physical slot was bent on the MB, but everything still worked.

I moved to a new case - CM HAF XB Evo - where the MB mounts parallel with the ground. No more GPU sag. Also, I get better cooling in this case over any full tower I was using in the past, with fewer fans.

I could stuff one of these stupid 3090Ti cards in the case (as long as it's not longer than 13 inches) and not have to worry about GPU sag.
 
Just because a power connector can handle 600 watts doesn't mean the card will pull 600 watts.

"450 watts in order to feed an overclocked GA102 GPU" - no overclocked anything is efficient, yet everyone treats the overclocked numbers as the standard power draw as soon as they see them.
 