Nvidia's GeForce RTX 3080 Ti 20GB graphics card shows up at Russian retailer

nanoguy

In brief: It looks increasingly likely that Nvidia has been preparing an RTX 3080 Ti with a healthier 20 gigabytes of video memory, but it's not clear when you'll be able to buy one, even if you can afford to pay an inflated price. The more worrying part is that we won't have to wait for next-gen graphics cards to see single GPUs pushing over 400 watts -- the mid-generation hardware refreshes look hotter and more power-hungry than ever.

When our own Steve Walton tested Nvidia's GeForce RTX 3080 Ti graphics card, he couldn't help but ask who the intended audience for this product was. The RTX 3080 Ti has proven to be a powerful member of the Ampere family, but it was also an expensive, poorly timed release with only half as much VRAM as the RTX 3090.

Rumors have been circulating for months about an RTX 3080 refresh with 20 gigabytes of VRAM, but when the RTX 3080 Ti arrived, it left Nvidia fans disappointed with a mere 12 gigabytes, less than any of AMD's high-end Radeon 6000 series cards. Still, both Asus and Gigabyte had slip-ups last year in which they revealed unreleased RTX 3080 Ti SKUs with 20 gigabytes of video memory, so it was only a matter of time before they surfaced again.

This week, an anonymous user uploaded what appears to be the firmware for a Gigabyte Aorus GeForce RTX 3080 Ti Xtreme (GV-N308TAORUS X-20GD) graphics card, which was spotted by @momomo_us. The device ID also appears to be different from that of the existing 12 gigabyte model released in June, indicating this could indeed turn out to be the long-awaited 20 gigabyte model.

Other than that, the new model looks to be equipped with the same 19 Gbps GDDR6X memory, but over a narrower 320-bit memory bus instead of the current 384-bit one. If true, this would mean the new RTX 3080 Ti has roughly 17 percent less memory bandwidth than the current model. Be that as it may, a more worrying aspect is the power target of 420 watts, which could be a troubling sign of things to come.
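For reference, peak GDDR6X bandwidth is just the per-pin data rate multiplied by the bus width. Here is a rough back-of-the-envelope sketch (in Python) of where the roughly 17 percent figure comes from, assuming the existing card's 384-bit bus and the 320-bit bus and 19 Gbps memory from the leaked firmware:

# Rough sketch of the bandwidth math behind the "17 percent" figure.
# Assumes the existing RTX 3080 Ti's 384-bit bus; the rumored card's
# 320-bit bus and 19 Gbps GDDR6X come from the leaked firmware listing.

def peak_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate times bus width, in bytes."""
    return data_rate_gbps * bus_width_bits / 8

current = peak_bandwidth_gb_s(19, 384)   # existing RTX 3080 Ti 12GB: 912 GB/s
rumored = peak_bandwidth_gb_s(19, 320)   # rumored 20GB model: 760 GB/s
drop = (1 - rumored / current) * 100     # ~16.7%, i.e. roughly 17 percent

print(f"{current:.0f} GB/s -> {rumored:.0f} GB/s ({drop:.0f}% less)")

A 320-bit bus also corresponds to ten 32-bit memory channels, so pairing each channel with a 2GB GDDR6X chip would yield exactly 20 gigabytes.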

If this doesn't convince you, apparently there's a Russian retailer called HARDVAR that is already selling this new graphics card in Saint Petersburg for 225,000 rubles -- the equivalent of $3,071. Another one called ZSCOM is also jumping in on the action. As usual, cryptocurrency miners have already gotten their hands on it to test its ability to turn watts into millions of hashes that certify transactions on the Ethereum network.

One such miner shared his results on YouTube, and by the looks of it, the card doesn't have a hash rate limiter. After some tweaking, he was able to get 94 MH per second, which is lower than an RTX 3090 but significantly better than the existing RTX 3080 Ti, which can only reach around 65-66 MH per second. The YouTuber also explained that he was able to get close to 98 MH per second, but the card was less stable at that speed.

Given the fact that one Russian retailer is already selling the RTX 3080 Ti 20GB and another is already listing the new model on its website, we won't have to wait much longer before we see it popping up in more places. As for the price, it's possible the MSRP will be somewhere between that of the RTX 3080 Ti 12GB and the RTX 3090, which carry MSRPs of $1,200 and $1,500, respectively -- not that you can expect to find this new RTX 3080 Ti model at anywhere near its recommended price.

Rumors are also picking up about a monstrous RTX 3090 Super being in the works, and that too will have an inflated TGP of over 400 watts. If anything, it looks like the era of single GPUs demanding in excess of 400 watts is already here, despite earlier indications that it would be ushered in by next-generation models from Nvidia and AMD.

In the meantime, Intel is working hard on its Alchemist GPUs, but they're far from ready to battle the beasts coming out of the dungeons of the other two manufacturers.


 
This is just ultimately stupid.

The reason I bought the 3080 Ti instead of the 3090 wasn't the price difference. For the extra money I would get the extra heat-related issues created by the 3090's dual-sided memory design, which is what causes it to overheat and require much more cooling. And I would get extra memory for which I have no use whatsoever. 12GB is plenty for everything, apart from heavy image analysis, which I will never need.

The 3080 Ti has a single-sided memory design, which is a great advantage over the 3090, and these guys turned it into a subpar 3090. LOL.
 
Yet another pointless Nvidia card that will be just as unavailable as the 3080, regular 3080 Ti, and 3090. Redundantly redundant.
 
This card will sell out on day one. Some will go to enthusiasts - people who don’t give a damn about Nvidia, AMD or Intel’s “consumer practices” etc. These are people who enjoy bleeding-edge technology and understand that it costs money. These people do exist, it’s not just QuantumPhysics.

But most will probably go to miners.
 
I am growing tired of Nvidia and their policies. Thank God I bought a Radeon with a healthy 16 GB of VRAM a few months ago. The way Nvidia plays with their customers is just disgusting.

I wish AMD performed well in video editing, RIP
 
Is this better than two RTX 3090 FTW3s? Asking for a friend.
 
I wish AMD performed well in video editing, RIP
Serious question, isn't that the case because the software vendors don't support AMD properly?

And if that's the case, it's weird that they do that, since they also support Macs, which only use Apple or AMD GPUs.
 

Partly but also NVENC is OP.

Right now I am pretty much ****ed. £10,000 for new camera setup and £3,000 for new PC, RIP
 
As much as I appreciate the site, this article is all sorts of wrong.

Several other sites (including PC Gamer of all things) have far more detailed coverage of this card.

It is not an official launch, and this will not exist in mass markets.
What they have is a legit card.
The plan was that nVidia was going to launch the 3080 Ti with the same memory bandwidth as the 3080 (hence the 20GB variant) as a response to AMD and their higher-VRAM GPUs.
This was supposed to be the original form of the 3080 Ti.

Some AIB partners (Gigabyte specifically) went through with full production runs so the GPU would be ready for market, then nVidia decided at the last minute to go with a gimped 3090 instead of an upgraded 3080 (hence the 12GB of VRAM on the actual 3080 Ti, on the same memory bus width as the 3090).

This card is literally only good for mining. There is no official driver support available for gaming.

These are 'pre-production' (so to speak) units based on the original blueprints that Gigabyte jumped the gun on prior to the spec changes from nVidia.

 
Partly but also NVENC is OP.

Right now I am pretty much ****ed. £10,000 for new camera setup and £3,000 for new PC, RIP
The benchmarks above don't seem to be relevant anymore because they use an older version, and it seems that the latest versions have seen big optimisations for RDNA2. The Nvidia cards also struggle with 6K or 8K editing because of their limited VRAM; this is something you need at least 16GB to do properly (the article says that 20GB is optimal for 8K).

DaVinci Resolve 17.3 was a huge update in terms of speed (especially for Apple's M1 SoC)
 

Those were the last Puget benchmarks I have found, and the case is still very much the same. Anything with a VRAM requirement above 20GB will require you to run Quadro cards anyway.

I am currently on 6K timelines and the 2080 Ti does just fine. AMD is still quite far behind on being taken seriously in this department. If you are shelling out £1.5K for a GPU, you will be going for NVIDIA regardless, as AMD is not really an option.

There are also multiple tests where cards with less than 12GB of VRAM beat cards with more (NVIDIA vs NVIDIA or AMD).

Content creation support from NVIDIA is simply stronger.
 
I get your whole post, but I would word it a bit differently, since I don't think it's entirely AMD's fault; it seems like the software vendors decided to go that route for reasons that are not hardware related.

Not blaming you per se, and I understand that in the end it does affect you as stated.
 
I guess it depends on the project. The resolution is not the only thing that affects VRAM usage.

If it works well for you, then kudos to you; the 2080 Ti is not a bad card for such workloads, especially if you limit yourself to 6K or do some light 8K videos.

But I would still steer clear of the 3070-3070 Ti if you do 6K or 8K (even the 3080 has very little VRAM).

I am going to assume that NVIDIA will use the "more RAM" marketing slogan for their next-gen cards. 8 or 10GB is just stupid in my opinion for such expensive cards.
 