Rumor: Nvidia moves RTX 3080 Ti launch to February because Big Navi offers little threat

midian182

Rumor mill: We’ve been hearing rumors about the RTX 3080 Ti for quite a while now, most of which suggested it would arrive in January. But the latest word on the grapevine is that Nvidia is pushing that date back to February, allowing the company to focus on improving supply of its current cards. It also seems that concern over Big Navi's threat to the vanilla RTX 3080 has proved unfounded.

Last month brought rumors of an RTX 3080 Ti launching in January, and the card was one of several unreleased Ampere products spotted on HP’s OEM driver list last week. According to VideoCardz’s sources, that January launch has now been moved to February, a claim echoed by Igor Wallossek of Igor’s Lab.

Part of the reasoning behind the postponement is that AMD’s Big Navi cards aren’t as much of a competitor to the RTX 3080 as Nvidia feared. In our testing, the Radeon RX 6800 XT is just mildly ahead at 1440p across 18 games, while the RTX 3080 is the better-performing card at 4K. Even the Radeon RX 6900 XT is only faster by a few fps. As such, team green doesn’t feel the need to rush out an RTX 3080 Ti, which is said to double the standard version’s VRAM to 20GB of GDDR6X—both the RX 6800 XT and 6900 XT have 16GB of GDDR6.

Another reason behind the move could be the availability issues plaguing the current crop of Ampere cards. Nvidia has said these shortages will continue until February 2021, the same month the RTX 3080 Ti is now rumored to arrive.

Additionally, both publications believe that there will be two versions of the RTX 3060 (6GB and 12GB), both of which were also on HP’s list. The more powerful variant is due to arrive between January 11 and 14, the same time as CES, while the 6GB card is said to hit at the end of January or early February. Interestingly, that 6GB RTX 3060 model is thought to have started life as the RTX 3050 Ti before Nvidia rebranded it.


 
I'm quite tempted by a 3080Ti but I dunno, except for Cyberpunk, I still don't have a reason to replace my 1080Ti, next round maybe?

Hopefully AMD can turn up the heat on ray-tracing performance next round.

The 1080Ti still does a fine job if you don't care about ray tracing. However, an RTX 3080 is clearly a big upgrade now, especially at 4K. So I guess it comes down to whether you'd benefit, given what you play and at what resolution.

When the RTX 3000 series is in wider stock, a lot of people will move on from the GTX 1000 series, particularly those who hung on through Turing unimpressed, waiting for a worthwhile leap.

I think Ampere is a significant step now, so they will shift many 3060/3060Ti cards, particularly to those still on a GTX 1060 or GTX 1070, which is a lot of people according to the Steam survey.
 
The 1080Ti still does a fine job if you don't care about ray tracing. However, an RTX 3080 is clearly a big upgrade now, especially at 4K. So I guess it comes down to whether you'd benefit, given what you play and at what resolution.
Yeah, I'm not sold on 4K to be honest. Unless you spend a load of money on the screen, most 4K monitors have a horrible HDR implementation as well, so I'm still surprised to this day to see so many people talk about it.

I've got a decent 4K TV, but it's not quite big enough for the distance I sit at, so I'm hard pressed to see the difference between a 1080p Blu-ray and a 4K one (although a good HDR implementation on the movie does make a big difference).

Ultimately, I don't care about 4K performance; 1440p is really where it's at, and they still don't quite wow me performance-wise. It kinda feels like we're going backwards at times when you need to turn on DLSS on a 1440p monitor to play the latest games.
 
Yeah, I'm not sold on 4K to be honest. Unless you spend a load of money on the screen, most 4K monitors have a horrible HDR implementation as well, so I'm still surprised to this day to see so many people talk about it.

I've got a decent 4K TV, but it's not quite big enough for the distance I sit at, so I'm hard pressed to see the difference between a 1080p Blu-ray and a 4K one (although a good HDR implementation on the movie does make a big difference).

Ultimately, I don't care about 4K performance; 1440p is really where it's at, and they still don't quite wow me performance-wise. It kinda feels like we're going backwards at times when you need to turn on DLSS on a 1440p monitor to play the latest games.

With you there. Framerate is so much more important to me than resolution - 90+ fps at 1440p is the sweet spot.

I'm gonna hang on til the 3080Ti releases; it feels like it should be at least a 2x performance upgrade across the board compared to the 1080Ti.
 
Obviously the 6800XT is not a threat since it's not available. And prices listed aren't thrilling either.

Then again, you could say the same thing the other way around: if Nvidia can't supply the 3080, why would they be able to offer better supplies of a higher-specced version of the chip?

Neither is for me; I'm more interested in 6700 series price and availability, and 3060Ti availability.
 
I really do think this is a supply driven choice, not a competition driven one. They know they can beat the Radeon 6900 XT at $999 if they want, so why would they worry about it when they can't even keep up with demand at the current prices? I would believe this rumor more if we saw the RTX 3090 in stock anywhere near MSRP.
 
Since less than 3% of users (per the Steam hardware survey) game at 4K, it would seem advisable to also list 1080p and 1440p averages. They are available here, but for whatever reason, the decision was made not to include them:

Then the more likely reason for the Ti becomes clear.

EDIT:

[Attached: 1080p and 1440p average performance charts]
 
What I'm seeing from this is that there is no stock of current cards, so there's no point releasing another card you won't be able to buy. Makes sense to fix stock on current models. As for no competition, I'm not really sure I believe that. The 6800XT is competitive with the 3080, the 3080 Ti would compete with the 6900XT, and the 3090 would stay at the top of the stack.

The number of people I know with 4K screens is close to zero.

1080p and 1440p are the majority of displays.
 
What I'm seeing from this is that there is no stock of current cards, so there's no point releasing another card you won't be able to buy. Makes sense to fix stock on current models. As for no competition, I'm not really sure I believe that. The 6800XT is competitive with the 3080, the 3080 Ti would compete with the 6900XT, and the 3090 would stay at the top of the stack.

The number of people I know with 4K screens is close to zero.

1080p and 1440p are the majority of displays.
I agree, it's much of a muchness once you take the 4K monitors out of the equation.
 
I have a 4k TV, a 1440p monitor, and a 1200p monitor.

The 1200p is my favorite, and I still keep it connected for viewing media of any kind, because it's an Asus ProArt display and obliterates VA panels in image quality.

The 1440p does 90Hz with FreeSync, and it's great for gaming. Even then, some older games don't like running above 1080p, and many don't like running at 90Hz.

I see no reason to run at 4K. On a monitor, you have to increase the DPI scaling so you can see what you're doing, and at that point you're basically running at 1080p anyway. Games have to run at 1080p because if they do run at 4K, many don't resize the UI elements correctly, so again you can't see what you're doing.

4K is great with HDR for movies, but for games it still seems a bit of an odd man out. Ray tracing doesn't even factor in for me, because by the time it's widespread in 5 years, none of the current cards will be worth talking about.
 
Makes sense to wait. Big Navi isn't even listed at two major Canadian etailers.

I saw it listed at Canada Computers on launch day for maybe an hour. I saw two items in stock; they were both gone soon after, and then the listings for those Radeons were removed.

I have a 4k TV, a 1440p monitor, and a 1200p monitor.

The 1200p is my favorite, and I still keep it connected for viewing media of any kind, because it's an Asus ProArt display and obliterates VA panels in image quality.

The 1440p does 90Hz with FreeSync, and it's great for gaming. Even then, some older games don't like running above 1080p, and many don't like running at 90Hz.

I see no reason to run at 4K. On a monitor, you have to increase the DPI scaling so you can see what you're doing, and at that point you're basically running at 1080p anyway. Games have to run at 1080p because if they do run at 4K, many don't resize the UI elements correctly, so again you can't see what you're doing.

4K is great with HDR for movies, but for games it still seems a bit of an odd man out. Ray tracing doesn't even factor in for me, because by the time it's widespread in 5 years, none of the current cards will be worth talking about.

For me to use a 4K monitor comfortably, it would have to be 32" so I don't have to mess with scaling options and enlarging everything. My current monitor is also 1200p and I still enjoy using it. I watch movies on my TV, so there's no need to view them on my computer monitor.
 
Obviously the 6800XT is not a threat since it's not available. And prices listed aren't thrilling either.

Then again, you could say the same thing the other way around: if Nvidia can't supply the 3080, why would they be able to offer better supplies of a higher-specced version of the chip?

Neither is for me; I'm more interested in 6700 series price and availability, and 3060Ti availability.
I'm going to skip this generation entirely. Both the pricing and availability of the RDNA2 cards have been garbage. I mean, sure, the higher-resolution performance is better, but I've come across a really strange thing. I use a 55" 4K TV and gaming on it is glorious.

I wanted to see what the real difference was between 1080p, 1440p and 2160p so I ran the Far Cry 5 benchmark at all three resolutions and looked closely to see if I could really pick out the difference between them. It turns out that I couldn't which shocked the hell out of me. Maybe my eyes are getting old or maybe my TV does a really good job of upsampling, I really don't know.

I'm starting to wonder if getting my RX 5700 XT was a mistake because if I can't see the difference between 1080p and 2160p, it's very possible that I could have just stayed happy with my R9 Fury.
Since less than 3% of users (per the Steam hardware survey) game at 4K, it would seem advisable to also list 1080p and 1440p averages. They are available here, but for whatever reason, the decision was made not to include them:

Then the more likely reason for the Ti becomes clear.

EDIT:

[Attached: 1080p and 1440p average performance charts]
The reason that Steve doesn't bother with 1080p is that it doesn't make any sense to spend this much money on cards for 1080p gaming. Hell, my old R9 Fury can run Godfall at 1080p so there's no reason to upgrade if 1080p is all that you're going to do. I'd even go so far as to recommend buying used if you want to game at 1080p.
 
The 1080Ti still does a fine job if you don't care about ray tracing. However, an RTX 3080 is clearly a big upgrade now, especially at 4K. So I guess it comes down to whether you'd benefit, given what you play and at what resolution.

When the RTX 3000 series is in wider stock, a lot of people will move on from the GTX 1000 series, particularly those who hung on through Turing unimpressed, waiting for a worthwhile leap.

I think Ampere is a significant step now, so they will shift many 3060/3060Ti cards, particularly to those still on a GTX 1060 or GTX 1070, which is a lot of people according to the Steam survey.
I have an OC'd 1070 in my desktop (similar performance to a 1080) and I don't really encounter any issues at 1080p in any games. I also have an RTX 5000 in my workstation, which I do use for VR work and rendering, and at 1080p I can run everything but FS2020 at max settings.

Frankly, I have been disappointed with RT in the games I have tried. I usually can't tell any difference without freeze framing the scene and comparing RT on vs RT off.

Now, when I used to do CGI work, the effects of true RT were very impressive... however, for games, I see no strong need for it. I think it will be a few more years before it starts to make a real difference.
 
Since less than 3% of users (per the Steam hardware survey) game at 4K, it would seem advisable to also list 1080p and 1440p averages. They are available here, but for whatever reason, the decision was made not to include them:

Then the more likely reason for the Ti becomes clear.
Indeed, they're being beaten in the resolutions that actually matter. I don't understand how they could bring out a 3080Ti at 6900XT prices and have it be faster than the 3090 at 1440p without either lowering prices across the board or simply pushing the 3090 as a content-creation card.
 
"Big" navi, Not a big deal. over blown hype. that said, amd's new marketing team sure knows how to market people into a frenzy of hype.
 
I'm quite tempted by a 3080Ti but I dunno, except for Cyberpunk, I still don't have a reason to replace my 1080Ti, next round maybe?

Hopefully AMD can turn up the heat on ray-tracing performance next round.

Same boat, but since I'm moving up to a 4K 120Hz display and I desperately want VRR, I need an HDMI 2.1 GPU. That means the 1080Ti has to go.
 
Same boat, but since I'm moving up to a 4K 120Hz display and I desperately want VRR, I need an HDMI 2.1 GPU. That means the 1080Ti has to go.
You're moving up to a screen that supports 4k120 but doesn't support G-Sync?

Brings up another good point though: the 3000 series doesn't support DisplayPort 2.0. Guess that'll be on the 4000 series?
 
You're moving up to a screen that supports 4k120 but doesn't support G-Sync?

No, HDMI VRR. Which requires HDMI 2.1.

It's kind of sad that the best gaming displays are TVs now. The display market is a good 2-3 years behind at this point. It's honestly sad. Someone, seriously, make a 32" OLED please.
 
No, HDMI VRR. Which requires HDMI 2.1.

It's kind of sad that the best gaming displays are TVs now. The display market is a good 2-3 years behind at this point. It's honestly sad. Someone, seriously, make a 32" OLED please.
Speaking of displays, have you gamed comfortably via screen mirroring?
 