Nvidia GPUs with nearly 8,000 CUDA cores spotted in benchmark database (updated)

I have a 2080 Ti, which I got used for much less than retail four months ago, and the Netflix app always plays 4K; you can check the resolution by pressing Ctrl + Shift + Alt + D. You need the HEVC codec from the app store.

I will skip the 3080Ti, as my OCed 2080Ti can run anything maxed at 4K 60Hz, which is the max of my monitor. It will also be much more expensive than the 2080Ti because of no competition and a lack of stock due to the coronavirus. I will upgrade in five years when I get a 4K 120Hz OLED(?) monitor.

By much less do you mean $1100? I don't think the 3080 Ti will be as expensive if AMD and Intel start eating at Nvidia's main cash generator, the low-end GPUs. The most popular graphics cards are the GTX 1050 and GTX 1060. Nvidia sells way fewer 2080 Tis, and the profit margins are much lower for the higher-end cards. If you look at Nvidia's financial statements, their gaming revenue has gone down drastically, not including their data center revenue, especially because of their ridiculous prices. And with increased competition from the Xbox Series X, PlayStation 5, and AMD Navi, I highly doubt they can continue to charge ridiculous prices.
 
By much less do you mean $1100? I don't think the 3080 Ti will be as expensive if AMD and Intel start eating at Nvidia's main cash generator, the low-end GPUs. The most popular graphics cards are the GTX 1050 and GTX 1060. Nvidia sells way fewer 2080 Tis, and the profit margins are much lower for the higher-end cards. If you look at Nvidia's financial statements, their gaming revenue has gone down drastically, not including their data center revenue, especially because of their ridiculous prices. And with increased competition from the Xbox Series X, PlayStation 5, and AMD Navi, I highly doubt they can continue to charge ridiculous prices.
I hope prices come down. I got my Asus Strix OC 2080Ti for CA$1,500 total four months ago from eBay, and it's now selling for about CA$2,000 with taxes. Yep, we pay 15% tax here in Quebec, Canada, and that sucks. It's often cheaper to buy from the US, as the taxes are more than the import fees.
 
You spent $1200 on the 2080ti, which is not future-proofed with HDMI 2.1 or....

Netflix streams 4K with a GeForce RTX 2080 Ti, and it will with a GTX 960!

Technology Support
Hardware Accelerated Ray Tracing
NVIDIA® GeForce Experience
NVIDIA Ansel
NVIDIA® Highlights
NVIDIA G-SYNC™
Game Ready Drivers
Microsoft® DirectX® 12 API, Vulkan API, OpenGL 4.6

DING ~> DisplayPort 1.4, HDMI 2.0b

DING ~> HDCP 2.2


Officially it needs a 7th-gen Intel CPU, but older processors work, and AMD's work too. You have to use Edge or the app. Netflix doesn't stream 4K to Macs; they have to run Windows in a VM.
 
By much less do you mean $1100? I don't think the 3080 Ti will be as expensive if AMD and Intel start eating at Nvidia's main cash generator, the low-end GPUs. The most popular graphics cards are the GTX 1050 and GTX 1060. Nvidia sells way fewer 2080 Tis, and the profit margins are much lower for the higher-end cards. If you look at Nvidia's financial statements, their gaming revenue has gone down drastically, not including their data center revenue, especially because of their ridiculous prices. And with increased competition from the Xbox Series X, PlayStation 5, and AMD Navi, I highly doubt they can continue to charge ridiculous prices.

GN did a poll (and do note it won't be super accurate but they did conduct it with some controls) on this: https://www.gamersnexus.net/industry/3393-rtx-vs-gtx-sales-what-our-viewers-bought

The 2080 Ti did not sell anywhere near the level the 1080 Ti did. We are talking 2% for the 2080 Ti vs 64.6% for the 1080 Ti (although I suspect a portion of this is due to GN's userbase). Other RTX cards have not done as well as their last-gen counterparts either.

Nvidia failed to meet its goals the same quarter it launched the RTX series as well:


In short, Nvidia is pricing its own customers out of the market. There are always some PC gamers who will pay anything for the top-end cards, but that's a tiny fraction of the market. I really don't see another price increase going well for them either. Forcing current 1080 Ti owners to now have to buy perhaps only a 3070-level card is insulting. It's like "well, you can only afford mid-range now". That's not something that gets people excited; it just pisses them off.
 
OK, I will ask.........but can the new mystery cards play Crysis??

And furthermore, does that mean the 2080TI will drop to, say, $199 anytime soon???
I read up recently on this 'But can it run Crysis' meme and there's apparently still some quality headroom left on that game. I missed it during its heyday, so it's still new to me.
 
I have a 2080 Ti, which I got used for much less than retail four months ago, and the Netflix app always plays 4K; you can check the resolution by pressing Ctrl + Shift + Alt + D. You need the HEVC codec from the app store.

I will skip the 3080Ti, as my OCed 2080Ti can run anything maxed at 4K 60Hz, which is the max of my monitor. It will also be much more expensive than the 2080Ti because of no competition and a lack of stock due to the coronavirus. I will upgrade in five years when I get a 4K 120Hz OLED(?) monitor.
Yeah, I've done all that. It looks like it streams 4K on my 1440p monitor if I up my desktop res to 4K. Playing bitrate 3840x2160. Altered Carbon also seems to have a 1440p playing bitrate. I had to wait like 15 seconds to see it start to move up. Windows 7 can't do 4K, etc. Sad.
Free HEVC

Complete article

p.s. I have no idea if it's pointless to upscale to 4k on a 1440p monitor.
 
I have a 2080Ti so I think I'll skip the 3080Ti and get the 4080Ti.

It's gonna be a while before any game is made to challenge the 2080Ti at 1440p so I don't see myself needing an upgrade till Tax Return 2025
Same here, but with my current 1080Ti, which will be very welcome in my old Mac Pro this year! Unless Intel and AMD can come up with a good plan and finally challenge Nvidia.
 
I read up recently on this 'But can it run Crysis' meme and there's apparently still some quality headroom left on that game. I missed it during its heyday, so it's still new to me.
There will always be headroom, because CRYSIS is terribly optimized.
nVidia may not be releasing an Ampere gaming card, just like they skipped Volta. The rumored cards in this article are for enterprise, not gaming.

Additionally, the new Xbox will have the power of the 2080 SUPER, so many people will not be upgrading their dGPU and will just get a new console instead, or one of the new RDNA2 gaming cards coming, since nVidia won't have a new GPU for games until the end of the year.
FP rate != framerate. You'd think people would figure that out by now. You really think PC players are gonna switch to consoles because they have RDNA2? You'd have to be totally delusional to think that. Totally different markets. Most PC players do not want the closed console system, and most console players don't want to venture into the big world of PCs.

As for RDNA2, what makes you think that big card is coming? Hell, what makes you think Nvidia won't release until the end of the year? Nvidia has historically released their high-end cards at the tail end of the 2nd quarter, followed by a slow rollout of mid-range cards. I'd bet we see the 3080 before the end of June, or at the latest the end of July. As for AMD, God only knows; there were plenty of rumors of an RX 490 and RX 590, the actual 590 wasn't the rumored 590, etc. Big Navi may never come out, or will be for enterprise only, just like so many think Ampere will be.

Of course, if RDNA2 does turn out to be this massive leap in power, PC users will get it too, without the TDP or silicon limits that consoles impose, so the Xbone Max or whatever it's called will still end up on the low end of the totem pole like it always does.
 
Same here, but with my current 1080Ti, which will be very welcome in my old Mac Pro this year! Unless Intel and AMD can come up with a good plan and finally challenge Nvidia.
I don't know about Intel, but AMD is probably part of the reason nVidia may be getting aggressive with the performance boost supposedly included in this next generation.

Competition is always a good thing; however, like Intel set the price bar for HEDT CPUs, nVidia has set it for GPUs. If AMD comes out with anything that can keep pace with the highest-end nVidia GPUs, expect them to be as costly as nVidia's if not more so.
 
I hate when we see words like “Obliterates” to express a 30-40% increase....

There has to be a better descriptive word
30-40% seems like a pretty substantial jump to me. How is that not "obliterates"? If there is no point side-by-siding the cards, what would you call it?
 
30-40% seems like a pretty substantial jump to me. How is that not "obliterates"? If there is no point side-by-siding the cards, what would you call it?

"destroy utterly; wipe out. "

"cause to become invisible or indistinct; blot out. "

Those are the definitions of obliterate. A card that is 30-40% faster in no way meets either of those. The 980 Ti, which was 68% slower than the 1080 Ti, was still an excellent card when the 10xx series launched.

Sure, it was no longer high end, but it was definitely not blotted out. This is why I'd reserve that word for 90-100%+. The word implies that it utterly destroys something into irrelevance.

I'm not saying everyone has to have the same definition of a word, as I'd rather not be a vocab Nazi, but you are far off on the undertones of the word here. It has a very strong meaning.
 
"destroy utterly; wipe out. "

"cause to become invisible or indistinct; blot out. "

Those are the definitions of obliterate. A card that is 30-40% faster in no way meets either of those. The 980 Ti, which was 68% slower than the 1080 Ti, was still an excellent card when the 10xx series launched.

Sure, it was no longer high end, but it was definitely not blotted out. This is why I'd reserve that word for 90-100%+. The word implies that it utterly destroys something into irrelevance.

I'm not saying everyone has to have the same definition of a word, as I'd rather not be a vocab Nazi, but you are far off on the undertones of the word here. It has a very strong meaning.
You can't just look at the 980Ti in isolation when you make that claim though. You have to look at it NEXT to the 1080Ti, which indeed makes it irrelevant in high-end benchmarks.

That's like saying "if I pick a **** enough benchmark, my old card does great" rather than "if I pick a future-leaning benchmark in which the 1080Ti looks good"...

PS: I do love the 980Ti - I run one. But I'm under no illusions. 68% slower is absolutely obliterated, sorry. 30-40% can't be far off...
 
I read up recently on this 'But can it run Crysis' meme and there's apparently still some quality headroom left on that game. I missed it during its heyday, so it's still new to me.

The game was released in 2007 to great acclaim with amazing graphics (even today they look great!). I was able to play it on very decent / high settings with my 8800 GTX.

It's definitely a great game. Here are some fun facts (there are, btw, 5 Crysis games!):
 
You can't just look at the 980Ti in isolation when you make that claim though. You have to look at it NEXT to the 1080Ti, which indeed makes it irrelevant in high-end benchmarks.

That's like saying "if I pick a **** enough benchmark, my old card does great" rather than "if I pick a future-leaning benchmark in which the 1080Ti looks good"...

PS: I do love the 980Ti - I run one. But I'm under no illusions. 68% slower is absolutely obliterated, sorry. 30-40% can't be far off...

You seem to be under the impression that only the absolute top end cards with the highest performance matter. You ignore the fact that

1) Not everyone is in a position to benefit from said increased performance. It's no surprise that not everyone has a high-resolution monitor and only plays the latest games. In fact, I can't remember the last time I played a game that required the full power of my 1080 Ti.
2) The 1080 Ti did not fundamentally change anything. It just brought extra performance. Otherwise, the 980 Ti can do everything the 1080 Ti can.

The 1080 Ti was launched and it changed literally nothing for 980 Ti owners. The extra performance was there if they wanted to buy a 1080 Ti but absolutely nothing about the 1080 Ti and upcoming games rendered that card irrelevant.

In order to use such strong words you need to pass a threshold of benefits gamers are going to see. Go and look at 980 Ti gameplay and then 1080 Ti gameplay and tell me the 980 Ti is irrelevant. I'd bet that in fact both are going to look fantastic though.
 
You seem to be under the impression that only the absolute top end cards with the highest performance matter.

Millions of console gamers know nothing about high-performance PC graphics cards; high-cost PC components are of little or no importance to them. They shouldn't have to know; esoteric high-dollar PCs are fringe. Gaming isn't: it's spread across all devices, mobiles, handhelds/tablets, consoles, and a lot of 500-600 dollar PC boxes. Those are everywhere; $3000 - $6000 computers are not.

In order to use such strong words you need to pass a threshold of benefits gamers are going to see. Go and look at 980 Ti gameplay and then 1080 Ti gameplay and tell me the 980 Ti is irrelevant. I'd bet that in fact both are going to look fantastic though.

As good a performer as the 980 is, it is absolutely, unquestionably OBLITERATED by a 1080. Titanium version. Hands down, across the board, in all resolutions from 1080, 1440, to 4our K. I own one vanilla 1080 but TIs are better. The comparison of the 980 vs 1080 marches across the top three superlative terms honorably posted, ITT.

A 1080 TI Wallops, Trounces, and Dominates {{D o M I N a T e S }} a GTX 980 TI -- by 90 - 100%, by 60 - 89%, by 40 - 59%, and more. The 2080 steps up to a 1080 and takes a good swing, but it's not even close to the margin of victory that the Hero of Graphics, a toastin' and jammin' GeForce GTX 1080 Ti - which absolutely trounces its pathetic groveling competition - holds.

Braggin' done right! RISC YOU EEIES ON PEECEE MaSTeR CMPONENT!!!!



It
ain't
called the
1080 TI BEAST
for nothin'.
 
"Obliterates" is too harsh a word. Usually it's just about 20% faster. As usual.
 
Nvidia has historically released their high-end cards at the tail end of the 2nd quarter, followed by a slow rollout of mid-range cards.

2080 Ti: September 2018 (not tail end of Q2)
1080 Ti: March 2017 (not tail end of Q2)
980 Ti: May 2015 (hey, indeed tail end of Q2: good for you! This has to be the one you remembered...)
780 Ti: November 2013 (not tail end of Q2)

Is it that difficult to check the facts?
 
"Obliterates" is too harsh a word. Usually it's just about 20% faster. As usual.
In the benchmark, the best card almost doubled the 2080Ti - and the article claims that it’s probably an underperforming sample... I think that qualifies as “obliterates”!!
 
"Obliterates" is too harsh a word. Usually it's just about 20% faster. As usual.
2080 Ti -> 130,000
New card 1 -> 222,000
New card 2 -> 184,000
New card 3 -> 164,000
About 20% faster you say? I think your math skills need a bit of help.

Even if we presume that the top card will become the next Titan and the second card in the list will become the next Ti, that's 180,000, which is still much more than a 20% improvement.
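
If anyone wants to check the arithmetic, here's a minimal sketch in Python using the scores quoted above (the 180,000 "next Ti" figure is the previous poster's estimate, not a leaked result):

# Percent uplift over the 2080 Ti's score of 130,000
baseline = 130_000
scores = {
    "New card 1": 222_000,
    "New card 2": 184_000,
    "New card 3": 164_000,
    "Next-Ti estimate": 180_000,  # previous poster's assumption, not a leak
}
for name, score in scores.items():
    print(f"{name}: +{(score / baseline - 1) * 100:.1f}%")
# Prints roughly +70.8%, +41.5%, +26.2%, and +38.5% -- all well past 20%.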
 
Since the last few (read: most) comments aren't particularly on topic, let's try to bring it back into focus.

At a guess, I'd say that this trio are prototypes of next-generation Quadro flagships. Their respective memory capacities of 48 GB, 32 GB, and 24 GB exclude them from being gaming cards.
It's also possible that the mystery GPU is a GV100 (Volta) replacement, in which case it would be a Tesla model, rather than a GeForce or Quadro. The V100 add-in card has a base clock of 1.23 GHz and the HBM2 is clocked to 0.876 GHz, so the indicated clocks in the unknown GPU aren't wildly different from these.

Just for reference, this is the same test done using a Titan X (Pascal):


Finding a random Tesla V100 result that was done using a 6-core/12-thread CPU gives:


The results are a little down on the unnamed device in some areas, ahead in others:


Sobel 46 vs 69.3 Gpixels/sec
Canny 7.39 vs 12.0 Gpixels/sec
Stereo Matching 890.4 vs 873.2 Gpixels/sec
Histogram Eq 26.0 vs 30.5 Gpixels/sec
Gaussian 10.6 vs 16.2 Gpixels/sec
DoF 5.85 vs 7.38 Gpixels/sec
Face Detection 302.9 vs 307.0 images/sec
Horizon Detection 3.68 vs 5.44 Gpixels/sec
Feature Matching 0.899 vs 5.44 Gpixels/sec
Particle Physics 23082 vs 19714 fps
SFFT 1.86 vs 2.60 Tflops

The Feature Matching results are vastly different, though. Here's another V100 result, but this time with a far larger CPU (48 cores):


It would seem that the CPU, unfortunately, has a significant impact on the CUDA Compute results, making it difficult to properly compare these new findings.
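
For anyone who wants to eyeball the gap per sub-test, here's a minimal sketch using the figures listed above. I'm assuming the first number in each pair is the V100 and the second is the unnamed GPU, which is how I read the post; a ratio above 1.0 means the unnamed GPU is ahead.

# Ratio of the unnamed GPU's sub-score to the V100's (values copied from the list above)
results = {
    "Sobel":             (46.0,   69.3),   # Gpixels/sec
    "Canny":             (7.39,   12.0),
    "Stereo Matching":   (890.4,  873.2),
    "Histogram Eq":      (26.0,   30.5),
    "Gaussian":          (10.6,   16.2),
    "DoF":               (5.85,   7.38),
    "Face Detection":    (302.9,  307.0),  # images/sec
    "Horizon Detection": (3.68,   5.44),
    "Feature Matching":  (0.899,  5.44),
    "Particle Physics":  (23082,  19714),  # fps
    "SFFT":              (1.86,   2.60),   # Tflops
}
for test, (v100, unnamed) in results.items():
    print(f"{test}: {unnamed / v100:.2f}x")
# Most tests land between about 1.0x and 1.6x, Stereo Matching and Particle
# Physics dip below 1.0x, and Feature Matching is the big outlier at ~6x.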
 