Nvidia GPUs with nearly 8,000 CUDA cores spotted in benchmark database (updated)

I mean, when we talk about something like the 2080 Ti, an already ridiculous card, I think obliterates is appropriate. I'd guess that the 3070 Ti version of this will likely be ~5-10% slower.
To me, obliterates would denote at least around 100%. I imagine a hammer smashing a block into pieces. At 30% to 40% faster, though, you could reasonably say it trounces. There are plenty of words to describe one besting the other. A 30% gain in just one generation is still an absolute success for the node.
 
Triple SLI actually. Maybe I'll drop in a 2080 Ti just for PhysX. You know, since it has been adopted in so many modern games.

I feel personally attacked, since I used to run SLI Titan X (Maxwell) cards with a 980 Ti for PhysX. At least I did for about two weeks, before I managed to sell the 980 Ti to fund my X34 purchase.
 
To me, obliterates would denote at least around 100%. I imagine a hammer smashing a block into pieces. At 30% to 40% faster, though, you could reasonably say it trounces. There are plenty of words to describe one besting the other. A 30% gain in just one generation is still an absolute success for the node.

90 - 100%+ - Dominates
60 - 89% - Trounces
40 - 59% - Wallops
20 - 39% - Beats
0-19% - Edges out
0% - Ties
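For fun, that scale works as a straightforward lookup. A quick sketch (the function name and threshold handling are my own; the tiers and verbs are from the post above):

```python
def performance_verb(percent_faster: float) -> str:
    """Map a percentage performance lead onto the suggested verb tiers."""
    if percent_faster >= 90:
        return "dominates"
    if percent_faster >= 60:
        return "trounces"
    if percent_faster >= 40:
        return "wallops"
    if percent_faster >= 20:
        return "beats"
    if percent_faster > 0:
        return "edges out"
    return "ties"

# A rumored 30-40% uplift lands squarely in "beats" territory.
print(performance_verb(35))
```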
 
90 - 100%+ - Dominates
60 - 89% - Trounces
40 - 59% - Wallops
20 - 39% - Beats
0-19% - Edges out
0% - Ties
Geez guys, didn't know it was such an issue! To be perfectly honest, I write so many of these articles (given tech continues to improve pretty rapidly) that my vocabulary gets tested a little. But I'll totally use some of the community's suggestions next time, and try to get the modality just right. You're welcome to suggest more.
 
About Mystery GPU 1 there, it kind of resembles the new device I got through Nvidia GeForce Now.

I was playing Destiny 2 on a free account, and when changing graphics settings I saw that the GPU it was using is an Nvidia Pascal P40 24GB, and I was amazed.

Maybe that will give you a clue. I don't really know what is going on, but I have the screenshot to prove it's true.

Thank you.
 
Take out that 2nd mortgage and you can have one in your home tomorrow ..... assuming the bank will allow you to stay in the house .......
 
I have a 2080Ti so I think I'll skip the 3080Ti and get the 4080Ti.

It's gonna be a while before any game is made that challenges the 2080 Ti at 1440p, so I don't see myself needing an upgrade till Tax Return 2025.
That's what I said when I started gaming at 1440p@100Hz in 2012 with a GTX 690. I managed to hold out until I upgraded my monitor to 1440p@144Hz with a 1080 Ti. Now a handful of 1440p 240Hz+ displays are on the horizon; once I give in and buy one, I'll have to upgrade my card (I think that'll be a 4080 Ti, unless the 3080 Ti is much more reasonably priced than I expect).
 
That's what I said when I started gaming at 1440p@100Hz in 2012 with a GTX 690. I managed to hold out until I upgraded my monitor to 1440p@144Hz with a 1080 Ti. Now a handful of 1440p 240Hz+ displays are on the horizon; once I give in and buy one, I'll have to upgrade my card (I think that'll be a 4080 Ti, unless the 3080 Ti is much more reasonably priced than I expect).

When the 3000 series hits, it will absolutely be priced above the 2000 series. The 2000 series may see some price drops, but the run on the market will be for the new cards, no different from the run on 1080 Tis when the 2080/2080 Ti released.

The difference this time is that there is no cryptocurrency market causing inflation, and the recession from the coronavirus is going to disrupt supply for a while.
 
Truly amazing numbers. For rendering, the 3xxx series already looks amazing, and this is only a Quadro rumor.

The RTX 2xxx series is not very impressive as a gaming card vs the 1xxx series, but as a render node it blows the previous generation into the middle of next week. It seems this will be another CUDA sledgehammer. I can't wait, as a 2070S already has 80% of the rendering power of two 1080 Tis.

The only scary thing: how much the RTX 3xxx will cost. Truly terrifying prospect.

One thing I would love to see come back (from the 1080 Ti) is a one-slot option (with liquid cooling). These 2.5-3 slot cards are ridiculous; we're going back, in size, to the GTX 690 or 590 and other behemoths.
 
The only scary thing: how much the RTX 3xxx will cost. Truly terrifying prospect.

So true. As much as I rain on AMD's parade for their well-deserved reputation and decade-long struggle with drivers, they are really the only true competition Nvidia has, and sadly it's not enough to force Nvidia's hand at dropping prices lately.
 
I'm calling it now, the ti card is going to be impossible to get for less than 2 grand and you'll see plenty going for over 3.
 
About Mystery GPU 1 there, it kind of resembles the new device I got through Nvidia GeForce Now.

I was playing Destiny 2 on a free account, and when changing graphics settings I saw that the GPU it was using is an Nvidia Pascal P40 24GB, and I was amazed.

Maybe that will give you a clue. I don't really know what is going on, but I have the screenshot to prove it's true.

Post that screenshot so we can find out if it's fake!
 
I have a 2080Ti so I think I'll skip the 3080Ti and get the 4080Ti.

It's gonna be a while before any game is made that challenges the 2080 Ti at 1440p, so I don't see myself needing an upgrade till Tax Return 2025.

You spent $1200 on the 2080 Ti, which is not future-proofed with HDMI 2.1 or DP 2.0, to play on a 1440p monitor? The current Xbox One X lets you play at 4K, and the next Xbox Series X has 12 TFLOPS of performance, which will let you play at 4K 120fps with variable rate shading. You should have waited for the RTX 3080 Ti, which might be cheaper and should have HDMI 2.1 to play at 4K 120fps HDR on a monitor like the Asus PG27UQX or an LG C9 55in OLED at 120Hz. What a waste of $1200.
 
You spent $1200 on the 2080 Ti, which is not future-proofed with HDMI 2.1 or DP 2.0, to play on a 1440p monitor? The current Xbox One X lets you play at 4K, and the next Xbox Series X has 12 TFLOPS of performance, which will let you play at 4K 120fps with variable rate shading. You should have waited for the RTX 3080 Ti, which might be cheaper and should have HDMI 2.1 to play at 4K 120fps HDR on a monitor like the Asus PG27UQX or an LG C9 55in OLED at 120Hz. What a waste of $1200.
(Edit: see the reply to me below.)
Not to mention Netflix, Hulu, etc. play at 720p by default on PC, 1080p with the "app", and maybe never at 4K, no matter how many hoops you jump through. But consoles will stream 4K out of the box.
 
OK, I will ask......... but can the new mystery cards play Crysis??

And furthermore, does that mean the 2080 Ti will drop to, say, $199 anytime soon???
 
You spent $1200 on the 2080 Ti, which is not future-proofed with HDMI 2.1 or DP 2.0, to play on a 1440p monitor? The current Xbox One X lets you play at 4K, and the next Xbox Series X has 12 TFLOPS of performance, which will let you play at 4K 120fps with variable rate shading. You should have waited for the RTX 3080 Ti, which might be cheaper and should have HDMI 2.1 to play at 4K 120fps HDR on a monitor like the Asus PG27UQX or an LG C9 55in OLED at 120Hz. What a waste of $1200.



I've wasted way more money than that. It just feels good to have it.
 
Nvidia may not be releasing an Ampere gaming card, just like they skipped Volta. The rumored cards in this article are for enterprise, not gaming.

Additionally, the new Xbox will have the power of a 2080 SUPER, so many people will not be upgrading their dGPU and will just get a new console instead. Or one of the new RDNA 2 gaming cards coming, since Nvidia won't have a new GPU for games until the end of the year.
 
Not to mention Netflix, Hulu, etc. play at 720p by default on PC, 1080p with the "app", and maybe never at 4K, no matter how many hoops you jump through. But consoles will stream 4K out of the box.
I have a 2080 Ti, which I got used for much less than retail 4 months ago, and the Netflix app always plays 4K; you can check the resolution by pressing Ctrl + Shift + Alt + D. You need the HEVC codec from the app store.

I will skip the 3080 Ti, as my overclocked 2080 Ti can run anything maxed at 4K 60Hz, which is the max of my monitor. Also, it will be much more expensive than the 2080 Ti because of no competition and lack of stock due to the coronavirus. I will upgrade in 5 years when I get a 4K 120Hz OLED(?) monitor.
 
Also, benchmarks always run faster than games, so those cards won't be as fast in games as these benchmark numbers suggest.
 