Nvidia's GeForce RTX 3080 and 3090 could enter mass production in August


Mine are based on facts, not wishful thinking. Where did you get this 40 to 75% from? The 980 is only half the speed of the 2080 on this graph, and how many generations is that?

Cherry pick much?
 
You will be surprised. Brand new arch on 7nm TSMC = Big gains.

You sound like someone who bought 2000 series recently.

2000 series never delivered a true generational leap (and was not meant for 12nm to begin with). 3000 series will.

My 1080 Ti will be replaced by 3080/3080Ti/3090 very soon.

But the key question will be: what will 1080 Tis be worth on the secondary market when you do?
 
Even the 3080 will beat 2080 Ti, at 500 bucks less. Over time, 3080 will pull more and more ahead too.

You must be new at this if you think the biggest Ampere consumer card is not going to be more than 25% faster than 2080 Ti hahah.

Rumours suggest the "3090" will be 60-90% faster than the 2080 Ti, and this is also my expectation considering the new arch and 7nm/7nm+ TSMC. Huge leap incoming.

Also, AMD will actually bring a top card this time, meaning Nvidia needs to deliver a decent bump. You won't see a crappy 1000 -> 2000 series step this time. Mark my words. C ya here in a month or two when we know specs.
Nvidia doesn't need to deliver anything. They've been at the top of the performance spectrum for the past 3 generations with little to no real competition from AMD.
It's AMD that has to deliver.

There are 2 areas that make Nvidia a lot of money:
1. enterprise GPUs (AI, automation & GPGPU-accelerated workloads)
2. enthusiast GPUs

Why do you think they've been gradually increasing enthusiast prices with each gen now for the past 3 gens?
They have no competition there so they can.

Mark my words:
- the 3080 will be similar to the 2080ti in 3D perf and they will boast about superior RT performance (like that even matters with how many games have it and their **** implementation), while it will be only slightly cheaper than the 2080ti (think $50 off)
- the 3080ti/3090 will likely be $1400 - $1500 and with a max of 25% perf over the 2080ti in regular 3D workloads
- by the time AMD comes around with their GPUs, they will have the $600-$1000 segment covered with other models
- I wouldn't be surprised if they introduce a $2000 Titan at some point
- all the typical performance improvements that you would have seen from a 5-7nm fab process will be spent on RT, so even if AMD is competitive with them in terms of perf, you will still think "hmm, but RT is way better on nvidia"
- and let's say that, by some miracle, AMD really does come up with something amazing... they can just reduce prices and everyone will still stay with nvidia
- given how entrenched they are, it would take AMD two generations of kicking their asses for them to get to a point where they "need to deliver a decent bump"
 
Nvidia doesn't need to deliver anything. They've been at the top of the performance spectrum for the past 3 generations with little to no real competition from AMD.
It's AMD that has to deliver.

There are 2 areas that make Nvidia a lot of money:
1. enterprise GPUs (AI, automation & GPGPU-accelerated workloads)
2. enthusiast GPUs

Why do you think they've been gradually increasing enthusiast prices with each gen now for the past 3 gens?
They have no competition there so they can.

Mark my words:
- the 3080 will be similar to the 2080ti in 3D perf and they will boast about superior RT performance (like that even matters with how many games have it and their **** implementation), while it will be only slightly cheaper than the 2080ti (think $50 off)
- the 3080ti/3090 will likely be $1400 - $1500 and with a max of 25% perf over the 2080ti in regular 3D workloads
- by the time AMD comes around with their GPUs, they will have the $600-$1000 segment covered with other models
- I wouldn't be surprised if they introduce a $2000 Titan at some point
- all the typical performance improvements that you would have seen from a 5-7nm fab process will be spent on RT, so even if AMD is competitive with them in terms of perf, you will still think "hmm, but RT is way better on nvidia"
- and let's say that, by some miracle, AMD really does come up with something amazing... they can just reduce prices and everyone will still stay with nvidia
- given how entrenched they are, it would take AMD two generations of kicking their asses for them to get to a point where they "need to deliver a decent bump"

Yes, Nvidia needs to deliver a decent bump. Big Navi is coming soon, and next-gen consoles too. Big Navi is out before these consoles, per AMD's official statement. The Xbox Series X already showcased 2080 Ti performance in a Halo game.

The gaming/consumer segment still accounts for 50%+ of Nvidia's income. No, they are not downplaying this market.

95% or more of consumer GPUs sold are sub-$500. This is why AMD is still competitive in the GPU market, since custom 5700 XTs rival the 2060 Super/2070/2070 Super in many games. Big Navi will deliver roughly 2x the perf, beating the 2080 Ti easily.

AMD overtook Nvidia in GPU shipments in Q4 2019.
That's the reason for the Super releases.

The Super releases were a stop-gap solution. Ampere is NEEDED soon and it will have to deliver. I'm 100% sure you won't see a mediocre release again. The 2000 series never sold well, compared to the 1000 and 900 series that is.

So you pay 800-1000 bucks for 25% more perf going from 3080 to 3090? Sounds legit :D
 
We will wait and see, and I will remember to say I told you so when the new cards are less than 25% faster than the top 2080 Ti.
75%, honestly.

From Techpowerup's most recent GPU review:

2080 53% faster than 1080
2080Super 62% faster than 1080
2080Ti 48% faster than 1080Ti

From Techpowerup's Pascal reviews:

1080 72% faster than 980
1080Ti 85% faster than 980Ti

25% is unlikely. 75% is wishful thinking but *did* happen the last time a die shrink and arch change happened. We shall see...
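For anyone checking the math: those "X% faster" figures fall straight out of TPU's relative-performance index, where each card gets a score and the uplift is just the ratio of the two scores. A minimal sketch in Python (the index values here are made-up placeholders, not TPU's actual chart data):

```python
# Convert relative-performance index values into "X% faster" figures.
# The numbers below are illustrative placeholders, not TPU's real chart data.

def uplift_percent(new_index: float, old_index: float) -> float:
    """How much faster the newer card is, as a percentage of the older card."""
    return (new_index / old_index - 1) * 100

# Hypothetical 4K indices, older card normalised to 100
indices = {"GTX 1080": 100.0, "RTX 2080": 153.0}

print(f"{uplift_percent(indices['RTX 2080'], indices['GTX 1080']):.0f}% faster")
# -> 53% faster
```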
 
I like to use TPU's 4K numbers (most GPU-bound) from their most recent GPU review, in this case the 5600XT. The numbers you link to are at 1080p and totally legit, but I prefer to isolate the GPU as much as possible so I go for the 4K numbers.


Numbers taken directly from this chart:

[Chart: TechPowerUp relative performance, 3840x2160]
 
So you decided that only 4k results matter?
Meh, I prefer the average number, much closer to the truth.
 
So you decided that only 4k results matter?
Meh, I prefer the average number, much closer to the truth.

Yes.

Only 4K results matter.

In fact nothing else matters, not even the CPU.

/s

Things in the world are not that black and white, but for the same reason that all TechSpot's CPU reviews are done with a 2080Ti to minimize the GPU bottleneck, if you want to see the capability difference between GPUs, you need to minimize the CPU bottleneck and that's best done at 4K.

And remember that we're talking about the 3080 and 3090 GPUs coming out, using previous generational improvements as a guide. These are 1440p minimum and 4K ideal GPUs so looking at the 4K numbers is relevant here. You could argue that 1440p numbers are equally as relevant but IMO 1080p numbers for a $700-1200 GPU are not nearly as relevant.
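A toy way to see why 4K isolates the GPU: the delivered frame rate is roughly capped by whichever of the CPU or GPU is slower at that resolution. The numbers below are invented purely for illustration, not measurements:

```python
# Toy bottleneck model: delivered fps ~= min(CPU fps cap, GPU fps at that res).
# All figures are invented for illustration only.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

cpu_cap = 160.0                          # frames/s the CPU can prepare
gpu_fast = {"1080p": 240.0, "4K": 80.0}  # hypothetical faster GPU
gpu_slow = {"1080p": 180.0, "4K": 60.0}  # hypothetical slower GPU

for res in ("1080p", "4K"):
    fast = delivered_fps(cpu_cap, gpu_fast[res])
    slow = delivered_fps(cpu_cap, gpu_slow[res])
    print(f"{res}: {fast:.0f} vs {slow:.0f} fps -> measured gap {fast / slow - 1:.0%}")

# 1080p: both cards sit at the 160 fps CPU cap, so the measured gap is 0%
# 4K: 80 vs 60 fps, so the full 33% GPU difference shows up
```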
 
Things in the world are not that black and white, but for the same reason that all TechSpot's CPU reviews are done with a 2080Ti to minimize the GPU bottleneck, if you want to see the capability difference between GPUs, you need to minimize the CPU bottleneck and that's best done at 4K.
You might as well decide that you should make the difference on RT performance only, since RT is the future.
The huge majority of players play at 1080p still.
Anyways, I'm not gonna make this any longer: your numbers are not the generally accepted numbers, and they are therefore skewed.
 
I think plenty of people skip generations, so make that 50%+. Even for those buying at the top end, say $1,000, spread over 4 years it's like $20/month. (And most people are buying at prices considerably lower down the line.)
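For reference, the per-month figure is just the card's price spread over the ownership window; a quick back-of-the-envelope sketch, ignoring resale value:

```python
# Back-of-the-envelope cost of ownership for a top-end card. Illustrative only.
price_usd = 1000      # purchase price
years_kept = 4        # upgrade cycle assumed above

monthly_cost = price_usd / (years_kept * 12)
print(f"${monthly_cost:.2f} per month")   # -> about $20.83 per month
```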

Overall I think gaming is a high-value hobby that gives many hours of enjoyment at little cost compared to alternatives. Someone who goes for an evening out even just once more a month probably spends more.
You are pretty much not supposed to buy every gen. A GPU should be a 3-5 year purchase cycle if you are buying the higher-end tier cards; people are still rocking 980 Tis. The cheaper your card is, though, the more often you will end up having to replace it. Depending on how you invest, you can predict what will give you the best bang for the buck. The same goes for motherboards + CPUs + RAM, although granted, Intel made that cycle almost a 10-year deal thanks to the marginal improvements.
 
You might as well decide that you should make the difference on RT performance only, since RT is the future.
The huge majority of players play at 1080p still.
Anyways, I'm not gonna make this any longer: your numbers are not the generally accepted numbers, and they are therefore skewed.

The "huge majority" of players do not use a 2080 or 2080Ti or will buy a 3080 or 3090. But this article is specifically about the 3080 and 3090 and the discussion is about the expected improvement in games using those GPUs. People spending $1000 on those GPUs will not often be playing at 1080p, they will play at 1440p and if the 3080 and 3090 are as many people expect, they will also be playing at 4K.

1080p numbers are most relevant for the 2070 and below, but 1080p players are not the target audience for the 2080 through 3090.

As for "my numbers" being "generally accepted" or not. They are not mine, they are TechPowerUp's and they had better be accepted or why is TPU even doing those tests?
 
I don't care what any of you say.
All you will get is at most a 25% performance boost vs the 2080 Ti.
I will tell you so when the time comes.
 
The "huge majority" of players do not use a 2080 or 2080Ti or will buy a 3080 or 3090. But this article is specifically about the 3080 and 3090 and the discussion is about the expected improvement in games using those GPUs. People spending $1000 on those GPUs will not often be playing at 1080p, they will play at 1440p and if the 3080 and 3090 are as many people expect, they will also be playing at 4K.

1080p numbers are most relevant for the 2070 and below, but 1080p players are not the target audience for the 2080 through 3090.
The few people I know who own a 2080 Ti play at 1080p 165 Hz, while good 4K monitors are practically nonexistent. Anyways, that is not the main point.
As for "my numbers" being "generally accepted" or not. They are not mine, they are TechPowerUp's and they had better be accepted or why is TPU even doing those tests?
You picked one graph from one review from TechPowerUp and present it as the representative number; it is, therefore, your number.

The generally accepted number is in the graph called relative performance on this page.
It is averaged over a large number of brands and it encompasses the results for several resolutions.
 
I don't care what any of you say.
All you will get is at most a 25% performance boost vs the 2080 Ti.
I will tell you so when the time comes.
Are you under the impression that current 2080ti owners will be the majority buyers of the new 3080/3090 cards?

Again, I might be biased by my own situation, but I'd expect more of the demand to come from folks on prior generations. Maybe even particularly so this time around, if people felt underwhelmed by the 2080 etc.
 
The few people I know who own a 2080 Ti play at 1080p 165 Hz, while good 4K monitors are practically nonexistent. Anyways, that is not the main point.

Fair enough, we can agree to disagree. From the Steam Survey we can get some rough numbers about this however:

1440p: 6%
4K: 2%

Not much. But:

2080: 0.95%
2080 Ti: 0.8%

It's reasonable to think that many or most of the 2% of gamers at 4K are using a 2080 or 2080 Ti, and are at least considering the 3080 or 3090. Same goes for those 1440p players. I will seriously consider the 3080 as I'm currently using a 1080 at 1440p.
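As a quick sanity check on those shares (a rough sketch, and of course not every 2080-class owner games at 4K):

```python
# Steam Survey shares quoted above, in percent of surveyed players.
share_4k = 2.0          # playing at 3840x2160
share_2080 = 0.95       # RTX 2080 owners
share_2080_ti = 0.8     # RTX 2080 Ti owners

high_end_share = share_2080 + share_2080_ti
print(f"2080-class cards: {high_end_share:.2f}% vs 4K players: {share_4k:.1f}%")
# -> 2080-class cards: 1.75% vs 4K players: 2.0%
```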

You picked one graph from one review from TechPowerUp and present it as the representative number; it is, therefore, your number.

The generally accepted number is in the graph called relative performance on this page.
It is averaged over a large number of brands and it encompasses the results for several resolutions.

That graph is their average for *all* their cards, not just one card as you imply. Same website, same testers, same data pool. It encompasses the very same cards that you refer to at 1080p, just tested at 4K.

It's all good; you like 1080p, I like 1440p, and 2% of gamers like 4K even though fewer than that even use a 2080 or 2080 Ti. Have a look at your link: TPU even recommends both the 2080 and 2080 Ti for 4K gaming with a green box.
 
Fair enough, we can agree to disagree.
Yepp, anyways, that was not the main point from when we started the discussion. The main point was that for the last generation the difference between the 2080 Ti and 1080 Ti was only 37%, while for the previous generation it was 68% (1080 Ti to 980 Ti). That is why some people are led to believe that this generation we'll see a similarly small progression.
I am not of this opinion; all good leaks point to Nvidia having taken the AMD threat very seriously, and to a larger gap this time. It will be larger in rasterized gaming, and even larger in RT gaming.
It's all good; you like 1080p, I like 1440p, and 2% of gamers like 4K even though fewer than that even use a 2080 or 2080 Ti. Have a look at your link: TPU even recommends both the 2080 and 2080 Ti for 4K gaming with a green box.
That is off-topic, but I actually prefer ultrawide (21:9) 1440p. I won't be getting the top card from either Nvidia or AMD, though; I find the performance per dollar at that level too poor. And I don't think the pricing will change, unfortunately.
 
We will wait and see, and I will remember to say I told you so when the new cards are less than 25% faster than the top 2080 Ti.
75%, honestly.

I didn't say 75% across the board, but it will easily be more like 50% on average for the 3090 vs the 2080 Ti. Hell, the 3080 will be on par with, if not faster than, the 2080 Ti, and if you do ray tracing it will crush the 2080 Ti.
 
I like the logic... but your numbers don't quite add up... your $1000 is JUST the video card. The entire gaming PC costs a fair amount more - if the GPU is $1000, the PC is probably $2-3000 at the least...
Not really. You can build a high-end gaming PC for 1.5-1.6k even if the GPU costs 1k by itself. Especially if you are gaming at higher resolutions, you don't need a top-of-the-line CPU, RAM, or mobo; you can buy all three for around 300-350€.
 
So you decided that only 4k results matter?
Meh, I prefer the average number, much closer to the truth.
Yes, when you are comparing GPUs, 4K results matter and 1080p results don't. The reason is obvious: there is CPU bottlenecking at lower resolutions. There is a reason you should be testing GPUs at high resolution and CPUs at low resolution.
 
I don't care what any of you say.
All you will get is at most a 25% performance boost vs the 2080 Ti.
I will tell you so when the time comes.

Hahaha, you bought the 2000 series for sure :joy: One of the worst generations Nvidia ever released. You will see a FAR bigger jump than 25%. Let's talk again when the first benchmarks are out.
 
Not really. You can build a high-end gaming PC for 1.5-1.6k even if the GPU costs 1k by itself. Especially if you are gaming at higher resolutions, you don't need a top-of-the-line CPU, RAM, or mobo; you can buy all three for around 300-350€.
You can... but you shouldn't... and most don't... if your GPU is $1,000, you'd be a fool to buy a cheap motherboard/CPU... and even then, you still haven't accounted for the HD (SSD), monitor, speakers, case... that adds a few hundred more... $2,000 is about the bare minimum for a $1,000 GPU...
 