BoboOOZ
Posts: 63 +56
That number is announced for raytraced games and, frankly, it sounds reasonable, given that the present RT implementation is pretty poor.
Mine are based on facts, not wishful thinking. Where did you get this 40 to 75% from? The 980 is only half the speed of the 2080 on this graph, and how many generations is that...
You will be surprised. Brand new arch on 7nm TSMC = Big gains.
You sound like someone who bought 2000 series recently.
2000 series never delivered a true generational leap (and was not meant for 12nm to begin with). 3000 series will.
My 1080 Ti will be replaced by 3080/3080Ti/3090 very soon.
Nvidia doesn't need to deliver anything. They've been at the top of the performance spectrum for the past 3 generations with little to no real competition from AMD.

Even the 3080 will beat the 2080 Ti, at 500 bucks less. Over time, the 3080 will pull more and more ahead too.
You must be new at this if you think the biggest Ampere consumer card is not going to be more than 25% faster than 2080 Ti hahah.
Rumours suggest the "3090" will be 60-90% faster than the 2080 Ti, and this is also my expectation considering the new arch and 7nm/7nm+ TSMC. Huge leap incoming.
Also, AMD actually will bring a top-card this time. Meaning Nvidia needs to deliver a decent bump. You won't see a crappy 1000 -> 2000 series step this time. Mark my words. C ya here in a month or two when we know specs.
But the key will be what 1080 Tis are worth on the secondary market when you do.
Nvidia doesn't need to deliver anything. They've been at the top of the performance spectrum for the past 3 generations with little to no real competition from AMD.
It's AMD that has to deliver.
There are 2 areas that make Nvidia a lot of money:
1. enterprise GPUs (AI, automation & GPGPU-accelerated workloads)
2. enthusiast GPUs
Why do you think they've been gradually increasing enthusiast prices with each gen now for the past 3 gens?
They have no competition there so they can.
Mark my words:
- the 3080 will be similar to the 2080ti in 3D perf and they will boast about superior RT performance (like that even matters with how many games have it and their **** implementation) while the price will be just slightly cheaper than it (think $50 off)
- the 3080ti/3090 will likely be $1400 - $1500 and with a max of 25% perf over the 2080ti in regular 3D workloads
- by the time AMD comes around with their GPUs, they will have the $600-$1000 segment covered with other models
- I wouldn't be surprised if they introduce a $2000 Titan at some point
- all the typical performance improvements that you would have seen from a 5-7nm fab process will be spent on RT, so even if AMD is competitive with them in terms of perf, you will still think "hmm, but RT is way better on Nvidia"
- and let's say that, by some miracle, AMD really does come up with something amazing... they can just reduce prices and everyone will still stay with Nvidia
- given how entrenched they are, it would take AMD two generations of kicking their asses for them to get to a point where they "need to deliver a decent bump"
We will wait and see and I will remember to say I told you so when the new cards are less than 25% faster than the top 2080ti
75 % honestly
From Techpowerup's most recent GPU review:
2080 53% faster than 1080
So you decided that only 4k results matter?
Meh, I prefer the average number, much closer to the truth.
You might as well decide that you should make the difference on RT performance only, since RT is the future.

Things in the world are not that black and white, but for the same reason that all TechSpot's CPU reviews are done with a 2080 Ti to minimize the GPU bottleneck, if you want to see the capability difference between GPUs, you need to minimize the CPU bottleneck, and that's best done at 4K.
You are pretty much not supposed to buy every gen. A GPU should be a 3-5 year purchase cycle if you are buying the higher-end tier cards; people are still rocking 980 Tis, but the cheaper your card is, the more often you will end up having to replace it. Depending on how you invest, you can predict what will give you the best bang for buck. Same goes for motherboards + CPUs + RAM; granted, Intel made that cycle almost a 10-year deal thanks to the marginal improvements.

I think plenty of people skip generations, so make that 50%+. Even for those buying at the top end, say $1,000, spread over 4 years it's like $20/month. (And most people are buying at prices considerably lower down the line.)
Overall I think gaming is a high-value hobby that gives many hours of enjoyment at little cost compared to alternatives. Someone who goes for an evening out even just once more a month probably spends more.
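The "$20/month" figure above checks out with a quick back-of-the-envelope calculation (the $1,000 price and 4-year cycle are the numbers quoted in the post, not real market data):

```python
# Back-of-the-envelope cost of a top-end GPU spread over its useful life.
gpu_price = 1000.0  # USD, top-end card as quoted above
years = 4           # typical upgrade cycle for high-end buyers, per the post

monthly_cost = gpu_price / (years * 12)
print(f"${monthly_cost:.2f}/month")  # → $20.83/month
```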
You might as well decide that you should make the difference on RT performance only, since RT is the future.
The huge majority of players play at 1080p still.
Anyway, I'm not gonna make this any longer: your numbers are not the generally accepted numbers and are therefore skewed.
The few people that I know owning a 2080 Ti play at 1080p 165Hz, while good 4K monitors are practically nonexistent. Anyway, that is not the main point.

The "huge majority" of players do not use a 2080 or 2080 Ti or will buy a 3080 or 3090. But this article is specifically about the 3080 and 3090, and the discussion is about the expected improvement in games using those GPUs. People spending $1,000 on those GPUs will not often be playing at 1080p; they will play at 1440p, and if the 3080 and 3090 are as many people expect, they will also be playing at 4K.
1080p numbers are most relevant for the 2070 and below but are not the target audience for 2080 to 3090 users.
You picked one graph from one review from TechPowerUp and you present it as the representative number; it is, therefore, your number.

As for "my numbers" being "generally accepted" or not: they are not mine, they are TechPowerUp's, and they had better be accepted, or why is TPU even doing those tests?
Are you under the impression that current 2080 Ti owners will be the majority buyers of the new 3080/3090 cards?

I don't care what any of you say. All you will get is at most a 25% performance boost vs the 2080 Ti. I will tell you so when the time comes.
The few people that I know owning a 2080 Ti play at 1080p 165Hz, while good 4K monitors are practically nonexistent. Anyway, that is not the main point.
You picked one graph from one review from TechPowerUp and you present it as the representative number; it is, therefore, your number.
The generally accepted number is in the graph called relative performance on this page.
It is averaged over a large number of brands and it encompasses the results for several resolutions.

NVIDIA GeForce RTX 2080 Ti Specs — www.techpowerup.com
Yep, anyway, that was not the main point from when we started the discussion. The main point was that for the last generation the difference between the 2080 Ti and 1080 Ti was only 37%, while for the previous generation it was 68% (1080 Ti to 980 Ti). That is why some people are led to believe that this generation we'll see a similarly small progression.

Fair enough, we can agree to disagree.
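The generational gaps being argued over (37% and 68%) are just ratios of relative-performance numbers; a small helper makes the comparison explicit. The index values below are illustrative, chosen only to match the percentages quoted in the post, not actual TPU data:

```python
def uplift_pct(new_perf, old_perf):
    """Percentage performance gain of a new card over an old one."""
    return (new_perf / old_perf - 1) * 100

# Illustrative relative-performance indices matching the figures above:
# 2080 Ti over 1080 Ti (~37%), 1080 Ti over 980 Ti (~68%)
print(round(uplift_pct(137, 100)))  # → 37
print(round(uplift_pct(168, 100)))  # → 68
```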
That is off-topic, but I actually prefer ultrawide (21:9) 1440p, though I won't be getting the top card from either Nvidia or AMD; I find the performance per dollar at this level too poor. And I don't think the pricing will change, unfortunately.

It's all good; you like 1080p, I like 1440p, and 2% of gamers like 4K, even though fewer than that even use a 2080 or 2080 Ti. Have a look at your link: TPU even recommends both the 2080 and 2080 Ti for 4K gaming with a green box.
I like the logic... but your numbers don't quite add up... your $1000 is JUST the video card. The entire gaming PC costs a fair amount more - if the GPU is $1000, the PC is probably $2-3000 at the least...

Not really. You can build a high-end gaming PC with 1.5-1.6k even if the GPU costs 1k by itself. Especially if you are gaming at higher resolutions, you don't need a top-of-the-line CPU, RAM or mobo; you can buy all 3 of them for like 300-350€.
So you decided that only 4k results matter? Meh, I prefer the average number, much closer to the truth.

Yes, when you are comparing GPUs, 4K results matter, 1080p doesn't. The reason is obvious: there is CPU bottlenecking at lower resolutions. There is a reason you should be testing GPUs at high resolution and CPUs at low resolution.
Not really. You can build a high-end gaming PC with 1.5-1.6k even if the GPU costs 1k by itself. Especially if you are gaming at higher resolutions, you don't need a top-of-the-line CPU, RAM or mobo; you can buy all 3 of them for like 300-350€.

You can... but you shouldn't... and most don't... if your GPU is $1,000, you'd be a fool to buy a cheap motherboard/CPU... and even then, you still haven't accounted for the HD (SSD), monitor, speakers, case... that adds a few hundred more... $2,000 is about the bare minimum for a $1000 GPU...
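The disagreement above is really about how much budget sits around the GPU. As a sketch, here are the two builds being argued for, using only the rough figures the posters themselves give (the non-GPU line items are their estimates, not real part prices):

```python
# Rough budget comparison from the thread: the "lean" high-end build vs the
# "don't cheap out" build. All numbers are the posters' own estimates.
gpu = 1000
lean_rest = 550      # ~350 for CPU + mobo + RAM, plus a budget PSU/case/SSD
cautious_rest = 1000 # better platform, plus monitor, speakers, case, SSD

print("lean build:", gpu + lean_rest)          # → lean build: 1550
print("cautious build:", gpu + cautious_rest)  # → cautious build: 2000
```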