Next-gen graphics cards rumored to offer significant leap in performance

VR still struggles on the best of the current cards so I do see the need for them.
I agree with you that most people don't need anything like what these cards will be able to deliver.
Good thing these $1000+ cards aren't just for games, considering the 4090 is rumoured to have 48GB. GPUs, just like CPUs, are for more than just games, as clearly noted by the addition of Studio drivers for GTX/RTX cards. The RTX x090 is just the new "Titan". No one should be surprised they push the envelope.

"NVIDIA Studio Drivers provide artists, creators and 3D developers the best performance and reliability when working with creative applications. To achieve the highest level of reliability, Studio Drivers undergo extensive testing against multi-app creator workflows and multiple revisions of the top creative applications from Adobe to Autodesk and beyond."

Applications
Provides the optimal experience for the latest releases of top creative apps, including Autodesk Maya 2019, 3ds Max 2020, Arnold 5.3.1.0, Blackmagic Design DaVinci Resolve 16, and Daz 3D Daz Studio.
 
By the time Intel releases the Alchemist GPUs, they won't be much better than using integrated graphics. They are quickly getting left in the dust! With the recent improvements in prices and availability, Intel is going to have to sell these for less than cost to get any takers.
 
Again, if this is true, both companies are completely failing to read the room regarding the power efficiency of their cards, as if they are completely oblivious to the rampaging energy costs across many areas of the world. It seems obscene to think we could soon have graphics cards that draw the same power as kitchen appliances. We already try to buy the most efficient appliances we can, since a lot of them are required for day-to-day life, and yet Nvidia/AMD think we will just gloss over the fact because it's a graphics card...
I think the people failing to read the room are the people who think that Nvidia and AMD shouldn't make these high-power, expensive GPUs. They both make by far the most profit on these products, and they are always the hardest to secure stock of.

The marketplace is sending a clear and obvious signal to these companies' execs that buyers will lap up these expensive parts. I mean, have you seen how hard it is to get a 3090 Ti right now?
 
I personally love ultra-wide monitors approaching 40". 34" and 38" to be exact.

Anyone with one of these monitors is gaming in 1080p or 1440p. If they buy a 4K card capable of 60fps, they'll likely be underutilizing it but getting high FPS at up to 1440p.

I guess the next generation of cards will be offering 120fps at 4K on the high end.
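A rough way to quantify the "underutilizing" point above is raw pixel count per frame; the resolutions below are common panel figures for the monitor sizes mentioned, and pixel count is only a crude proxy for GPU load, not a benchmark:

```python
# Rough pixel-throughput comparison: how much less raster work common
# ultrawide resolutions demand versus full 4K. Illustrative only; real
# GPU load depends on far more than pixel count.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "34in ultrawide": (3440, 1440),
    "38in ultrawide": (3840, 1600),
    "4K": (3840, 2160),
}

def pixels(name):
    """Total pixels per frame for a named resolution."""
    w, h = RESOLUTIONS[name]
    return w * h

for name in RESOLUTIONS:
    ratio = pixels(name) / pixels("4K")
    print(f"{name:>14}: {pixels(name):>9,} px ({ratio:.0%} of 4K)")
```

On these numbers, a 34" ultrawide pushes only about 60% of the pixels of full 4K, which is why a "4K-capable" card has headroom to spare there.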

Bottom Line: I don't care about the energy use. I just want performance. Just get a 1500 W PSU.
Even if you are loaded, the underlying problem persists: Heat.
 
Even if you are loaded, the underlying problem persists: Heat.
Take India right now, for example. Even if you're 'loaded', unless you are also a full-blown doomsday prepper, it doesn't matter how much money you have when entire cities are experiencing rolling blackouts that affect literally everything: power to your house, being able to get gas for your car, groceries and medical supplies. Everything is breaking down due to unprecedented power consumption driven by the heat wave.

You can expect that to come to places that are just as bad, or even more susceptible to disruptions, like Texas, once they have to contend with 50°C temps (120+ °F) for days on end.

So if you 'don't care about power consumption', I hope you have something like $40,000 worth of solar panels and batteries, truckloads of gas (and, let's face it, probably not safely stored), and enough food and medicine rations to survive for weeks or months on end without the basics, because that's what's coming. And if you don't already have all those things, well, last year people were already scalping gas generators in Texas, so you can expect the infrastructure and supplies you might be inclined to prep with to cost you a hell of a lot more.

...Or you can just build a modest, sane PC that uses under 300 watts of power at peak, keep a laptop as a backup, and not have to build your life around being prepared, because not caring about how much power you're using stops being an option once you face the extremely real and extremely immediate possibility of simply not having enough power supplied to you.

And in the process, maybe you make rolling blackouts slightly less likely by not feeding your PC an extra 600 watts to get 20% more performance out of it versus mid-range parts that use sane amounts of power.
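The efficiency argument above can be put into numbers. The figures here are the commenter's illustrative ones (a ~300 W mid-range part as the baseline, a ~600 W part with ~20% more performance), not real benchmarks:

```python
# Performance-per-watt comparison for a hypothetical mid-range part vs a
# high-end part that draws twice the power for ~20% more performance.
# All numbers are the commenter's illustrative figures, not benchmarks.

def perf_per_watt(relative_perf, watts):
    """Relative performance divided by peak power draw."""
    return relative_perf / watts

mid_range = perf_per_watt(1.0, 300)  # baseline perf at ~300 W peak
high_end = perf_per_watt(1.2, 600)   # +20% perf at ~600 W peak

print(f"mid-range: {mid_range:.5f} perf/W")
print(f"high-end:  {high_end:.5f} perf/W")
print(f"mid-range is {mid_range / high_end:.2f}x more efficient")
```

Under these assumed figures, the mid-range part delivers roughly two-thirds more performance per watt, which is the core of the argument.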
 
So, tl;dr: I don't think nearly as many people *need* a 100-teraflop product, no matter how much they think they do; it's just the clever marketing the PCMR crowd, likely unbeknownst to them, have been doing to help Nvidia's arms race.

Fortunately, we live in a world where you may aspire to more than mere subsistence.

I think it's really great if a gallon of water and a bowl of rice a day is enough for your frugal lifestyle, but I want more.
 
"Next-gen graphics cards rumored to offer significant leap in performance"

Ok, so Adrian Potoroaca is TechSpot's official "Redundant King of Redundant", because this headline could have been used at any point in history when the release of a new generation of video cards was imminent. It's about as generic as it gets, and it makes me wonder how many seconds the author took to decide on it. The new generation always offers a significant leap in performance; otherwise there isn't a new generation.

I wonder how many times this headline has been recycled over the years. It's almost as cringe-worthy as when an author refers to the newest CPU or GPU as "the fastest ever made" (and everyone here HAS seen that redundant description more than once).

"OF COURSE IT'S THE FASTEST EVER MADE, IT'S THE NEWEST ONE!":laughing:
 
I think the people failing to read the room are the people who think that Nvidia and AMD shouldn't make these high-power, expensive GPUs. They both make by far the most profit on these products, and they are always the hardest to secure stock of.

The marketplace is sending a clear and obvious signal to these companies' execs that buyers will lap up these expensive parts. I mean, have you seen how hard it is to get a 3090 Ti right now?
My bet is that the end-user market is probably the smallest of AMD's and Nvidia's markets at this point. HPC is almost certainly more profitable. Intel's latest compiler offerings, their oneAPI compilers, let developers take advantage of GPGPU capabilities much more easily.
 
Again, if this is true, both companies are completely failing to read the room regarding the power efficiency of their cards, as if they are completely oblivious to the rampaging energy costs across many areas of the world. It seems obscene to think we could soon have graphics cards that draw the same power as kitchen appliances. We already try to buy the most efficient appliances we can, since a lot of them are required for day-to-day life, and yet Nvidia/AMD think we will just gloss over the fact because it's a graphics card...

They will only draw as much power as you demand from them. If you want to draw less, set a framerate limit in the driver, and run at lower resolutions (just as you would with a less capable and more power efficient card).
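The driver-level frame cap mentioned above works by inserting idle time into each frame, which directly cuts average power draw. A minimal sketch of the idea, with a hypothetical `render()` placeholder standing in for a frame's worth of work:

```python
# Minimal sketch of a frame-rate cap: sleep away any leftover frame
# budget so the hardware renders no faster than the target rate, and
# therefore idles (draws less power) between frames.
import time

def run_capped(render, fps_cap, frames):
    """Run `frames` iterations of render(), capped at fps_cap."""
    frame_budget = 1.0 / fps_cap
    for _ in range(frames):
        start = time.perf_counter()
        render()  # hypothetical placeholder for real frame work
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)  # hardware idles here

def render():
    pass  # trivial stand-in: an instant "frame"

start = time.perf_counter()
run_capped(render, fps_cap=60, frames=30)
total = time.perf_counter() - start
print(f"30 frames at a 60 fps cap took {total:.2f}s")  # roughly 0.5 s
```

Real drivers pace frames with far more sophistication, but the power-saving mechanism is the same: capped output means idle time instead of wasted work.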
 
If this is real, then we might just see one of the biggest generational performance increases of all time.
But personally, this seems too good to be true.
Chances are the real numbers will be significantly lower.
Real-world performance might also be significantly lower, especially once these new throttle monsters plateau into thermal equilibrium.
 
How about giving us 10-bit output without requiring a Quadro, and how about offering much better FP32 and FP64 performance?
 
Just remember, electricity prices have jumped in many places, especially in the EU. Some folks have seen the cost of electricity double this year compared with 2020. It might not sound like much, but if you were paying $100 a month for electricity in 2020, you're now paying upwards of $200 a month - that's an extra $1200 a year.

You're right that a few extra dollars a month for running a high-power-draw GPU might not be much, but when electricity costs are going up, it just means less money in your pocket for everything else that's also rising in price.
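The running-cost arithmetic above generalizes easily. The figures below (card wattages, hours per day, a post-increase rate per kWh) are hypothetical placeholders to plug your own numbers into:

```python
# Back-of-the-envelope yearly cost of a GPU's power draw.
# All inputs are hypothetical examples; substitute your own.

def yearly_cost(watts, hours_per_day, rate_per_kwh):
    """Estimated yearly electricity cost of a load at the given rate."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

# e.g. a 450 W card vs a 250 W card, 3 h/day of gaming, at $0.40/kWh
for watts in (250, 450):
    cost = yearly_cost(watts, hours_per_day=3, rate_per_kwh=0.40)
    print(f"{watts} W card: ${cost:.2f}/year")
```

At these assumed figures the 450 W card costs roughly $197 a year versus about $110 for the 250 W card, so the delta is real but modest next to a doubled household bill.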

Well, inflation is an entirely separate issue.

If you're not making 8-10% more than you were two years ago, then you got a pay cut and didn't even realize it.

Pretty sure the majority of us got the pay cut…
 
It'd be interesting to look at another statistic: how many gamers who own a 3080 or better would actually notice if we put them on medium and high settings, capped their framerate at "only" 120 Hz, and told them "this is a 3080" while they actually ran a 3060 Ti or 3070? In a practical sense, would they even notice or care about the actual performance tier they bought?

I bet it would be the majority, because so many responses overvalue high refresh rates and are dead set on staying on 1440p monitors; they don't even need to drive 4K or refresh rates past 90-120 Hz anyway (I maintain that 99% of people don't have fast enough reflexes in competitive gaming to benefit from more than 120-144 Hz refresh rates).

So, tl;dr: I don't think nearly as many people *need* a 100-teraflop product, no matter how much they think they do; it's just the clever marketing the PCMR crowd, likely unbeknownst to them, have been doing to help Nvidia's arms race.

I disagree. These cards are catered to a younger generation that is tech-savvy enough to know and feel the difference. Heck, I have to manually turn off the FPS counter that came with my Alienware monitor. And the high-end mobos/graphics cards are all sold out (here in Canada) because enthusiasts have become a bigger demographic than anticipated.
 
I disagree. These cards are catered to a younger generation that is tech-savvy enough to know and feel the difference. Heck, I have to manually turn off the FPS counter that came with my Alienware monitor. And the high-end mobos/graphics cards are all sold out (here in Canada) because enthusiasts have become a bigger demographic than anticipated.
Saying on the one hand that it's trivial to know the difference, and then immediately pointing out how ubiquitous FPS counters are - the very thing that actually tells you there might be a difference between, say, 120 Hz and 144 Hz - kind of runs counter to your claim, in my opinion.

Because yes, in the end this is just a matter of opinion and entirely subjective: you can't tell someone they're not enjoying themselves at "only" 90 Hz, for example. And most statistics show that most gamers, even PC gamers, are not playing at high refresh rates.
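The refresh-rate debate above comes down to frame times. A quick calculation shows how small the per-frame gain gets at high refresh rates, especially next to typical human reaction times of roughly 150-250 ms:

```python
# Frame time at common refresh rates, and the marginal saving from
# stepping up one tier. Reaction-time range is a rough ballpark, not
# a measured figure.

def frame_time_ms(hz):
    """Milliseconds per frame at a given refresh rate."""
    return 1000.0 / hz

for hz in (60, 90, 120, 144, 240):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):5.2f} ms per frame")

saving = frame_time_ms(120) - frame_time_ms(144)
print(f"120 -> 144 Hz saves only {saving:.2f} ms per frame")
```

Going from 60 to 120 Hz halves the frame time (8.3 ms saved), while 120 to 144 Hz saves under 1.4 ms, which is why the returns diminish so quickly.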
 
I personally love ultra-wide monitors approaching 40". 34" and 38" to be exact.

Anyone with one of these monitors is gaming in 1080p or 1440p. If they buy a 4K card capable of 60fps, they'll likely be underutilizing it but getting high FPS at up to 1440p.

I guess the next generation of cards will be offering 120fps at 4K on the high end.

Bottom Line: I don't care about the energy use. I just want performance. Just get a 1500 W PSU.

A 1500 W PSU, really?? Do you know most homes in the USA have bedrooms on a 15-amp circuit? At 120 volts, that is 1800 watts (15 A x 120 V) max load. But wait: for continuous use, it's recommended to keep the load at no more than 80% of the max, so 1440 watts. Basically, at that much wattage you're going to reach a point where you need to consider upgrading your home's electrical wiring or eliminating all other appliances running on the same circuit.
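The circuit math above can be written as a one-liner, using the 80% continuous-load derating the post describes:

```python
# Branch-circuit capacity check: max load is amps x volts, and the
# recommended continuous load is 80% of that (per the rule of thumb
# cited in the post above).

def continuous_limit_watts(amps, volts=120, derate=0.80):
    """Recommended continuous load for a branch circuit, in watts."""
    return amps * volts * derate

limit = continuous_limit_watts(15)
print(f"15 A / 120 V circuit: {15 * 120} W max, {limit:.0f} W continuous")
# A 1500 W PSU drawing near its rating would exceed that 1440 W limit
# by itself, before lights, monitors, or anything else on the circuit.
```

A 20 A circuit raises the continuous limit to 1920 W, which is part of why dedicated circuits get suggested for very high-draw rigs.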
 
I'd be interesting to look at another statistic: How many gamers that own a 3080 or better would actually notice if we put them on medium and high settings, cap their framerate at "only" 120hz and tell them "This is a 3080" while they actually run a 3060ti or 3070 and honestly on a practical sense, don't even notice or care about the actual performance tier they buy?

I bet it would be the majority because so many responses overvalue high refresh rating and are dead set on remaining on 1440p monitors and don't even need to drive 4k or refresh rates past 90-120hz anyway (I maintain that 99% of people don't have fast enough reflexes in competitive gaming to benefit for more than 120-144hz refresh rates)

So tl;dr I don't think nearly as many people *need* a 100 teraflops product no matter how much they think they do, it's just the clever marketing the PCMR people, likely unbeknownst to them, have been doing to help Nvidia's arms race.
Gaming isn't about "need" and it never has been. What you think another person wants or should spend their money on doesn't matter. Enthusiasts buy the best parts they can afford because they like tech, not because of value for money. If a person has a 3080 today and wants to upgrade to the next generation, then they aren't the type of person who cares about frames per dollar.
You make it sound like NVIDIA making more powerful video cards is a bad thing.
If someone gets a top-tier card and doesn't upgrade for a while, they will fare much better in 5 years than a person who bought a card that's merely good enough for today's games.
 
A 1500 W PSU, really?? Do you know most homes in the USA have bedrooms on a 15-amp circuit? At 120 volts, that is 1800 watts (15 A x 120 V) max load. But wait: for continuous use, it's recommended to keep the load at no more than 80% of the max, so 1440 watts. Basically, at that much wattage you're going to reach a point where you need to consider upgrading your home's electrical wiring or eliminating all other appliances running on the same circuit.
Without knowing the actual power usage, there's no reason to panic. Also, not everyone puts their computer in their bedroom, and not everyone has a 15-amp limit.
 
Well, the story tells us that if you are just gaming, all of these cards are useless. I have an 8700K with an RTX 2080 and I am thinking of upgrading in maybe 3 years or more; a 3080 would be sufficient for even 8 years. I hear all about the new PCIe 5 protocol, new PSUs over 1000 watts, DDR5 motherboards to match new CPUs, and so on. Give me one good reason why I should spend 2000 euros or more upgrading my gaming rig right now! There is none!! I can wait at least 3 more years, or even longer, to get all the new stuff at half the price or better. All the new tech won't make a significant change to my gaming experience, so I will wait until it matures enough to be stable and costs less. There are no games out there that utilise that much horsepower. Ok, I found one reason: VR. Naaahhh... too soon, I think. Let's wait for VR to get better; it's still too soon for that!!
So, anybody care to give me a good reason?? I will wait 3 years for an answer!!
 
Next-gen graphics and power-hungry? I can't think of anything I want less. I simply want this gen's graphics at an acceptable price. Quieter and more efficient would just be icing on the cake.
 
Again, if this is true, both companies are completely failing to read the room regarding the power efficiency of their cards, as if they are completely oblivious to the rampaging energy costs across many areas of the world. It seems obscene to think we could soon have graphics cards that draw the same power as kitchen appliances. We already try to buy the most efficient appliances we can, since a lot of them are required for day-to-day life, and yet Nvidia/AMD think we will just gloss over the fact because it's a graphics card...

I suggest you become an engineer, then. Both camps are aiming at 4K 120 FPS or so, and you can't get there without GPUs that consume more power.

 
Well, the story tells us that if you are just gaming, all of these cards are useless. I have an 8700K with an RTX 2080 and I am thinking of upgrading in maybe 3 years or more; a 3080 would be sufficient for even 8 years. I hear all about the new PCIe 5 protocol, new PSUs over 1000 watts, DDR5 motherboards to match new CPUs, and so on. Give me one good reason why I should spend 2000 euros or more upgrading my gaming rig right now! There is none!! I can wait at least 3 more years, or even longer, to get all the new stuff at half the price or better. All the new tech won't make a significant change to my gaming experience, so I will wait until it matures enough to be stable and costs less. There are no games out there that utilise that much horsepower. Ok, I found one reason: VR. Naaahhh... too soon, I think. Let's wait for VR to get better; it's still too soon for that!!
So, anybody care to give me a good reason?? I will wait 3 years for an answer!!
On a mid-range rig, the RTX 2080 is nice. But for those with higher-end setups, there is ample room for improvement.

YOU may feel that these cards are useless; that in itself does not make them useless. A 2080Ti wouldn't cut it for my monitor to get the maximum performance, unfortunately. :)

I have a 3440x1440 monitor running at a 175 Hz refresh rate. Even at high settings (not ultra), the RTX 3080 Ti won't get me there in many AAA games.

Bring on the RTX 4080 - I hope it destroys the RTX 3090.

To heck with the price or power consumption- after all, nobody buys a Ferrari for the gas mileage. If I have to upgrade my PSU, so be it. I bought a large case to future-proof, and it will now pay off. A custom cooling loop will be in the works once I secure an RTX 4080!
 
It'd be interesting to look at another statistic: how many gamers who own a 3080 or better would actually notice if we put them on medium and high settings, capped their framerate at "only" 120 Hz, and told them "this is a 3080" while they actually ran a 3060 Ti or 3070? In a practical sense, would they even notice or care about the actual performance tier they bought?

I bet it would be the majority, because so many responses overvalue high refresh rates and are dead set on staying on 1440p monitors; they don't even need to drive 4K or refresh rates past 90-120 Hz anyway (I maintain that 99% of people don't have fast enough reflexes in competitive gaming to benefit from more than 120-144 Hz refresh rates).

So, tl;dr: I don't think nearly as many people *need* a 100-teraflop product, no matter how much they think they do; it's just the clever marketing the PCMR crowd, likely unbeknownst to them, have been doing to help Nvidia's arms race.

I'd absolutely take you up on that offer. I've always believed that visuals are more important than performance above 60 fps in most games. More pretty, please, but whatever my 3080 Ti FTW3 Ultra can offer, I'll take. I'll lower settings and compare the differences between them, then play with resolution for the best possible visuals at the same...

3200x1800 on a 4K display looks fantastic, but there is a difference. I remember going from the 2080 Ti to the 30 series: I played Metro at max settings above 60 fps in HDR with VRR, and it was fantastic. It was quite the change from the 2080 Ti FTW3 Ultra. I noticed the slight differences the most, both because of the change and because I had spent so much time optimizing everything before.

I do agree most people won't notice differences in most things. If a setting were randomly flipped, would you spot it? What's an acceptable frame rate to want in games at high settings? 60, but preferably 120? 144? Etc. Resolution: how far can it be lowered below 4K before it looks noticeable?

But I like pretty and fast. I can spot the difference immediately during direct gameplay, just from the experience of growing up with all the GPUs that got us here... Seeing new levels of visuals was amazing, but it was also expensive... Absolutely not for everyone, and I don't blame them.
 