Nvidia RTX 5000 graphics cards rumored to launch next year with massive performance increase

How about keeping it real and honest and not hyping Nvidia over some claim by some speculator chasing clickbait revenue more than a year ahead of release...

You're better than that...

We've seen you weeping over disappointing performance gains and poor value too, so please...

At least don't clickbait on this basis.
 
So why not buy a PRO card...?

Or do you just like the way Nvidia markets gaming cards to creators...? (Because it's a cheaper alternative than actually buying a Pro card.)


And do you think Nvidia markets its gaming cards (as creator cards) so it can claim more people buy its cards for gaming... when that isn't true...? Because (today) almost all the people I know who buy Nvidia cards do so for non-gaming CUDA requirements.
Suggesting people buy "pro" cards is annoying as hell. And NVIDIA is not marketing these cards just to gamers.
 
I'm so sick of hearing about DLSS.

You sure mention it more than anyone else in here for hating it so much. Don't buy Nvidia, and if you do by accident, don't turn it on I guess. It's optional.

I don't really consider it a selling point, at least not one worth paying more for, just a nice to have - and that it is, nice to have.

I think I've found the most strongly anti-Nvidia community on the internet right here, more so than r/AMD; even on AyyMD most takes are more ironic than unironic. That's a real achievement.
 
It's used to falsely advertise the capabilities of cards; at best it's marketing fluff, at worst intentionally misleading. It's no different from the consoles claiming they can do 4K@60fps when they aren't producing true 4K. Cap that off with Frame Gen trying to hyper-inflate the cards' framerates, and the exhaustion with the tech and its every mention is understandable.
 
How is it a false capability? They more often than not show side-by-sides with the feature on and off, so the baseline performance is stated upfront. "Marketing fluff" is the only charge I can agree with, should you think it's a useless gimmick. With publications like Hardware Unboxed, among many other reputable ones, singing its praises, stating it can offer better-than-native image quality and that you'd be silly not to use it, Nvidia would be a fool not to lean into it, hard. It's also obvious why users like it. Now, if you're foolish enough to buy an RTX card weaker than you need, counting on DLSS to always make up the shortfall, that's on you when the facts are so easy to find.

From where I'm sitting, the exhaustion comes from one small subset of the whole market: people who have sworn off Nvidia for whatever reason of their choosing, and who more often than not don't own a card capable of using it. Given that the vast majority of people who bought a GPU in the last five years bought an RTX card, the sentiment is almost entirely not coming from them, bar some who became disgruntled over time.

Nvidia is the company the vocal minority love to hate, and DLSS is just the Nvidia tech du jour acting as a lightning rod for much of it.
 
RTX 50 will only be a small die shrink. RTX 40 is already on an aggressive 4nm node; 3nm or even 2nm only improves performance and logic scaling a little, like 10-20%. It's 20-year record-low sales. How to make $$$ = smaller chips.
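
To sketch the math on that 10-20% figure: a logic-density gain that small only shrinks a die so much. A quick Python back-of-envelope, where the die size and gain numbers are illustrative assumptions, not confirmed specs:

```python
# Back-of-envelope: how much a modest logic-density gain shrinks a die.
# The 609 mm^2 figure is an assumed AD102-class die size, for illustration.
def shrunk_area(area_mm2, density_gain):
    """Same logic re-laid-out on a node that packs (1 + gain)x more in."""
    return area_mm2 / (1.0 + density_gain)

big_die = 609.0  # mm^2 (assumption)
for gain in (0.10, 0.20):
    print(f"{gain:.0%} density gain: {big_die:.0f} mm^2 -> {shrunk_area(big_die, gain):.0f} mm^2")
```

Even the optimistic 20% case only takes a ~609 mm^2 die down to ~508 mm^2, so smaller chips sold at the same tier is the easier way to pad margins.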
 
How is it a false capability? They more often than not show side-by-sides with the feature on and off, so the baseline performance is stated upfront.
Because they tout broad claims of multiplied performance over the preceding generation or the current competition without clearly noting that this is with these features on, or how they're being leveraged to reach that conclusion. I don't want upscaled frames, which are notably not as good-looking as true-resolution frames, nor do I want fake frames that add input delay.
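
On the input-delay point, a rough model (my own simplification, not Nvidia's actual pipeline): interpolation-based frame generation has to hold back a rendered frame so it can blend between two real ones, which adds roughly one render-frame interval of latency:

```python
# Back-of-envelope latency model for interpolation-based frame generation.
# Assumption: the interpolator buffers one rendered frame so it can blend
# between two real ones, adding roughly one render-frame interval of delay.
def frame_gen_effect(rendered_fps):
    frame_time_ms = 1000.0 / rendered_fps
    displayed_fps = rendered_fps * 2  # one generated frame per real frame (assumed)
    added_latency_ms = frame_time_ms  # the held-back frame (rough estimate)
    return displayed_fps, added_latency_ms

for fps in (30, 60, 120):
    shown, extra = frame_gen_effect(fps)
    print(f"{fps} fps rendered -> {shown:.0f} fps shown, ~{extra:.1f} ms extra latency")
```

So the lower the real framerate, the worse the penalty: doubling 30 fps to 60 fps costs ~33 ms, which is exactly where people are tempted to use it.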

I'm just tired of hearing about it, the same way I was tired of hearing about consoles claiming 4K@60fps when they were upscaling 1080p and barely maintaining 60fps. If it's not rendered at the true resolution, then you're conflating the performance with something it's not when you quote the fps at the upscaled resolution and add an asterisk with tiny text admitting you actually rendered at a quarter of the claimed resolution.
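
The quarter-resolution point is plain pixel arithmetic: DLSS Performance mode renders at half the output width and height, i.e. a quarter of the pixels. The per-axis scale factors below are the commonly cited mode ratios, so treat them as assumptions rather than an official spec:

```python
# Pixel-count arithmetic behind the "quarter of the claimed resolution" point.
# Per-axis render scales commonly cited for the DLSS quality modes (assumed).
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58,
              "Performance": 0.50, "Ultra Performance": 1 / 3}

out_w, out_h = 3840, 2160  # 4K output
for mode, s in DLSS_SCALE.items():
    w, h = round(out_w * s), round(out_h * s)
    share = (w * h) / (out_w * out_h)
    print(f"{mode:>17}: renders {w}x{h} ({share:.0%} of 4K's pixels)")
```

Performance mode at a "4K" output comes out to 1920x1080, exactly 25% of the pixels in the claimed resolution.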

The only reason I've sworn off Nvidia this generation is the same reason I've sworn off AMD: their prices are absurd for the amount of actual performance they bring to the table, and that's only soured further by them claiming greater performance by leaning on these technologies as if they produce the same results as raw rendering at the chosen resolution.
 
And we've only just entered Q3...

Entry-level GPUs with mid-level naming and high-level prices = Nvidia 2023!

Most gamers and Nvidia fanboys are living in la-la land. Nvidia didn't make a pricing typo... high prices are only going to get higher.
OK... maybe a 10% chance the 5080 will be only $1000.
 
Because they tout broad claims of multiplied performance over the preceding generation or the current competition without clearly noting that this is with these features on, or how they're being leveraged to reach that conclusion.
Seems clear to me when they show it, but I can see where you're coming from.
I'm just tired of hearing about it, the same way I was tired of hearing about consoles claiming 4K@60fps when they were upscaling 1080p and barely maintaining 60fps. If it's not rendered at the true resolution, then you're conflating the performance with something it's not when you quote the fps at the upscaled resolution and add an asterisk with tiny text admitting you actually rendered at a quarter of the claimed resolution.
I get this too, though there's far less outrage against it. And honestly, I don't really mind what tricks and upscaling are done at the end of the day; they're all magically rendered pixels making their way to my screen, and I'm not going to draw a line in the sand over what's acceptable based on something arbitrary. For DLSS (Super Resolution), when the output is often as good as or better than native, with the best AA in the business right now, it's hard to be sour on the method when the final image is so excellent, with a performance boost on top.

If your game looks like crap (or, for example, looks like 1080p) and you're claiming 4K just because that's the output resolution, yeah, that's abysmal. But if the output really is 4K-like, I'm not mad, I'm impressed.
 