Nvidia launches the GeForce RTX 20 Series: ray tracing comes to mainstream gaming

My last few GPU upgrades were HD 7750 to R9 270 to GTX 970 to 1080 Ti - all pretty impressive leaps in performance (and in wallet-emptying ability).
I don't see anything here that would make me lust after a 2080 Ti at the moment. Ray tracing is pretty and all, but not £1,000 worth of pretty.
 
I would guess that the only people buying at these prices are review sites like TS. The rest of us will wait for the gaming comparison reviews to see if the purchase is worth it over what we have. And seeing as all I'm hearing about the new cards is ray tracing, I'm guessing it won't be worth an upgrade over the 10 series.
 
It's rarely worth upgrading 1 generation.... unless you simply have money to burn... I have my Maxwell Titans and I'm thinking the 2000 series will be my upgrade....

As an aside, those of you complaining about how there's "only" 8GB of memory... remember, this is GDDR6 memory, so it should perform better...
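
(Back-of-the-envelope, and treat the exact figures as assumptions until the reviews land - Nvidia has quoted 14 Gbps GDDR6 on the 2080's 256-bit bus, versus the GTX 1080's 10 Gbps GDDR5X:

14 Gbps x 256 bits / 8 = 448 GB/s (RTX 2080, GDDR6)
10 Gbps x 256 bits / 8 = 320 GB/s (GTX 1080, GDDR5X)

Same 8GB capacity, but roughly 40% more bandwidth on paper.)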
 
Let's suppose I'm a game developer - a programmer on a 3D engine, or an artist making levels for your beloved FPS.
I HATE the number of tricks I need to make my game look 'real'. All of the 'baking static lighting', 'fake light sources', 'switching cube maps on the fly', 'almost-real shadows'. And then I see the RTX 20x0, where it 'all just works'. Guess what? My next title will have two options - 'Real lighting with RTX' and... and why do you need another option? :)
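
(In practice that toggle is just a capability check. A minimal sketch, assuming a D3D12/DXR renderer - the function name and the fallback paths here are mine, not anything Nvidia has published:

#include <windows.h>
#include <d3d12.h>

// Ask the driver whether DXR (hardware ray tracing) is exposed on this device.
bool SupportsHardwareRayTracing(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    // TIER_1_0 or higher means ray-traced shadows/reflections/GI are on the table.
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

// Elsewhere in the renderer (hypothetical engine hooks):
//   if (SupportsHardwareRayTracing(device)) enable the 'Real lighting with RTX' path;
//   else                                    keep the baked lightmaps, cube maps and shadow maps.

That else branch is exactly the 'other option' - everyone without an RTX card still needs the old bag of tricks.)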
 
Since when does RTX mean that it all works?
 
Is there just a big dead chunk of silicon doing nothing if you do not enable RT features? It seems likely.

Intel's integrated graphics in CPUs just sit there doing nothing and no one complains about that. Why is it different if NVIDIA does it?

You know what I'm going to do? Wait for the reviews.....
 

Because for a massive number of people Intel's integrated GPUs don't sit there doing nothing. For every PC or laptop sold with a discrete GPU, another ten are sold that use the integrated graphics.

And on the chips where ultimate CPU performance really matters - the $500+ parts where die area shouldn't be wasted - the iGPU isn't even there. In any case it wasn't a complaint; it was just being pointed out that the usefulness of this extra hardware is yet to be determined. So I have questions, and considerable doubts, about it.

Thanks for letting me explain this though. I'll be waiting for the reviews as well to see what these cards can do.
 
More marketing BS from nVidia to sell yet another overpriced card.

Something that would perhaps help game developers just as much, without a hardware-accelerated calculation engine, is learning Geometric Algebra. Somehow, I doubt that is going to happen.

Although it seems highly unlikely at this point, one can hope for AMD to get their act together. I may just buy used again; even if I wanted to pay those prices, it sounds like the value is not there.

Value, IMO, is all important.

And they may have misjudged the backlash they will get if these cards prove to benchmark not much faster than the last generation.

The way I see it, there are people out there who are growing very weary of being bilked out of insane amounts of money for piddling performance improvements, first by Intel and now by nVidia. To me, it is easy to liken this to cord cutters. The value of each successive generation of silicon is diminishing in comparison to the prices.

There are people out there who, I am sure, will have to have these new cards and will pay any price for them. However, those who will actually use them will, as I see it, most likely be in the professional arena and will not buy these consumer cards.

I am seeing an immense backlash right now. I really haven't seen this many people respond so negatively to Nvidia's latest Milk-fest.
 
You haven't read enough of this website then.... Any Nvidia, Apple or Microsoft thread generally has tons of hate....
 
The chip puzzle is revealing itself.

First AMD announces 7nm graphics cards for 2019, then Intel teases a new graphics card like they've never done before, and now Nvidia launches these overpriced garbage Turing graphics cards.

Now, this is my interpretation of those events. Intel sees an opportunity to make some money in graphics cards in 2020 thanks to their superior 10nm production process. I believe it's quite possible they will make some money, but only from OEMs. Gaming will still be AMD's and Nvidia's domain. Nvidia had to launch Turing now because AMD is going to destroy it with their 7nm graphics cards in 2019. Ray tracing is just a gimmick, like PhysX and HairWorks. AMD will find a way to do the same thing at a lower cost.
I think the Radeons are going to be insane in 2019. Nvidia will struggle to compete. There are no signs Nvidia has developed any 7nm projects. Instead they're claiming all the big tasty revenue from Pascal.
 
While I’d love to believe this, I find it VERY difficult to believe that AMD will be producing anything that “destroys” Nvidia... they’ve been SO far behind for SO many years now that it doesn’t seem likely that they’ll magically overtake them next year just because they are fabricating on a smaller node... but I hope you’re right - they DID pull off a minor miracle with Zen...
 
Because for a massive number of people Intel's integrated GPUs don't sit there doing nothing. For every PC or laptop sold with a discrete GPU, another ten are sold that use the integrated graphics.

And on the chips where ultimate CPU performance really matters - the $500+ parts where die area shouldn't be wasted - the iGPU isn't even there. In any case it wasn't a complaint; it was just being pointed out that the usefulness of this extra hardware is yet to be determined. So I have questions, and considerable doubts, about it.

Thanks for letting me explain this though. I'll be waiting for the reviews as well to see what these cards can do.

lol My point still stands.
The argument was about hardware not being used. Not everyone is going to use it. Period!

Gamers using dGPUs don't need the IGP, but they buy K chips anyway. How many people are using PhysX hardware? Where are those whiners? Or the Onboard audio whiners? Or the QuickSync whiners?
#stopwhining
 

I believe I clearly explained the difference between maybe 90 percent of people using the iGPU on a chip costing less than $360, and buying a chip costing at least $699 that potentially has a lot of silicon on it doing nothing much of the time.

You're paying a big premium for RTX features, but not for the iGPU.

When it's there and a vast majority of people use it all the time, nobody will complain. But when it's there, few people use it often, and you're paying a big premium for it, then people may end up thinking twice.

Not sure what else you needed to be made clearer. The implications after this are self-explanatory, but I can always indulge, right?

I hope that the parts are fast and there are other uses for these RT areas on the chip. Otherwise you may indeed see a negative reaction to this technology if software support is not excellent.

AMD could potentially build a chip that is as fast or faster in conventional games lacking RTX features, with a much smaller die and therefore also much lower prices. Then we would see where the consumer leans. Example: a $400 AMD chip as fast or faster in every other game minus the effects, or a $700 Nvidia chip to have the extra features in, say, a mere couple dozen titles.

It's a risk for Nvidia, albeit calculated.
 

Percentage has nothing to do with this. In fact, the initial comment I replied to was about the dedicated ray tracing hardware going to waste if you don't use it / it sucks.
You're off topic and need to get back on it! This is a specific scenario: ray tracing. 21 games support it. Not all of them are available. NONE of the cards are available! Is the hardware a waste of die space if you DON'T use it? I say 100% NO, since a LOT of hardware and features go unused by users.

Thanks for playing!
 
lol.... of course only a few games support it - it doesn't actually exist yet! But more games WILL support it.... I think that's the point...
 

Your comment is EXTREMELY confusing. Read it again.
You're focused on games and support when the topic is unused hardware. SPECIFICALLY: is it a waste of die space if you don't use it / don't want to use it?
Keep trying.
 
I believe the pot is calling the kettle black.... it’s not unnecessary hardware until we know if it will be used... as the cards haven’t even launched yet, we have no way to know if the hardware is “wasted” until then. In a year or so, if only 15-20 games support ray tracing, then maybe you’re right.... but WE DON’T KNOW YET!
 

I stopped reading at "it's not unnecessary hardware until."
Work on your reading comprehension.
 
Percentage has nothing to do with this. In fact, the initial comment I replied to was about the dedicated ray tracing hardware going to waste if you don't use it / it sucks.
You're off topic and need to get back on it! This is a specific scenario: ray tracing. 21 games support it. Not all of them are available. NONE of the cards are available! Is the hardware a waste of die space if you DON'T use it? I say 100% NO, since a LOT of hardware and features go unused by users.

Thanks for playing!

My original posts were precisely on topic. It was only your question, which for some reason moved onto CPU iGPUs, that went off topic......

Therefore I explained, very slowly and clearly, what the difference was in the scenario you created, without any rebuttal from you. One is much cheaper and gets used all the time by most consumers who pay for it. The other is expensive, and it remains to be seen how useful it is and how much it might be used by consumers. As far as we know, it'll only be in fewer than a few dozen games in the next 12 months.

Then I moved it back onto what the RTX hardware could mean for the consumer and the market, which is entirely relevant to the topic. Thanks for giving me the opportunity to indulge in those explanations and help clear things up for the people who actually understood.

I feel there is nothing further to be added that isn't more speculation, since I covered your question and points.
 