
Nvidia launches the GeForce RTX 20 Series: Ray-tracing to the mainstream of gaming

By LemmingOverlrd · 70 replies
Aug 20, 2018
  1. fadingfool

    fadingfool TS Booster Posts: 97   +93

    My last few GPU upgrades were HD7750 to R9 270 to GTX 970 to 1080Ti - all pretty impressive leaps in performance (and in wallet emptying abilities).
    I don't see anything here that would make me lust after a 2080 Ti at the moment. Ray tracing is pretty and all, but not £1,000 worth of pretty.
    Eugenia likes this.
  2. Capaill

    Capaill TS Evangelist Posts: 857   +460

    I would guess that the only people buying at these prices are the reviewing sites like TS. The rest of us will wait for the gaming comparison reviews to see if the purchase is worth it over what we have. And, seeing as all I'm hearing about the new cards is Ray Tracing, I'm guessing it won't be worth an upgrade on the 10 series.
    Eugenia likes this.
  3. Squid Surprise

    Squid Surprise TS Evangelist Posts: 2,507   +1,504

    It's rarely worth upgrading 1 generation.... unless you simply have money to burn... I have my Maxwell Titans and I'm thinking the 2000 series will be my upgrade....

    As an aside, those of you complaining about how there's "only" 8GB of memory... remember, this is GDDR6 memory, so it should perform better...
  4. Anton Skryaga

    Anton Skryaga TS Enthusiast Posts: 38   +17

    Let's suppose I'm a game developer: a programmer on a 3D engine, or an artist making levels for your beloved FPS.
    I HATE the number of tricks I need to make my game look 'real'. All of the 'baking static light', 'fake lighting sources', 'switching cube maps on-the-fly', 'almost real shadows'. And then I see the RTX 20x0, where it 'all just works'. Guess what? My next title will have two options - 'Real lighting with RTX' and... And why would you need another option? :)
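    The "tricks" listed above (baked lightmaps, fake lights, swapped cube maps) all approximate what a ray tracer computes directly: fire a ray, find what it hits, and shade from the actual light position. As an illustration only - a minimal Python sketch, not Nvidia's RTX API or any shipping engine, with all function names hypothetical - a single ray-sphere hit with Lambert diffuse shading looks like this:

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Nearest positive hit distance of a ray with a sphere, or None.
    Solves |o + t*d - c|^2 = r^2 for t (direction assumed normalized)."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None          # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def lambert(hit, center, light_pos):
    """Diffuse term: cosine between the surface normal and the light direction."""
    def normalize(v):
        n = math.sqrt(sum(x * x for x in v))
        return [x / n for x in v]
    normal = normalize([h - c for h, c in zip(hit, center)])
    to_light = normalize([lp - h for lp, h in zip(light_pos, hit)])
    return max(0.0, sum(a * b for a, b in zip(normal, to_light)))

# One ray from the origin, straight down -z, at a unit sphere centered at (0, 0, -5)
t = intersect_sphere((0.0, 0.0, 0.0), (0.0, 0.0, -1.0), (0.0, 0.0, -5.0), 1.0)
hit = (0.0, 0.0, -t)                                     # front of the sphere, z = -4
shade = lambert(hit, (0.0, 0.0, -5.0), (0.0, 0.0, 0.0))  # light co-located with camera
# t == 4.0, shade == 1.0 (the surface faces the light head-on)
```

    Doing this per pixel, per light, per bounce, millions of times a frame is exactly the cost that the baking tricks exist to avoid - and the workload the RT cores are meant to accelerate in hardware.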
  5. Squid Surprise

    Squid Surprise TS Evangelist Posts: 2,507   +1,504

    Since when does RTX mean that it all works?
  6. hahahanoobs

    hahahanoobs TS Evangelist Posts: 2,448   +864

    Intel integrated graphics in CPUs just sit there doing nothing and no one complains about that. Why is it different when NVIDIA does it?

    You know what I'm going to do? Wait for the reviews.....
  7. Vulcanproject

    Vulcanproject TS Evangelist Posts: 712   +1,004

    Because for a massive number of people, Intel's integrated GPUs don't sit there doing nothing. For every PC or laptop sold with a discrete GPU, another ten are sold that use integrated graphics.

    For the buyers where ultimate CPU performance really matters - the ones spending $500+ on a chip where die area shouldn't be wasted - the iGPU isn't even there. In any case it wasn't a complaint; I was just pointing out that the usefulness of this extra hardware is yet to be determined. So I have questions and considerable doubts about it.

    Thanks for letting me explain this though. I'll be waiting for the reviews as well to see what these cards can do.
    Last edited: Aug 21, 2018
  8. CaptainTom

    CaptainTom TS Maniac Posts: 404   +212

    I am seeing an immense backlash right now. I really haven't seen this many people respond so negatively to Nvidia's latest Milk-fest.
  9. Squid Surprise

    Squid Surprise TS Evangelist Posts: 2,507   +1,504

    You haven't read enough of this website then.... Any Nvidia, Apple or Microsoft thread generally has tons of hate....
  10. uREJT

    uREJT TS Rookie

    The chip puzzle is revealing itself.

    First AMD announces 7nm graphics cards for 2019, then Intel teases new graphics cards of a kind it has never made before, and now Nvidia launches its overpriced garbage Turing graphics cards.

    Here is my interpretation of those events. Intel sees an opportunity to make some money on graphics cards in 2020 thanks to its superior 10nm production process. I believe it's quite possible they will make some money, but only from OEMs. Gaming will still be AMD's and Nvidia's domain. Nvidia had to launch Turing now because AMD is going to destroy it with 7nm graphics cards in 2019. Ray tracing is just a gimmick, like PhysX and HairWorks. AMD will find a way to do the same thing at lower cost.
    I think the Radeons are going to be insane in 2019. Nvidia will struggle to compete. There are no signs Nvidia has any 7nm projects in development. Instead they are claiming all the big, tasty revenue from Pascal.
  11. Squid Surprise

    Squid Surprise TS Evangelist Posts: 2,507   +1,504

    While I’d love to believe this, I find it VERY difficult to believe that AMD will be producing anything that “destroys” Nvidia... they’ve been SO far behind for SO many years now that it doesn’t seem likely that they’ll magically overtake them next year just because they are fabricating on a smaller node... but I hope you’re right - they DID pull off a minor miracle with Zen...
  12. hahahanoobs

    hahahanoobs TS Evangelist Posts: 2,448   +864

    lol My point still stands.
    The argument was about hardware not being used. Not everyone is going to use it. Period!

    Gamers using dGPUs don't need the IGP, but they buy K chips anyway. How many people are using PhysX hardware? Where are those whiners? Or the Onboard audio whiners? Or the QuickSync whiners?
  13. Vulcanproject

    Vulcanproject TS Evangelist Posts: 712   +1,004

    I believe I clearly explained the difference between maybe 90 percent of people using the iGPU on a chip costing less than $360, and buying a chip costing at least $699 with potentially a lot of silicon on it doing nothing much of the time.

    You're paying a big premium for RTX features, but not for the iGPU.

    When it's there and a vast majority of people use it all the time, nobody will complain. But when it's there, few people use it often and you're paying a big premium for it, people may end up thinking twice.

    Not sure what else you needed to be made clearer. The implications after this are self explanatory but I can always indulge, right?

    I hope that the parts are fast and there are other uses for these RT areas on the chip. Otherwise you may indeed see a negative reaction to this technology if software support is not excellent.

    AMD could potentially build a smaller chip as fast or faster in conventional games lacking RTX features, with a die size much smaller and therefore also with much lower prices. Then we would see where the consumer leans. Example: $400 AMD chip as fast or faster in every other game minus the effects, or $700 Nvidia chip to have the extra features in say a mere couple dozen titles.

    It's a risk for Nvidia, albeit calculated.
    Last edited: Aug 26, 2018
  14. hahahanoobs

    hahahanoobs TS Evangelist Posts: 2,448   +864

    Percentage has nothing to do with this. In fact the initial comment I replied to was about not using the dedicated Ray Tracing hardware and it going to waste if you don't use it/it sucks.
    You're off topic and need to get back on it! This is a specific scenario. Ray Tracing. 21 games support it. Not all of them are available. NONE of the cards are available! Is the hardware a waste of die space if you DON'T use it? I say 100% NO, since a LOT of hardware and features go unused by users.

    Thanks for playing!
  15. Squid Surprise

    Squid Surprise TS Evangelist Posts: 2,507   +1,504

    lol.... of course only a few games support it - it doesn't actually exist yet! But more games WILL support it.... I think that's the point...
  16. hahahanoobs

    hahahanoobs TS Evangelist Posts: 2,448   +864

    Your comment is EXTREMELY confusing. Read it again.
    You're focused on games and support when the topic is unused hardware. SPECIFICALLY, is it a waste of die space if you don't use it/want to use it.
    Keep trying.
  17. Squid Surprise

    Squid Surprise TS Evangelist Posts: 2,507   +1,504

    I believe the pot is calling the kettle black.... it’s not unnecessary hardware until we know if it will be used... as the cards haven’t even launched yet, we have no way to know if the hardware is “wasted” until then. In a year or so, if only 15-20 games support ray tracing, then maybe you’re right.... but WE DON’T KNOW YET!
  18. hahahanoobs

    hahahanoobs TS Evangelist Posts: 2,448   +864

    I stopped reading at "it's not unnecessary hardware until."
    Work on your reading comprehension.
  19. Vulcanproject

    Vulcanproject TS Evangelist Posts: 712   +1,004

    My original posts were precisely on topic. It was your question, which for some reason moved onto CPU iGPUs, that went off topic......

    Therefore I explained, very slowly and clearly, the difference between the two scenarios you created, without any rebuttal from you. One is much cheaper and gets used all the time by most consumers who pay for it. The other is expensive, and it remains to be seen how useful it is and how much consumers will use it. As far as we know, it'll be in fewer than a couple dozen games in the next 12 months.

    Then I moved it back onto what the RTX hardware could mean for the consumer and market, entirely relevant to the topic. Thanks for giving me the opportunity to indulge in those explanations and help clear it up for those people that actually understood.

    I feel there is nothing further to be added that isn't more speculation, since I covered your question and points.
    Squid Surprise likes this.
  20. Squid Surprise

    Squid Surprise TS Evangelist Posts: 2,507   +1,504

    Yeah, I think it’s now quite obvious where the problem is...
  21. mailpup

    mailpup TS Special Forces Posts: 7,375   +611

    Enough personal conversations and comments, thank you.
