GeForce RTX 2080 incoming: Watch Jensen Huang's keynote at 9am PT / 12pm ET / 4pm GMT

By midian182 · 21 replies
Aug 20, 2018
  1. Update #2: We're live with the final GeForce RTX 20 Series announcement:
    Nvidia launches the GeForce RTX 20 Series: Ray-tracing to the mainstream of gaming

    Update (8/20): Tune in to watch Nvidia's founder and CEO, Jensen Huang, kick off a special event in Cologne, Germany. Gamescom 2018 runs August 21-25 there, but before the show begins, Nvidia will be hosting an event where we expect the new GeForce RTX 2080 GPU to be unveiled. The keynote will be livestreamed on Twitch and Facebook. We also have it here (watch above), so you can follow the announcement while catching up on all your tech news.

    Nvidia revealed its new Turing architecture at SIGGRAPH last week, but gamers wanted to know what this meant for the company’s GeForce line. Nvidia obliged by releasing a trailer containing enough hidden clues to keep Sherlock Holmes busy: we see mention of user names “Mac-20” and “Eight Tee” (2080, see), RoyTex (RTX), Not_11, and Zenith20 (20 series). There are also coordinates for a location in Cologne (Gamescom’s home), another 2080 hint in the way that the date appears at the end, and a user called AlanaT (Alan Turing).

    So, while there’s a small chance that Nvidia will keep using the GTX name alongside the RTX cards, it appears that the latter is replacing the long-used former.

    We still don’t have any official performance information on the RTX 2080, but what we do know from the SIGGRAPH reveal is that ray tracing will be a star feature—the RTX name stands for real-time ray tracing.

    A simple explanation of ray tracing is that it's a rendering technique that traces the paths of light rays through a scene to produce realistic shading, reflections, and depth of field. Doing this in real time while maintaining an acceptable frame rate requires a massive amount of compute power, which is where the Turing architecture's RT cores come in. As you can see in the video above, the results can be spectacular.
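
    To make that concrete, below is a minimal sketch of the core idea in Python: cast one ray per pixel, intersect it with a single sphere, and shade the hit by how directly the surface faces a light. The scene and shading model here are invented purely for illustration; a real-time ray tracer does this millions of times per frame, with secondary rays for reflections and shadows, which is the workload the RT cores are designed to accelerate.

    import math

    WIDTH, HEIGHT = 40, 20                    # tiny ASCII "framebuffer"
    CENTER, RADIUS = (0.0, 0.0, 3.0), 1.0     # one sphere, 3 units in front of the camera
    LIGHT = (0.577, 0.577, -0.577)            # unit vector pointing toward the light

    def hit(d):
        """Distance along unit ray direction d (from the origin) to the sphere, or None."""
        # Solve |t*d - CENTER|^2 = RADIUS^2, a quadratic in t (|d| = 1, so a = 1).
        b = -2.0 * sum(di * ci for di, ci in zip(d, CENTER))
        c = sum(ci * ci for ci in CENTER) - RADIUS * RADIUS
        disc = b * b - 4.0 * c
        if disc < 0.0:
            return None                       # ray misses the sphere
        t = (-b - math.sqrt(disc)) / 2.0
        return t if t > 0.0 else None

    for y in range(HEIGHT):
        row = ""
        for x in range(WIDTH):
            # Pinhole camera: map the pixel to a normalized ray direction.
            dx, dy = (x / WIDTH - 0.5) * 2.0, (0.5 - y / HEIGHT) * 2.0
            n = math.sqrt(dx * dx + dy * dy + 1.0)
            d = (dx / n, dy / n, 1.0 / n)
            t = hit(d)
            if t is None:
                row += " "                    # background
            else:
                p = tuple(t * di for di in d)                           # hit point
                nrm = tuple((pi - ci) / RADIUS for pi, ci in zip(p, CENTER))
                shade = max(0.0, sum(ni * li for ni, li in zip(nrm, LIGHT)))
                row += " .:-=+*#%@"[min(9, int(shade * 10.0))]          # brighter = denser glyph
        print(row)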

    The all-important price is another unknown. A poster on Baidu recently claimed it would be as low as $649. While that is a nice thought, it seems pretty unlikely the card will be so (comparatively) cheap. The same person also claims the card will have 3,072 CUDA cores with a 1,920MHz base clock and a 2,500MHz boost clock, though something closer to a 1.7GHz base / 1.95GHz boost is more likely. The GTX 1080, for comparison, comes with 2,560 CUDA cores and a boost clock of 1,733MHz.

    We do know that Turing GPUs are built on the 12nm FinFET process and will use GDDR6 memory. The RTX 2080 is expected to come with between 8GB and 16GB of 14Gbps GDDR6 and have a TDP of 170W to 200W. Peak FP32 compute performance, meanwhile, is thought to be around 12.2 TFLOPS, putting it slightly above the GTX 1080 Ti's 11.3 TFLOPS.
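
    Those TFLOPS figures are easy to sanity-check: peak FP32 throughput is conventionally estimated as two floating-point operations (one fused multiply-add) per CUDA core per clock. A few lines of Python, fed the rumored RTX 2080 specs above (rumors, not confirmed numbers), land right around the expected figure:

    # Peak FP32 ~ 2 ops (one fused multiply-add) per CUDA core per clock.
    def peak_fp32_tflops(cuda_cores, boost_ghz):
        return 2 * cuda_cores * boost_ghz / 1000.0

    print(f"GTX 1080:           {peak_fp32_tflops(2560, 1.733):.1f} TFLOPS")  # ~8.9
    print(f"GTX 1080 Ti:        {peak_fp32_tflops(3584, 1.582):.1f} TFLOPS")  # ~11.3
    print(f"RTX 2080 (rumored): {peak_fp32_tflops(3072, 1.950):.1f} TFLOPS")  # ~12.0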

    Thankfully, we only have a few days left before Nvidia reveals all about the RTX 2080.

  2. EEatGDL

    EEatGDL TS Evangelist Posts: 593   +275

    It was a great presentation as usual. I always get excited because of Jensen's excitement. He may not be the best presenter, and people may be tired of his jacket, but I like his presentation format and he's clearly comfortable with it.
     
    davislane1 likes this.
  3. davislane1

    davislane1 TS Grand Inquisitor Posts: 5,198   +4,302

    I enjoy his presentations. He has a passion for the technology that few of the other executives display on stage.
     
    Reehahs likes this.
  4. ForgottenLegion

    ForgottenLegion TS Maniac Posts: 241   +230

    One thing I have to say is that Nvidia haven't rested on their laurels despite the lack of competition; they've kept pushing the technology forward each generation with meaningful performance gains.
    I can't say the same about Intel with their pathetic 5% IPC improvement each year.
     
  5. Lew Zealand

    Lew Zealand TS Enthusiast Posts: 58   +42

    Each year? What year? The last time Intel gave us an IPC bump was Skylake, 3 years ago.

    It appears that adding more video cores for increased parallelization is easier than making each core more efficient. I wonder what Nvidia's or AMD's video card IPC improvements have been for each roughly two-year generation?
     
    Reehahs likes this.
  6. EEatGDL

    EEatGDL TS Evangelist Posts: 593   +275

    NVIDIA has been able to push higher frequencies each generation; they haven't hit a wall there like Intel. They also keep adding more cores and improving many data paths. So right now they don't need to focus much on IPC, at least until they start hitting "walls" (hard limits).
    The problem with Intel is that frequency bumps have been minimal between generations, the core count stayed flat for the last seven generations, and there has been zero IPC improvement across Skylake, Kaby Lake (Skylake at higher frequencies), and at least Coffee Lake (Kaby Lake with more cores).

    So I don't know about you, but if you can get a 50% performance increase from each new generation under similar power and price constraints (I'm assuming that), then I don't care by what means they achieve it. That wasn't the case with Intel before the arrival of Ryzen.
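
    A back-of-the-envelope way to see that argument, with all numbers invented purely for illustration: for workloads that parallelize well, delivered throughput scales roughly with IPC x clock x cores, so higher clocks and more cores can carry a generation even with flat IPC.

    # Rough throughput model: perf ~ IPC * clock * cores (parallel workloads).
    # All inputs are made up to show the shape of the argument.
    def relative_perf(ipc, clock_ghz, cores):
        return ipc * clock_ghz * cores

    last_gen = relative_perf(ipc=1.0, clock_ghz=1.6, cores=2560)
    this_gen = relative_perf(ipc=1.0, clock_ghz=1.9, cores=3072)  # IPC unchanged
    print(f"gen-over-gen gain: {this_gen / last_gen - 1:.0%}")    # roughly +42%, zero IPC gain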
     
    Reehahs and Roman Architect like this.
  7. ForgottenLegion

    ForgottenLegion TS Maniac Posts: 241   +230

    It sounds like you're arguing against my comment when we're actually in agreement. Reread what I wrote.
     
  8. Brock Kane

    Brock Kane TS Maniac Posts: 227   +127

    KC & The Sunshine Band! - Boogie Shoes! Nice!
     
  9. Lew Zealand

    Lew Zealand TS Enthusiast Posts: 58   +42

    Not arguing, definitely in agreement, and going further: Intel hasn't even delivered their meager 5% IPC improvements in recent gens.
     
    ForgottenLegion likes this.
  10. techdaz

    techdaz TS Rookie

    It's nice to see ray tracing technology and GPUs being introduced now instead of five years from now. This head start will allow game developers to implement the technology in games on a massive scale, without delaying it for another console cycle. I wonder if next-gen consoles will have this tech in them.
     
  11. failquail

    failquail TS Member Posts: 23   +8

    What worries me is Nvidia doing their usual thing and restricting their ray tracing tech to proprietary status, much like they did with PhysX.

    Meaning that because it'll be restricted (or at least performance-crippled) tech for a significant portion of the market, no game will be able to be built around it, and it'll be relegated to optional eye candy in select games for select Nvidia GPUs...

    End result: Nvidia again restricting the development of new technologies on the PC in favour of their profit margin...
     
    Last edited: Aug 18, 2018
    Roman Architect likes this.
  12. Boilerhog146

    Boilerhog146 TS Evangelist Posts: 615   +214

    Sounds like business as usual: they've got the shareholders dancing, and AMD and Intel have their work cut out for them. Won't be putting one of these on a CPU. It will be NVLink instead of SLI, I can only guess. I want a couple already. 4K gaming coming right up.

    Now, a decent 4K 34" ultrawide, 144Hz, curved display to replace my 30" Dell and I may keep gaming.
    I just hope some good games come out shortly thereafter, with none of this pay-to-play/win, lootboxes, DLC, etc., etc. I'll pay for a good, COMPLETE game, on a disc, in a box, with no DRM.

    Quadro is already in devs' hands, so games should already be in the works. Bring it on.
     
    Reehahs likes this.
  13. vexcro

    vexcro TS Rookie

    Guess what RTX is? A Radeon TX.
     
    SantistaUSA and Reehahs like this.
  14. cliffordcooley

    cliffordcooley TS Guardian Fighter Posts: 10,448   +4,336

    Something is screwy with those times. The timer is counting down to 11AM Central. That would be 12 noon Eastern and 4PM GMT; that is what the time zone conversion is telling me, anyway. Regardless, there need to be two time zones between ET and PT, not just one.
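
    (For reference, the conversion is easy to double-check with Python's standard-library zoneinfo module, available since Python 3.9:)

    from datetime import datetime
    from zoneinfo import ZoneInfo

    # 9am Pacific on the keynote date, converted to the zones discussed above.
    start = datetime(2018, 8, 20, 9, 0, tzinfo=ZoneInfo("America/Los_Angeles"))
    for zone in ("America/Chicago", "America/New_York", "Etc/GMT"):
        print(zone, start.astimezone(ZoneInfo(zone)).strftime("%I:%M %p"))
    # America/Chicago   11:00 AM  (Central)
    # America/New_York  12:00 PM  (Eastern)
    # Etc/GMT           04:00 PM  (GMT)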
     
  15. HyPeroxya

    HyPeroxya TS Enthusiast Posts: 50

    Counting down to 5pm GMT (17:00) here, so yes, they have got that time machine working, but only in a forward direction... which negates the "kill your mother" conundrum.
     
  16. Julio Franco

    Julio Franco TechSpot Editor Posts: 7,900   +1,113

    You're correct :) Times are fixed now. Thanks @HyPeroxya, too.
     
    cliffordcooley likes this.
  17. SantistaUSA

    SantistaUSA TS Booster Posts: 104   +27

    Awesome presentation but I want to see some benchmarks! :D
     
  18. texasrattler

    texasrattler TS Evangelist Posts: 400   +140

    Turing looks good. Prices are as expected: fairly reasonable. We'd all like cheaper, but we know what to expect.
    I'm excited. Ready to buy or build a new computer.
     
  19. Burty117

    Burty117 TechSpot Chancellor Posts: 3,304   +1,072

    I'll wait for some benchmarks, but it does look promising; it seems Nvidia is flexing its collaboration with game devs to get the technology into games. I'm skeptical about how well it actually runs while playing a game, since they didn't show off any gameplay. Sure, we had the in-engine demos for BF5 and Tomb Raider, but no actual gameplay. Am I even going to notice buildings blowing up and being properly reflected in the glass of the vehicles around me while I'm running around trying not to get sniped? Tomb Raider, though: it definitely helps with the atmosphere. And I didn't think Metro could get much prettier, but Nvidia managed it, I'll give them that.

    I guess I just need to wait for HardReset to join the conversation and let me know that AMD's 7970 had this tech years ago and everyone who buys into Nvidia's 20xx line is just wrong :D
     
  20. harm9963

    harm9963 TS Enthusiast Posts: 27

  21. MonsterZero

    MonsterZero TS Evangelist Posts: 502   +271

    For an additional $200, the 2080 Ti version had better have like 30% more performance than the 2080.

    I feel crazy for paying $1000 for a video card.
     
  22. Julio Franco

    Julio Franco TechSpot Editor Posts: 7,900   +1,113
