
Next-gen GeForce 11/20-series cards will reportedly use GDDR6, could be unveiled this...

By midian182 · 38 replies
Apr 1, 2018
  1. Nvidia might have revealed some amazing hardware at its recent GTC event, including the $400K DGX-2 supercomputer and the $9000 Quadro GV100 GPU, but there was no mention of any of its upcoming gaming cards. However, one memory supplier has revealed that the next GeForce generation will use GDDR6 and could be announced sometime in July or August.

    South Korean semiconductor supplier SK Hynix told Gamers Nexus that its GDDR6 memory is set to reach mass production in three months and will be used in several of Nvidia’s products, such as autonomous vehicle components and gaming GPUs.

    This doesn’t mean the new cards will launch as soon as this summer, but it certainly suggests we won’t see them any time before July. It's likely that Nvidia will reveal precisely when they’ll arrive and further details of the GeForce 11 (or GeForce 20) line in just a few months.

    GDDR6, which will reportedly be found in Nvidia’s Tesla and Quadro parts as well as its new GeForce cards, will boast per-pin transfer speeds of up to 16 Gb/s. It will run at 1.35V and be around 20 percent more expensive to produce than current GDDR5, so expect the 1170 and 1180 (or 2070 and 2080) to cost noticeably more than the 10-series launch prices.
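As a rough sanity check on those numbers, total memory bandwidth is just the per-pin data rate times the bus width. A minimal sketch, assuming a hypothetical 256-bit bus like the GTX 1080's (Nvidia has not confirmed the bus width of the next-gen cards):

```python
# Theoretical memory bandwidth = per-pin data rate x bus width.
# 16 Gb/s per pin is the GDDR6 figure from SK Hynix; the 256-bit
# bus is an assumption borrowed from the GTX 1080.

def bandwidth_gbps(pin_rate_gbit: float, bus_width_bits: int) -> float:
    """Return theoretical bandwidth in GB/s (8 bits per byte)."""
    return pin_rate_gbit * bus_width_bits / 8

gddr6 = bandwidth_gbps(16, 256)   # 512.0 GB/s
gddr5x = bandwidth_gbps(10, 256)  # 320.0 GB/s (GTX 1080 launch spec)
print(f"GDDR6 @ 256-bit: {gddr6:.0f} GB/s vs GDDR5X: {gddr5x:.0f} GB/s")
```

On those assumed figures, GDDR6 would be a roughly 60 percent bandwidth jump over the GTX 1080's GDDR5X without widening the bus.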

    SK Hynix confirmed the GDDR6 chips would come in 1GB and 2GB densities, and that the new line of cards will feature up to 16GB of video RAM—more than the Nvidia Titan Xp’s 12GB. It’s speculated that the 1170/2070 and 1180/2080 could include 8GB and 16GB RAM, respectively.
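The 8GB/16GB speculation follows directly from those densities: GDDR6 chips present a 32-bit interface, so a given bus width fixes the chip count, and the chip density then fixes the total. A minimal sketch (the 256-bit bus is an assumption matching the GTX 1070/1080, not anything SK Hynix confirmed):

```python
# VRAM capacity = number of chips x chip density.
# Each GDDR6 chip has a 32-bit interface, so chips = bus_width / 32.
# The 256-bit bus width is assumed, not confirmed.

def vram_gb(bus_width_bits: int, chip_density_gb: int) -> int:
    """Total VRAM in GB for a given bus width and per-chip density."""
    chips = bus_width_bits // 32
    return chips * chip_density_gb

print(vram_gb(256, 1))  # 8 GB with 1GB chips  (rumored 1170/2070)
print(vram_gb(256, 2))  # 16 GB with 2GB chips (rumored 1180/2080)
```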

    The news hasn’t cleared up the confusion over Nvidia's Ampere and Turing architecture roadmap. Some still suspect that the former could be the name used for its upcoming gaming-based cards, while the latter may be designed for cryptomining GPUs. Hopefully, things will be a lot clearer by the time summer rolls around.


     
  2. TomSEA

    TomSEA TechSpot Chancellor Posts: 3,129   +1,635

    Well this is all very interesting and exciting except PC gamers and building enthusiasts won't be able to get their hands on these cards due to the obnoxious crypto-miners.
     
  3. Nocturne

    Nocturne TS Maniac Posts: 201   +102

    Honestly I'm a bit more worried about ray tracing. I almost get the feeling that Nvidia discontinuing SLI support means that, due to the heavy compute needed, we may be in for another solution where we need a dedicated compute card like the old PPU. Otherwise we'll be paying premiums for higher-end cards that are more in the vein of previous Titan cards with high compute.
     
  4. Nobina

    Nobina TS Evangelist Posts: 2,000   +1,536

    Crypto miners shouldn't be a problem for long since they are rapidly becoming homeless.
     
  5. Sausagemeat

    Sausagemeat TS Maniac Posts: 409   +205

    Nvidia releasing new GPUs despite a complete lack of competent competition? I'm all ears! This will be my new GPU - the 16GB version. I will fight anyone for one!
     

  6. Edit: Misread the first sentence. The dropping of SLI support isn't something to worry about. The new link tech they showed off at GTC is a superior replacement.
     
    Sausagemeat likes this.
  7. Dustyn

    Dustyn TS Booster Posts: 104   +40

    Crypto-miners spoil it for everyone. Everyone loses... :(
     
    MonsterZero likes this.
  8. Sausagemeat

    Sausagemeat TS Maniac Posts: 409   +205

    I imagine/hope there is going to be a pre-order scheme limiting the cards to something like 2 per customer. I know it's not ideal, but if you are serious about getting one then you should be able to source one. I'm tempted to email Nvidia about it now and ask them to take my money! Especially after the shortage that occurred when the 10-series dropped, and there was no mining crisis then!
     
  9. Purchased a 1080ti for mining. So far, winning bigly. Also runs games pretty well...at 100FPS...at 3440x1440...on ultra.

    I will sell it to you for $2500.
     
  10. Sausagemeat

    Sausagemeat TS Maniac Posts: 409   +205

    If you are being serious what algorithm are you using on your 1080 ti? I read they can only do about 37MH/s on Eth which makes it kinda terrible at mining Eth for the amount of money they cost, even at MSRP.
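For context on why 37 MH/s looks underwhelming, expected mining revenue scales linearly with your share of the network hashrate. A rough sketch with illustrative numbers (the network hashrate, block reward, and block time below are assumptions for the example, not current figures):

```python
# Rough expected Ethereum mining revenue for a single card.
# All network/market figures here are illustrative assumptions.

def daily_eth(my_mhs: float, network_ghs: float,
              block_reward_eth: float = 3.0,
              block_time_s: float = 15.0) -> float:
    """Expected ETH/day = (my share of network hashrate) x (daily rewards)."""
    blocks_per_day = 86_400 / block_time_s
    share = my_mhs / (network_ghs * 1_000)  # convert GH/s to MH/s
    return share * blocks_per_day * block_reward_eth

# A 1080 Ti at ~37 MH/s against an assumed 250 TH/s (250,000 GH/s) network:
eth_per_day = daily_eth(37, 250_000)
print(f"{eth_per_day:.5f} ETH/day")
```

At those assumed figures the card earns a fraction of a thousandth of an ETH per day, which is why the up-front cost of a 1080 Ti makes it a poor pure-Eth miner compared with cheaper cards.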
     
    alabama man likes this.
  11. Nocturne

    Nocturne TS Maniac Posts: 201   +102


    Ah, didn't see it ty I'll have to do some reading. Something tells me the next couple years will be interesting.
     
  12. I only mine non-mainstream coins, for the reason listed above. "Popular" coins like BTC, ETH, etc. only turn a decent profit when they are in the stratosphere due to large-scale miners running up the difficulty. In any event, I use a 1080ti and a 1070 and they yield approx. 2.9 GH/s, which translates to about one hundred coins per month running 12/7. The 1080ti itself has a hash rate of whatever the boost clock decides to run at, which is usually between 1910-1920. The 1070 only manages about 980 MH/s.

    And before anyone asks: no, I'm not going to post what these coins are on a high traffic tech site.
     
  13. See:


    The scaling back of SLI and hype about ray tracing makes me think they intend to replace SLI with their NVLink 2 bridge. This makes sense from what's said in the video, as he mentions no software support will be needed.
     
  14. Theinsanegamer

    Theinsanegamer TS Evangelist Posts: 1,570   +1,787

    I was guessing something to this regard when they said "NVLink 2 can keep two HBM Quadros' memory synced," which would suggest NVLink 2 has ridiculously high bandwidth compared to SLI.

    True dual-GPU rendering, as opposed to the two synced GPUs that SLI gave us, would be great for pushing 4K 144Hz monitors with maximum details enabled. As far as I'm concerned, we will need such technology as GPU release schedules get longer and longer (Pascal is almost 2 years old already).
     
    davislane1 likes this.
  15. Two words: linear scaling.

    Every gamer's dream.
     
  16. Knot Schure

    Knot Schure TS Addict Posts: 276   +118

    Yeah, the Mrs and I just picked up our FIFTH title deed, and last week I added another batch of GPUs to my setup.

    I feel so 'homeless' with a key ring full, like a jailer...

    :)

    Therefore I wonder, from where your comment might come, and who would upvote it? I guess it doesn't matter too much.

    Life is a game, money is how we keep score.
     
  17. Nobina

    Nobina TS Evangelist Posts: 2,000   +1,536

    Thought you crypto dudes don't like money.
     
  18. Knot Schure

    Knot Schure TS Addict Posts: 276   +118

    Yeah, it makes me really unhappy...

    Please send me all you have, so I can remain in misery.
     
  19. penn919

    penn919 TS Maniac Posts: 273   +158

    I figure Cryptonight
     
  20. pcnthuziast

    pcnthuziast TS Evangelist Posts: 607   +204

    I wonder how many people are expecting prices to stabilize or return to levels they're used to. I expect the 3 tiers to be priced at 500, 1000, 1500 (60/70/80).
     
  21. wiyosaya

    wiyosaya TS Evangelist Posts: 4,119   +2,406

    "Could be unveiled" is not the same as "will be." I am still in the "nVidia does not give a crap about gamers" camp at the moment.
     
  22. Nocturne

    Nocturne TS Maniac Posts: 201   +102

    NVLink 2 is a supercomputer interconnect; NVLink provides a direct link to IBM's Big Iron processors and replaces PCIe as an interconnect.
    https://www.fudzilla.com/news/graphics/41420-nvidia-nvlink-2-0-arrives-in-ibm-servers-next-year

    As for the relative power of the GPU, there is zero reason for having more than what Nvidia gave us; the limit was mainly bandwidth constraints from HDMI and DisplayPort, unless you want really craptastic color compression, no audio, and latency issues. Also, I don't know if you ran them, but 3dfx's original SLI wasn't really all that great; I still remember the tearing mid-screen where the GPU render lines met.
     
  23. Evernessince

    Evernessince TS Evangelist Posts: 4,083   +3,629

    This is a good analogy for how you can win every game in your life and yet still be the most miserable person in the world.

    Just remember that money is a tool, not an end goal. Just collecting it without a set use is like going to Home Depot every week and buying a hammer you will never use.
     
  24. Potato Judge

    Potato Judge TS Booster Posts: 161   +73

    Edit :p

    Anyone know a game that uses more than 5GB of VRAM? I'm gaming at 1440p with a 1070, and most games seem to like using all your RAM first before using VRAM, which is really annoying.
     
  25. Evernessince

    Evernessince TS Evangelist Posts: 4,083   +3,629

    The new Mirror's Edge uses around 11GB on the highest settings.
     
