Next-gen GeForce 11/20-series cards will reportedly use GDDR6, could be unveiled this...

midian182

Staff member

Nvidia may have revealed some amazing hardware at its recent GTC event, including the $400K DGX-2 supercomputer and the $9,000 Quadro GV100 GPU, but there was no mention of any of its upcoming gaming cards. However, one memory supplier has revealed that the next GeForce generation will use GDDR6 and could be announced sometime in July or August.

South Korean semiconductor supplier SK Hynix told Gamers Nexus that its GDDR6 memory is set to reach mass production in three months and will be used in several of Nvidia’s products, such as autonomous vehicle components and gaming GPUs.

This doesn’t mean the new cards will launch as soon as this summer, but it certainly suggests we won’t see them any time before July. It's likely that Nvidia will reveal precisely when they’ll arrive and further details of the GeForce 11 (or GeForce 20) line in just a few months.

GDDR6, which will reportedly be found in Nvidia’s Tesla and Quadro parts as well as its new GeForce cards, will boast per-pin transfer rates of up to 16 Gbps. It will run at 1.35 V and be 20 percent more expensive to produce than current GDDR5 memory, so expect the 1170 and 1180 (or 2070 and 2080) to cost noticeably more than the 10-series launch prices.
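For context, a card’s total memory bandwidth is just the per-pin rate multiplied by the bus width. A back-of-the-envelope sketch (the 256-bit bus below is an assumption carried over from the GTX 1080, not a confirmed spec):

```python
# Total bandwidth (GB/s) = bus width (bits) * per-pin rate (Gbps) / 8.
# The 256-bit bus is an assumed figure based on the GTX 1080, not a spec.
def memory_bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    return bus_width_bits * pin_rate_gbps / 8

print(memory_bandwidth_gb_s(256, 16.0))  # 512.0 GB/s with 16 Gbps GDDR6
print(memory_bandwidth_gb_s(256, 10.0))  # 320.0 GB/s, the GTX 1080's GDDR5X, for comparison
```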

SK Hynix confirmed the GDDR6 chips would come in 1GB and 2GB densities, and that the new line of cards will feature up to 16GB of video RAM—more than the Nvidia Titan Xp’s 12GB. It’s speculated that the 1170/2070 and 1180/2080 could include 8GB and 16GB RAM, respectively.
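Those speculated capacities fall straight out of the chip densities: each GDDR6 chip presents a 32-bit interface, so a hypothetical 256-bit bus (again an assumption, not a confirmed spec) holds eight chips:

```python
# How chip density maps to total VRAM. The 256-bit bus is hypothetical;
# GDDR6 chips each use a 32-bit interface.
BUS_WIDTH_BITS = 256
CHIP_INTERFACE_BITS = 32
chips = BUS_WIDTH_BITS // CHIP_INTERFACE_BITS  # 8 chips

for density_gb in (1, 2):
    print(f"{density_gb} GB chips -> {chips * density_gb} GB total VRAM")
# 1 GB chips -> 8 GB  (the rumored 1170/2070 configuration)
# 2 GB chips -> 16 GB (the rumored 1180/2080 configuration)
```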

The news hasn’t cleared up the confusion over Nvidia's Ampere and Turing architecture roadmap. Some still suspect that the former could be the name used for its upcoming gaming cards, while the latter may be destined for cryptomining GPUs. Hopefully, things will be a lot clearer by the time summer rolls around.


 
Honestly I'm a bit more worried about ray tracing. I almost get this feeling that Nvidia discontinuing SLI support means that, with the heavy compute needed, we may be in for another solution where we need a compute card like the old PPU. Otherwise we will be paying premiums for higher-end cards that are more in the vein of previous Titan cards, with high compute.
 
Nvidia releasing new GPUs despite a complete lack of competent competition? I'm all ears! This will be my new GPU - the 16GB version. I will fight anyone for one!
 
Honestly I'm a bit more worried about ray tracing. I almost get this feeling that Nvidia discontinuing SLI support means that, with the heavy compute needed, we may be in for another solution where we need a compute card like the old PPU. Otherwise we will be paying premiums for higher-end cards that are more in the vein of previous Titan cards, with high compute.


Edit: Misread the first sentence. The dropping of SLI support isn't something to worry about. The new link tech they showed off at GTC is a superior replacement.
 
Crypto-miners spoil it for everyone. Everyone loses... :(
I imagine/hope there is going to be a pre-order scheme limiting the cards to something like 2 per customer. I know it's not ideal, but if you are serious about getting one then you should be able to source one. I'm tempted to email Nvidia about it now and ask them to take my money! Especially after the shortage that occurred when the 10 series dropped, and there was no mining crisis then!
 
Crypto-miners spoil it for everyone. Everyone loses... :(

Purchased a 1080ti for mining. So far, winning bigly. Also runs games pretty well...at 100FPS...at 3440x1440...on ultra.

I will sell it to you for $2500.
 
Purchased a 1080ti for mining. So far, winning bigly. Also runs games pretty well...at 100FPS...at 3440x1440...on ultra.

I will sell it to you for $2500.
If you are being serious, what algorithm are you using on your 1080 ti? I read they can only do about 37MH/s on Eth, which makes them kinda terrible at mining Eth for the amount of money they cost, even at MSRP.
 
Honestly I'm a bit more worried about ray tracing. I almost get this feeling that Nvidia discontinuing SLI support means that, with the heavy compute needed, we may be in for another solution where we need a compute card like the old PPU. Otherwise we will be paying premiums for higher-end cards that are more in the vein of previous Titan cards, with high compute.


Edit: Misread the first sentence. The dropping of SLI support isn't something to worry about. The new link tech they showed off at GTC is a superior replacement.


Ah, didn't see it, ty. I'll have to do some reading. Something tells me the next couple of years will be interesting.
 
If you are being serious, what algorithm are you using on your 1080 ti? I read they can only do about 37MH/s on Eth, which makes them kinda terrible at mining Eth for the amount of money they cost, even at MSRP.

I only mine non-mainstream coins, for the reason listed above. "Popular" coins like BTC, ETH, etc. only turn a decent profit when they are in the stratosphere, due to large-scale miners running up the difficulty. In any event, I use a 1080ti and a 1070 and they yield approx. 2.9 GH/s, which translates to about one hundred coins per month running 12/7. The 1080ti itself has a hash rate that tracks whatever the boost clock decides to run at, usually between 1910 and 1920 MH/s. The 1070 only manages about 980 MH/s.

And before anyone asks: no, I'm not going to post what these coins are on a high traffic tech site.
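For anyone wondering how yields like that are estimated, it is the standard expected-reward arithmetic: your share of the network hashrate times the rewards paid out while you run. A minimal sketch (every figure below is a made-up placeholder, not the poster's coin or network):

```python
# Expected mining yield. All parameters are hypothetical placeholders;
# plug in real network stats for a real coin.
def coins_per_month(my_hashrate, network_hashrate, block_reward,
                    block_time_s, hours_per_day=12, days=30):
    blocks_while_running = (hours_per_day * 3600 * days) / block_time_s
    return (my_hashrate / network_hashrate) * blocks_while_running * block_reward

# 2.9 GH/s on a hypothetical 6 TH/s network, 10-coin reward, 60 s blocks, 12/7
print(coins_per_month(2.9e9, 6e12, 10, 60))  # ~104 coins/month, the same ballpark
```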
 
Ah, didn't see it, ty. I'll have to do some reading. Something tells me the next couple of years will be interesting.

See:

The scaling back of SLI and the hype about ray tracing make me think they intend to replace SLI with their NVLink 2 bridge. This makes sense from what's said in the video, as he mentions no software support will be needed.
 
Ah, didn't see it, ty. I'll have to do some reading. Something tells me the next couple of years will be interesting.

See:

The scaling back of SLI and the hype about ray tracing make me think they intend to replace SLI with their NVLink 2 bridge. This makes sense from what's said in the video, as he mentions no software support will be needed.
I was guessing something along these lines when they said "NVLink 2 can keep two HBM Quadros' memory synced", which would suggest NVLink 2 has ridiculously high bandwidth compared to SLI.

True dual-GPU rendering, as opposed to the two synced GPUs that SLI gave us, would be great for pushing 4K144 monitors with maximum details enabled. As far as I'm concerned, we will need such technology as GPU release schedules get longer and longer (Pascal is almost two years old already).
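The bandwidth gap behind that guess is real: Nvidia's published figure for NVLink 2.0 is 25 GB/s per direction per link, with up to six links on the V100 GPU, while the Pascal SLI HB bridge is commonly cited at around 4 GB/s. A rough comparison (how many links any future GeForce card might expose is pure speculation):

```python
# Published per-link NVLink 2.0 figures vs. the commonly cited SLI HB
# bridge throughput. Link counts on future GeForce cards are unknown.
NVLINK2_PER_LINK_GB_S = 25   # GB/s per direction, per link
V100_LINKS = 6               # links exposed by the Tesla V100
SLI_HB_GB_S = 4              # approx. Pascal SLI HB bridge

print(f"NVLink 2, six links: {NVLINK2_PER_LINK_GB_S * V100_LINKS} GB/s per direction")
print(f"SLI HB bridge:       ~{SLI_HB_GB_S} GB/s")
# Roughly a 35-40x gap, which is what makes full memory syncing plausible.
```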
 
I was guessing something along these lines when they said "NVLink 2 can keep two HBM Quadros' memory synced", which would suggest NVLink 2 has ridiculously high bandwidth compared to SLI.

True dual-GPU rendering, as opposed to the two synced GPUs that SLI gave us, would be great for pushing 4K144 monitors with maximum details enabled. As far as I'm concerned, we will need such technology as GPU release schedules get longer and longer (Pascal is almost two years old already).

Two words: linear scaling.

Every gamer's dream.
 
Well this is all very interesting and exciting except PC gamers and building enthusiasts won't be able to get their hands on these cards due to the obnoxious crypto-miners.
Crypto miners shouldn't be a problem for long since they are rapidly becoming homeless.

Yeah, the Mrs and I just picked up our FIFTH title deed, and last week I added another batch of GPUs to my setup.

I feel so 'homeless' with a key ring full, like a jailer...

:)

Therefore I wonder, from where your comment might come, and who would upvote it? I guess it doesn't matter too much.

Life is a game, money is how we keep score.
 
I wonder how many people are expecting prices to stabilize or return to levels they're used to. I expect the three tiers to be priced at $500, $1,000, and $1,500 (60/70/80).
 
"Could be unveiled" is not the same as "will be". I am still in the "Nvidia does not give a crap about gamers" camp at the moment.
 
I was guessing something along these lines when they said "NVLink 2 can keep two HBM Quadros' memory synced", which would suggest NVLink 2 has ridiculously high bandwidth compared to SLI.

True dual-GPU rendering, as opposed to the two synced GPUs that SLI gave us, would be great for pushing 4K144 monitors with maximum details enabled. As far as I'm concerned, we will need such technology as GPU release schedules get longer and longer (Pascal is almost two years old already).

NVLink 2 is a supercomputer interconnect; NVLink is for direct links to IBM's Big Iron processors, and replaces PCIe as an interconnect.
https://www.fudzilla.com/news/graphics/41420-nvidia-nvlink-2-0-arrives-in-ibm-servers-next-year

As for the relative power of the GPU, there's zero reason to have more than what Nvidia gave us; the limit was down to bandwidth constraints, mainly from HDMI and DisplayPort. Unless, that is, you want really craptastic color compression, no audio, and latency issues. Also, I don't know if you ran it back then, but 3dfx's original SLI wasn't really all that great; I still remember the tearing mid-screen where the GPU render lines met.
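The display-bandwidth point checks out with simple arithmetic: uncompressed 4K at 144 Hz and 8-bit color needs more than either HDMI 2.0 or DisplayPort 1.4 can actually carry. A quick check (blanking intervals are ignored, so real requirements run slightly higher):

```python
# Uncompressed video bandwidth vs. cable payload limits. Blanking
# intervals are ignored, so the true requirement is slightly higher.
def required_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(f"4K @ 144 Hz, 8-bit RGB: ~{required_gbps(3840, 2160, 144):.1f} Gbps")  # ~28.7
print("HDMI 2.0 payload:        ~14.4 Gbps")  # 18 Gbps raw, 8b/10b encoding
print("DP 1.4 payload:          ~25.9 Gbps")  # 32.4 Gbps raw, 8b/10b encoding
# Neither carries it uncompressed, hence the chroma subsampling trade-offs.
```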
 
Well this is all very interesting and exciting except PC gamers and building enthusiasts won't be able to get their hands on these cards due to the obnoxious crypto-miners.
Crypto miners shouldn't be a problem for long since they are rapidly becoming homeless.

Yeah, the Mrs and I just picked up our FIFTH title deed, and last week I added another batch of GPUs to my setup.

I feel so 'homeless' with a key ring full, like a jailer...

:)

Therefore I wonder, from where your comment might come, and who would upvote it? I guess it doesn't matter too much.

Life is a game, money is how we keep score.

This is a good analogy for how you can win every game in your life and yet still be the most miserable person in the world.

Just remember that money is a tool, not an end goal. Just collecting it without a set use is like going to Home Depot every week and buying a hammer you will never use.
 
Edit :p

Anyone know a game that uses more than 5 GB of VRAM? I'm gaming at 1440p with a 1070, and most games seem to like using all your RAM first before using VRAM, which is really annoying.
 
Edit :p

Anyone know a game that uses more than 5 GB of VRAM? I'm gaming at 1440p with a 1070, and most games seem to like using all your RAM first before using VRAM, which is really annoying.

The new Mirror's Edge uses around 11 GB on the highest settings.
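For anyone who wants to measure VRAM use themselves, the nvidia-smi utility that ships with Nvidia's drivers can poll it while a game runs; a minimal sketch:

```python
# Poll GPU memory usage once per second via nvidia-smi (bundled with
# Nvidia's drivers). Leave it running in a terminal while you play.
import subprocess
import time

while True:
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    print(result.stdout.strip())  # e.g. "5123 MiB, 8192 MiB"
    time.sleep(1)
```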
 