Nvidia's GTX 2070 and 2080 cards could be unveiled next month, launch late April

midian182


We’ve been waiting for what seems like an age to see Nvidia’s next generation of GeForce graphics cards, but according to the rumor mill, they’ll finally be unveiled next month. Speaking to TweakTown, a “well-placed source” in the industry said the first public showing would take place at Nvidia’s GPU Technology Conference (GTC), which begins on March 26.

The new cards will follow the current Pascal-based 10-series with either an 11-series or 20-series naming system, which means we could see the GTX 1180 or the GTX 2080. Whatever they’re called, these new gaming GPUs are said to be based on the 12-nanometer Ampere architecture.

It was thought that the new GeForce cards would use the same Volta architecture found in the Titan V and Tesla V100, but it now looks as if Volta will be reserved for Nvidia’s top-end cards built for AI, high-performance computing (HPC), and deep learning. Some suspect that Ampere could actually be a version of Volta for gamers.

Making the situation even more confusing is Turing, which Reuters calls “The new gaming chip” that is expected to be unveiled next month. This is a completely new GPU architecture and may be used in the GeForce GTX 11-Series/20-series cards instead of Ampere. It could, however, be for something else entirely.

Rumors have been floating around for a while that Turing is actually a code name for a new version of Nvidia’s GPUs designed specifically for mining cryptocurrency. If this is true, it should help ease demand for gaming cards from miners and push down the exorbitant price of GPUs.

The new cards are expected to be released between late April and early May. Get ready to hear plenty more rumors in the run-up to GTC.


 
Paper launch. Nothing new for nVidia. Apparently, from what I've read, that goes for gamers and miners alike. It would be more interesting to learn when RAM manufacturers are finishing their new RAM and VRAM factories, because that is the limiting factor these days. Until then, PC gaming is on a slope to an agonising death.
 

Yeah, it sucks. The manufacturers are loving the short supply and high prices, but in the long run they may be doing more damage to the PC industry than they realize. The only good thing going for hardware is that VR and AR devices require pretty beefy hardware to run, so that will at least keep the hardware train producing better devices on a decent schedule.
 
Of course they are, I just bought a 1080 Ti!

Exactly why I bought a "tweaked" prebuilt (God help me) a week ago and didn't opt for anything greater than the 1070. I will likely upgrade to the higher end of these cards when they come out.
 
If that rumour about the launch date holds true, it's very probably just a 12nm version of the GTX 1080 with no architectural improvements. So we can expect the same thing that happened between Maxwell and Pascal (99% die shrink, 1% architecture), except this time the die shrink only gives a very small improvement.
 
Here we go again with AMD being the only one that innovates. My God, let them come out the door with a product before you start bashing them.

What I said is true. The GTX 1080 was a GTX 980 Ti on 16nm tech, and very little else.

So yes, AMD seems to be the only one that innovates. Without AMD we would still be stuck with DX11.

I still remember the GTX 1080 got 100/100 "(y)"
 
Here we go again with AMD being the only one that innovates. My God, let them come out the door with a product before you start bashing them.
What are AMD innovating? Ways to mess up graphics card launches?

I still don’t see the benefit of HBM2. I’d rather have more GDDR5X.
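For what it's worth, the raw bandwidth gap between the two is easy to sketch. Here's a minimal back-of-the-envelope comparison in Python, using the public spec-sheet figures for the GTX 1080 (256-bit bus, 10 Gbps GDDR5X) and Vega 64 (two 1024-bit HBM2 stacks at roughly 1.89 Gbps per pin); these are illustrative peak numbers, not benchmarks:

```python
# Back-of-the-envelope peak-bandwidth comparison: GDDR5X vs HBM2.
# Spec-sheet figures for the GTX 1080 and Vega 64; illustrative only.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: bus width (bits) * per-pin rate (Gbps) / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

gtx_1080 = peak_bandwidth_gb_s(256, 10.0)   # GDDR5X: 256-bit, 10 Gbps -> ~320 GB/s
vega_64 = peak_bandwidth_gb_s(2048, 1.89)   # HBM2: 2048-bit, ~1.89 Gbps -> ~484 GB/s

print(f"GTX 1080 (GDDR5X): {gtx_1080:.0f} GB/s")
print(f"Vega 64  (HBM2):   {vega_64:.0f} GB/s")
```

On paper HBM2 wins on raw bandwidth; the complaint above is really about cost and supply, which the peak numbers don't capture.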
Exactly why I bought a "tweaked" prebuilt (God help me) a week ago and didn't opt for anything greater than the 1070. I will likely upgrade to the higher end of these cards when they come out.
Where I live (the U.K.), a 1070 isn’t much cheaper than a 1080 Ti! In fact, the 1080 Ti and the Titan Xp are about the only cards you can get at MSRP beyond a 1050 Ti. I just looked on eBay and you can get a 1070 for under £500, used. I paid £707 for my Aorus 1080 Ti brand new.

But really I’m just sick of not being able to use my 4K monitor.

It’s a shame AMD seem to be absent this time around; if I could have had a Vega 64 for £400-£450 I would have opted for that, provided it was aftermarket.
 
DirectX 12 and Vulkan, for example.
AMD didn’t invent either of those things, and both work on Nvidia cards and always have. Mantle could be attributed to AMD, but that turned out to be more of an empty promise than a decent innovation.

Don’t get me wrong, I’m sure AMD innovate. HBM was an innovation in consumer graphics, but it didn’t really do anything for users; it just seemed to increase costs for AMD. But then you get the impression that AMD have been struggling with memory for a little while.
 

Vulkan is almost an exact copy of AMD's Mantle. Almost, because the Khronos Group cherry-picked the best parts of Mantle and renamed it Vulkan. DirectX 12 shares so many similarities with Mantle that Microsoft must have worked closely with AMD. So without Mantle there would have been neither Vulkan nor DirectX 12.

Because the Khronos Group adopted Mantle, there was no need for Mantle any more. Mantle had served its purpose.

HBM is not so useful for huge graphics boards. But imagine how well it could be used in a CPU+GPU combo. It's almost certain AMD will release a CPU+GPU+HBM combo.
 
Too bad they'll limit you to one card per pre-order!
Good thing there are several versions of each type. One model per household; they don't limit you on ordering different models. Sourcing cards is actually pretty easy if you know where to look. I might pay an extra $50 per card over current market price, but I make that back in a few weeks.

If you have a high-end graphics card and you aren't mining, you're an *****. My two 1080 Tis paid for themselves in 3 months. I bought the other 18 1070 Tis over 5 months.
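For anyone wondering how a card "pays for itself", the arithmetic is simple. Here's a minimal payback sketch in Python; every input (card price, daily revenue, power draw, electricity rate) is an assumed example value, not a figure from this thread:

```python
# Hypothetical mining payback-period sketch. All inputs are assumptions;
# plug in current market values (they swing wildly).

def payback_days(card_cost_usd: float,
                 revenue_per_day_usd: float,
                 power_watts: float,
                 electricity_usd_per_kwh: float) -> float:
    """Days until net mining income covers the card's purchase price."""
    power_cost_per_day = power_watts / 1000 * 24 * electricity_usd_per_kwh
    net_per_day = revenue_per_day_usd - power_cost_per_day
    if net_per_day <= 0:
        raise ValueError("Card never pays for itself at these rates")
    return card_cost_usd / net_per_day

# Example: a 1080 Ti at an inflated $1,000, grossing an assumed $12/day,
# drawing 250 W at $0.12/kWh.
print(f"~{payback_days(1000, 12.0, 250, 0.12):.0f} days")  # ~89 days, i.e. about 3 months
```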

I was lying about selling them, btw. I'm just going to buy 18 founders editions of whatever their newest high end card is.
 
And then the miners will grab the new products, making the new GPUs unavailable to the rest of the community.
Don't worry, I have 18 1070 Tis you can buy once they release their new cards!
:) Only a fool would buy an exhausted graphics card that has run at full load 24/7 while being suffocated like canned sardines; the buyer would be lucky if it kept working for more than a week or two before its final gasp.
 
As long as you keep thermals in check, the idea that you can work a card to death is a myth. Also, all of my cards are underclocked and undervolted: a 30% reduction in power with less than a 5% loss in hash rate.
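That trade is easy to sanity-check: cutting power 30% while losing under 5% of hash rate is a large net gain in hashes per watt. A quick sketch, with assumed baseline numbers (roughly a stock 1070 Ti on Ethash; illustrative, not measured):

```python
# Effect of undervolting on mining efficiency (hashes per watt).
# Baseline figures are assumptions, roughly a stock GTX 1070 Ti on Ethash.

baseline_hashrate_mhs = 30.0   # MH/s, assumed stock hash rate
baseline_power_w = 180.0       # W, assumed stock power draw

# The claim above: ~30% less power for <5% less hash rate.
tuned_hashrate_mhs = baseline_hashrate_mhs * 0.95
tuned_power_w = baseline_power_w * 0.70

stock_eff = baseline_hashrate_mhs / baseline_power_w
tuned_eff = tuned_hashrate_mhs / tuned_power_w

print(f"Stock: {stock_eff:.3f} MH/s per watt")
print(f"Tuned: {tuned_eff:.3f} MH/s per watt")
print(f"Efficiency gain: {(tuned_eff / stock_eff - 1) * 100:.0f}%")  # ~36%
```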
 
Vulkan is almost an exact copy of AMD's Mantle. Almost, because the Khronos Group cherry-picked the best parts of Mantle and renamed it Vulkan. DirectX 12 shares so many similarities with Mantle that Microsoft must have worked closely with AMD. So without Mantle there would have been neither Vulkan nor DirectX 12.

Because the Khronos Group adopted Mantle, there was no need for Mantle any more. Mantle had served its purpose.

HBM is not so useful for huge graphics boards. But imagine how well it could be used in a CPU+GPU combo. It's almost certain AMD will release a CPU+GPU+HBM combo.
Hmm, I did a bit of reading about that. I would actually say it’s absurd to suggest that DX12 would not have existed without Mantle, but yes, it does appear that a lot of the groundwork for Vulkan and DX12 was done by Mantle. It actually looks more like AMD tried to keep these developments proprietary to AMD, which is a bit of a **** move. Still, Mantle was killed well over 3 years ago; hardly a recent innovation. One thing I can think of, though, is FreeSync. I love FreeSync, mostly because it’s free. Adaptive sync tech for me isn’t a game changer or anything spectacular, but at least AMD don’t charge for it!
 
If that rumour about the launch date holds true, it's very probably just a 12nm version of the GTX 1080 with no architectural improvements. So we can expect the same thing that happened between Maxwell and Pascal (99% die shrink, 1% architecture), except this time the die shrink only gives a very small improvement.
@HardReset IS BACK BABY! YEAH!

7th comment in? You're losing your touch ;)
Normally the bashing comments from you start at least 4 comments in. When are you going to give us the usual "AMD is clearly the better choice" or my personal favorite, "AMD is the most future proof choice"?

It's been so long! Welcome back HardReset :)
 