Nvidia gets ready for 7nm Ampere and 5nm next-gen Hopper, places orders with TSMC, Samsung

Well, I no longer buy their excuses, and this is coming from an AMD fan (I do "rock" a 2700X + Radeon VII). Yes, there was a mining craze and they had issues with HBM2, but the Fury X was a bad product launch with shortages and bad pumps on the AIOs, the RX 480 had a weak cooler and was limited to a single 6-pin connector which caused power draw issues, Vega 64 was a total disaster, now it's Navi and its drivers, and don't get me started on the RX 5600 XT and the BIOS mess. At some point we have to look at the pattern here and acknowledge they are fuc****** incompetent when it comes to GPU launches. This is why I kinda can't wait for the RTX 3000 series, and I might be jumping ship for a bit.
I would argue that "big Navi" is potentially AMD's return to GPU glory, so maybe it's finally the time to wait. The 5600 XT isn't that big of a deal because the price is right. However, some manufacturers are saying that their cards won't support the 14 Gbps upgrade because they didn't design the PCB to support it when released. MSI cards will likely not be getting the update.
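For a rough sense of what that 14 Gbps update actually buys, here's a quick back-of-envelope sketch; the 192-bit bus width is the 5600 XT's published spec, and the rest is plain arithmetic:

```python
# Peak GDDR6 bandwidth: per-pin data rate (Gbps) times bus width (bits), converted to bytes.
def gddr6_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

stock   = gddr6_bandwidth_gbs(12, 192)  # 288 GB/s at the original 12 Gbps
updated = gddr6_bandwidth_gbs(14, 192)  # 336 GB/s after the 14 Gbps BIOS update
print(f"{stock:.0f} GB/s -> {updated:.0f} GB/s (+{(updated / stock - 1) * 100:.0f}%)")
```

That roughly 17% extra memory bandwidth is the whole appeal of the update, which is why it stings that some PCBs apparently can't take it.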

As far as excuses go, there is nothing to "buy". The inconvenience is the same, but it's very plainly not AMD's fault. And the GPU issues they have had are forgivable because they diverted resources from GPUs to bring us Ryzen. They took on Intel, and I'm going to say they won, because they're gaining market share like crazy while Intel is dropping prices. Now they're ready to take on nVidia.

I have been buying nVidia cards since the 7800 GS, but I'm really looking forward to seeing what AMD is going to be coming out with. I might wait, but my next GPU purchase is entirely tied to Cyberpunk 2077. I don't think my 1070 Ti is going to cut it at 4K60 for 2077, and it is probably the first game that I'm willing to pay a heavy premium to max out.
 
I have been 50/50 on GPUs between ATi/AMD and nVidia, but after the whole DX11 and tessellation thing, and later on GamesNOTwork, I decided to stop buying nVidia new, and my last 4 cards were AMD (I did have a second-hand 980 Ti for a month). Yeah, I'm in the same boat; I will never get 4K60 at Ultra with my Radeon VII, so whatever can run it better will probably end up in my case.
 
My issue is that Cyberpunk comes out in September, but these cards aren't coming until this fall. It could be two months after release before we see them hit the shelves. By then my first playthrough of Cyberpunk will be over, and thus my reason for upgrading. It was one thing when the game was supposed to be released in March and was ~half a generation away from new graphics cards. At this point, I'm kinda hoping it gets delayed again so it can get released alongside the new cards. I could always "wait" until the cards come out to play it, but realistically I lack the self-control to do that. It's also fun when a game first comes out and you can be part of all the buzz of everyone playing it. "Oh yeah, I made a blah blah blah and did it that way." "Oh cool, I'll have to try that on my second playthrough."

I'll use Skyrim as an example. I stayed up and played the game at midnight when it unlocked on Steam. Within 30 minutes of the game being released, I not only experienced joining the Nordic space program when I met giants for the first time, I authentically experienced the "arrow to the knee" meme, probably before the first one was posted on the internet. It's being part of something in a special way, and that is more important to me than the graphics settings I play it at. I want to crank it up all the way, but probably won't. I currently use a 65" 4K TV as a monitor, but I'm replacing it with a 1440p 120/144 Hz monitor. I've been eyeing high-refresh-rate gaming TVs to use as a monitor, but the barrier to entry is about 10X the cost of a monitor.
 
The RTX 3070 & 3080 are coming before September, I think. With Ampere rumored to get 4x the ray tracing performance, I can see the RTX 3070 being faster than a 2080 Ti with RTX on.

There is nothing like playing a game on its release date. I still remember playing The Witcher 3 with my Titan X Maxwell; I got about 100 hours in before the game was patched to play properly on lower-end graphics cards :D.
 
nVidia has delayed the release of the cards due to the coronavirus. They haven't said anything about the release date, aside from that they are trying to release them this fall.
 
Are we sure "Hopper" is a well-thought-out moniker for this release?

It is, after all, a common colloquialism for a toilet. :rolleyes:
 
Because Nvidia has annual revenues in excess of $10 billion (net income over $4b), despite their prices :)


AMD have been successful with their CPUs because (a) they're good and (b) they engineered a product design that could be easily scaled over the years; the fundamental layout of Intel's Skylake isn't particularly scalable (unless you want CPUs as long as a hot dog). As for their GPU redesign, that was very much done because of the consoles: making one chip and selling it in the millions over several years is a better deal than aiming only at the fickle and uncertain PC market. Don't get me wrong: Navi is a great design, but it absolutely shouts 'console' (not in any kind of a bad way, though).

Now, given that Nvidia's console portfolio consists of their own Shield and the Switch (one a huge seller, the other... umm... not), neither of which uses anything special, there's no major competition for them here, and no reason for them to try to be competitive, as AMD have the rest of the console market sewn up. Certainly a new console, especially a super powerful one, is going to hit PC and, to a lesser extent, graphics card sales, so this is going to be something they're keeping their eye on.

But I don't think either the consoles or Navi are worrying them too much, even with the relatively weak Turing sales. This is because RTX v2.0 is likely to be a significant improvement on what's currently possible, and this is all by design. What better way to make your new product look so much better than its predecessor than by having the first release of the new technology be a very mixed bag of fortunes (great visuals, awful performance, only works well on really expensive products)?

RT cores don't take up a huge amount of die space, less than 10% of an SM's overall area, so increasing the number of them is cheap, especially if you're transitioning to a smaller node. Current GPUs have more than enough compute capability for games; they just need more bandwidth (internally and externally) and a fancy new rendering technology. In previous years, this was hardware T&L, vertex and pixel shaders, tessellation, and so on; now it's all about ray tracing.
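To put that scaling argument in rough numbers, here's a minimal sketch; every figure in it is an assumption for illustration, not a measured value:

```python
# Back-of-envelope cost of doubling the RT hardware per SM.
# All figures below are illustrative assumptions, not measured values.
rt_share_of_sm  = 0.08  # assume an RT core is ~8% of an SM (the claim above says <10%)
sm_share_of_die = 0.50  # assume SMs make up about half of the total die area

sm_growth  = rt_share_of_sm               # each SM grows by its RT share
die_growth = sm_growth * sm_share_of_die  # only the SM portion of the die grows

print(f"Doubling RT cores costs roughly {die_growth * 100:.0f}% extra die area")
```

A few percent of extra die area is easy to swallow on a node shrink, which is the point.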

And for that, you need specialised ASICs, which of course, Nvidia already has. We know that AMD are following suit in the consoles, so they will in PC models too, as are Intel. Nvidia have a notable head start here, despite the sales and the iffy performance; this is why 'RTX on' has been marketed so much. Keep the consumer's eye on the brand label so that when the new batch of GPUs comes out, more people will assume that only RTX means ray tracing.


I hope you're right, I really do, but I fear that Nvidia fancy themselves a slice of the cake that Apple has made for itself.

You are right: RTX performance is easily scalable, and the visual impact it brings to games is noticeable even to novice eyes (with Ampere rumored to get 4x the ray tracing performance of Turing). Rasterization, meanwhile, has already passed its point of diminishing returns; I highly doubt a 5600 XT and a 5700 XT, when tuned to the same FPS via in-game graphics settings (the 5600 XT running High while the 5700 XT runs Ultra), are that easy to tell apart...

If Nvidia can make a pixelated game like Minecraft look good with RTX, there is nothing stopping indie developers from making their low-budget games look lit with RTX.
 
You might have to wait another generation before that happens.

{ snip}

& post #19

My friend....
You seem to be making those predictions in a static world... and pay no attention to the fact that the RDNA2 consoles are coming, which have more graphics power than nVidia's $800 dGPU!

You make little mention of this and dismiss it completely. Why...?


As someone mentioned, everything you've just stated was said in a vacuum and is missing the context... that nVidia has competition in the high-end space for the first time... where before, they never have had any.

So nVidia cannot use the same business model as they once had, or they will get jebaited once more by RDNA2. nVidia is going to have to reduce all their RTX cards by 25% just to stay competitive with the Xbox/PS5. The 2080 Ti will be selling for under $800 soon, because nVidia doesn't want to get laughed out of town when the consoles land.


Subsequently, nVidia is going to have another price reduction (2080 Ti = $499), or a rebrand of the RTX 20 series, when the Navi 2 (RDNA2) dGPUs hit the store shelves. We already know hybrid RDNA1 is more powerful than Turing, and a little ole 251 mm^2 die disrupted all of nVidia's Turing SKUs, so what do you think RDNA2 will do (to nVidia) when Navi 23 (244 mm^2) is the same size as Navi 10... but is 40% more powerful? And costs nearly the same? ($499)


nVidia has to do nothing <---as you've suggested..?

You do seem to be completely ignorant of what is coming and what has been announced, and have instead placed your faith and your bet on DLSS 2.0? When that is not even the gaming standard and is a proprietary gimmick, while DirectML is the industry standard and what most developers will be using. Just like there are no longer any "RTX On" games being developed (6 total in 18 months), as developers are just making industry-standard DXR/Vulkan games, which don't use "RT cores"... yet you think more RT cores are what games need? (It was Jensen Huang who perpetuated the myth that you need "RT cores" to do ray tracing.)

Where is your headspace... still stuck in 2018 and the RTX event..?
 

Best comment so far: "nordic space programme".
 
Not a good time to upgrade now:

1. Intel is still on "Nehalem" 14nm+++++++ (not even joking) chips that draw too much power and put out too much heat.
2. Only AMD sells PCIe 4.0 mobos. PCIe 3.0 mobos are bad for future-proofing.
3. DDR5 is coming in 2022 or earlier.

Upgrade now, and you will be obsolete in 2022.

I too feel that NVIDIA will inflate prices for this coming gen. They basically have zero competition at 2070+ levels of performance, and AMD's competition in the lower product tiers is insufficient.
PCIe 3.0 won't be a bottleneck for a long time; it's the last thing to worry about. Same goes for DDR4.
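The raw link numbers back that up; here's a quick sketch using the published per-lane transfer rates and 128b/130b encoding:

```python
# One-way bandwidth of a PCIe x16 link: GT/s per lane * 16 lanes * encoding efficiency / 8 bits per byte.
def pcie_x16_bandwidth_gbs(transfer_rate_gts: float, encoding: float = 128 / 130) -> float:
    return transfer_rate_gts * 16 * encoding / 8

print(f"PCIe 3.0 x16: {pcie_x16_bandwidth_gbs(8.0):.2f} GB/s")   # ~15.75 GB/s
print(f"PCIe 4.0 x16: {pcie_x16_bandwidth_gbs(16.0):.2f} GB/s")  # ~31.51 GB/s
```

Today's cards rarely come close to saturating the ~15.75 GB/s of a 3.0 x16 slot, so 4.0 is nice to have rather than a requirement.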
 
My 1080Ti cost me $1000 Canadian and it is going strong 3 years after I purchased it, and probably represents the best gaming card I have bought since the original Riva TNT2 I got about 20 years ago. The 2080/Ti series is something I avoided, due to price and performance being crap in comparison to what I am using.

I like how things have "slowed down"; for a while, I was buying a new GPU every 12-18 months. Granted, game development is matched to the consoles now, but I bet I can hold out for the 4000 series and go for that, and be good for 3-4 years after.

 