Nvidia Volta gaming GPUs are not in the 'foreseeable future'

In total, you can plan to shave ~$65 USD off your price with a Ryzen 7 1700. .....

Why stop at $65? If you just want adequate, GPU-limited gaming performance at 4K/60 fps or less, why not save some more money and go R3 1300X? The two will perform roughly the same; see:

http://cpu.userbenchmark.com/Compare/AMD-Ryzen-7-1700-vs-AMD-Ryzen-3-1300X/3917vs3930
http://www.guru3d.com/articles_pages/amd_ryzen_3_1200_and_1300x_review,24.html
http://www.hardwarecanucks.com/foru...ryzen-3-1300x-1200-performance-review-11.html

The R7 1700 will win at the lower 1080p resolution, but we all know AMD marketing has said that doesn't count (if you believe them) and that people should go 4K, right? Otherwise you'd better go back to the 7700K.

But if you go with the 1300X you can now save $225. That difference is the price of a single GTX 1060, or a big GPU upgrade like going from a GTX 1070 to a GTX 1080. All the Ryzen CPUs are overpriced, especially the R7s. Unless you are building a server/render farm, they don't fit the bill for gaming and they don't fit the bill for budget builds.
 
...The most optimistic AMD fan can hope that in 3-5 years more than half of games will be able to take advantage of those Ryzen cores. By then we'll be 2-3 generations down the road.

Exactly! Well said!

I am still waiting. Nope, I just gave up. I gave my FX-8320 to my mother, because she doesn't game and just wants a PC that works better than the old Core 2 Duo that is falling apart.

What is that famous quote: "Fool me once, shame on you; fool me twice, shame on me." I am not going to get fooled by AMD's hype and marketing. That FX-8320 is bottlenecking my GTX 970, but my even older Sandy Bridge i5-2500K is NOT. Heck, for gaming, the old i5-2500K is more than a match for any of the Ryzens.

And now Vega 56/64 will try to price itself higher than the GTX 1070 or GTX 1080 Ti. Really, AMD? Not all of us want to be part of the "poorly educated."

.... I think that nVidia may just find themselves against a wall with nowhere to go similar to the way Intel is now.

What imaginary wall is that again? About as real as the one Mexico was going to pay for? Right. AMD is doing so great that in less than 6 months the R7 1800X dropped $150, going from its $500 release price to $350. See:
www.microcenter.com/product/476003/Ryzen_7_1800X_36_GHz_8_Core_AM4_Boxed_Processor

Goes to show how ridiculous AMD's overpricing was to start with. And the R7 1800X is no better for gaming than the R3 1300X. Better hope Epyc doesn't end up as an epic fail while trying to get datacenters to switch over. The existing sunk costs of already-deployed platforms will not be easily made up, even by the $1,000 price difference.

See:
https://www.cpubenchmark.net/high_end_cpus.html
And it is not like Intel is slower. As much as Intel hates this, all Intel needs to do is lower prices, and Intel will have the flexibility to let their account teams probe for the best price people will pay. Transition costs are NOT easily ignored; retraining people and software platform changes do NOT come free.

BTW I want to applaud Intel for bailing on the White House business council, although it would have been even better to never have been on it in the first place, the way AMD wasn't.
 
That's why I'm looking at the used market these days!
Games are coded for consoles anyway, and even the latest console GPUs are totally anemic, weak-kneed, and asthmatic compared to the current mid/high-end PC GPU architectures, so yeah, you don't need the latest and greatest... unless you have a fetish for playing at 4K at 143+ fps on a 144Hz panel, in which case you'll usually discover that your arrhythmia-inducing outlay on all that fancy hardware is let down by some crappy ported game which will never see a proper patch, even when promised by the devs/publisher.
 
....

It's unfortunate, but the industry probably needs a breather. At the very least this move could offer AMD a chance to catch up next year...

What breather does this industry need? I'd like to see AMD being competitive, and they can do that today. AMD just needs to get back to being the unrivaled value. They do NOT need to wait until next year; they can adjust their prices today and be competitive today. There is a big enough market where people will take 10-20% less performance for a 40% or more price advantage.
 
Now that nVidia has seen the Vega results, is it any surprise that nVidia has no compelling reason to push out Volta now? nVidia will want to make as much return on Pascal as they can. This reminds me of when nVidia released the Tesla/8800 series back in 2007. Yep, there will be some real excitement when they rename the thing, just like they did with the 8800 GT to 9800 GT and then the 8800 GTX to 9800 GTX+ to GTS 250. That went on for at least 3 years. Yippee-ki-yay!! Look forward to that. About the only thing you can be optimistic about is that hopefully, over time, the prices will drop, if and when the crypto mining craze is done with.

NO surprise here!
Reminds me of an old Indian friend that's been everywhere and done everything.

Binder Dundat !

8800 GTS 320/640 MB to 8800 GTS 512 to 9800 GX2
8800 GT to 9800 GT
8800 GTX to 9800 GTX+ to GTS 250
8600 GT/GTS to 9600 GT to GT 240
I think the 9800 GX2 was a pair of modded 9800 GTs on the same PCB.

I had most of them, and still have a pair of the 9800 GX2s and a 9800 GT, still like new,
then 3 GTX 280s, still running; they need to be cleaned like my GTX 480s just were.

Er, um, NO, prices did not drop; the cards just dropped a rung on the ladder and were replaced by a new top gun, and the milking went on and on and...

Did nVidia ever milk that G92 chip, huh?

WTF did you do that for? lol. Thanks for the memories; not sure I wanna go through that again.
here we go!
GTX 1080 to GTX 1080+ to GTX 1090 or to 1080 GX2
GTX 1070 to GTX 1075
and on and on..

AMD is guilty of that shite as well. I have a bunch of ATI GPUs too, some rebrands.
 
Why stop at $65? If you just want adequate, GPU-limited gaming performance at 4K/60 fps or less, why not save some more money and go R3 1300X? The two will perform roughly the same; see:

http://cpu.userbenchmark.com/Compare/AMD-Ryzen-7-1700-vs-AMD-Ryzen-3-1300X/3917vs3930
http://www.guru3d.com/articles_pages/amd_ryzen_3_1200_and_1300x_review,24.html
http://www.hardwarecanucks.com/foru...ryzen-3-1300x-1200-performance-review-11.html

The R7 1700 will win at the lower 1080p resolution, but we all know AMD marketing has said that doesn't count (if you believe them) and that people should go 4K, right? Otherwise you'd better go back to the 7700K.

But if you go with the 1300X you can now save $225. That difference is the price of a single GTX 1060, or a big GPU upgrade like going from a GTX 1070 to a GTX 1080. All the Ryzen CPUs are overpriced, especially the R7s. Unless you are building a server/render farm, they don't fit the bill for gaming and they don't fit the bill for budget builds.

Like some other place the sun doesn't shine, I can't see it. A 1060, OK. But putting a GTX 1080 with a Ryzen 1300X? Talk about bottlenecking.
 
A 1070 with GDDR5X would be neat. Same with the 1050 Ti and 1060.
No clue if that's actually possible.
 
Not sure about launch prices, but from what I can tell this is the current price comparison:

https://pcpartpicker.com/products/cpu/#s=59,13&f=75,76&sort=price&page=1 : the Ryzen 7 1700 is running ~$17 USD less than the i7-7700K, but you also have to figure another $25 for a decent cooler for the i7 (the i7-7700K doesn't come with its own cooler, while the Ryzen comes with the Wraith Spire, which allows for at least some overclocking), putting the price edge at ~$40 USD for the Ryzen. https://pcpartpicker.com/products/m...ee the 7700K get you more value than the 1700.
 
I suppose that it's fortunate for AMD that hardware has outpaced software to the degree that it has. There's nothing that I can't throw at my R9 Fury, so why would I get anything else? Make no mistake though, nVidia WAS teasing a Volta gaming card to try to spoil Vega's release (Vega spoiled its own release). Now that nVidia knows that ATi hasn't developed anything for AMD that they would consider a threat, they suddenly don't have a gaming version of Volta? It looks like nVidia is going to pull an Intel and try to tick-tock us to death. I think that this would be a mistake on their part, because once AMD has reaped the profits of EPYC, ATi is going to have a pantload of money to play with for R&D. I think that nVidia may just find themselves against a wall with nowhere to go, similar to the way Intel is now.

You read my mind.
 
I...There's nothing that I can't throw at my R9 Fury so why would I get anything else? ....
You read my mind.

Yup, AMD put out Vega 56/64 at a price so high that they can't even convince AMD fans to upgrade. LOL. You do not need mind-reading skills to know something this obvious.

If even you guys won't donate to AMD, then there will be no wall to pin Intel or nVidia against, because Mexico surely is NOT paying for no wall. LOL.
 
A 1070 with GDDR5X would be neat. Same with the 1050 Ti and 1060.
No clue if that's actually possible.

Yes, it would certainly be possible; they put GDDR5X on the 1080 (the 1070 stuck with plain GDDR5, if I'm not mistaken).
However, it would not be the least bit profitable. I wouldn't think the cards are still in production anyway, unless this recent announcement put the fab back in action for another run.
 
Thanks... not being involved with the whole cybercoin thingy, I wasn't sure if it was GPUs, CPUs, Ouija boards
or what ;)
Bitcoin is, or was, mined for a long time with GPUs. I know a guy who had 7 x 7950s on an MSI Big Bang, mining Bitcoin 8 or more years ago in its infancy. Wish I had joined him; he doesn't need to work now. Now ASIC chips are required for Bitcoin, and as Ethereum gets harder and harder to mine the same thing will eventually happen: it will no longer be profitable because of the loss in efficiency.
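For anyone wondering what the cards were actually doing: here's a toy sketch of the proof-of-work loop that mining boils down to. This is a deliberate simplification for illustration (real Bitcoin mining double-SHA-256es a specific 80-byte block header, and the real difficulty target is astronomically lower than anything a CPU script could hit); the function and difficulty values here are made up for the example.

```python
import hashlib

# Toy proof-of-work: brute-force a nonce until the hash falls below a target.
# Raising the difficulty (more required leading zero bits) is exactly why
# general-purpose GPUs were eventually crowded out by ASICs built to do
# one hash function very fast and nothing else.
def mine(block_data: bytes, difficulty_bits: int) -> int:
    target = 2 ** (256 - difficulty_bits)  # smaller target = harder puzzle
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # this nonce "solves" the block
        nonce += 1

# ~4096 hashes on average at 12 difficulty bits; real networks need trillions
nonce = mine(b"example block", 12)
```

The whole game is raw hashes per second per watt, which is why each hardware generation (CPU, then GPU, then ASIC) made the previous one unprofitable.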
 
Volta for gaming is coming, it just isn't coming now. And anyone who thinks it ever was should be slapped upside the back of the head. Come on, Vega is no threat to nVidia, and the 1080 Ti is the king among cards these days. That won't be changing until Volta happens, and it will, a year from now. We will hear a lot of Volta talk come next spring.
 
From what I understand, a LOT of the high-end graphics cards are being snapped up by bitcoin server farms.
If that is true, then the actual price should come down if that "market" collapses?
Bitcoin is not mined using GPUs. ETH is.

Indeed. Bought an RX 480 a few months ago on a good deal for $150. Seeing all this Ethereum fever, I put it on eBay yesterday; an hour later, it sold for $270. I just made $120 profit and can wait on getting a new GPU until prices come down, since I have a mountain of work for the next few weeks and little spare time to game anyway... win-win!
 
Volta for gaming is coming, it just isn't coming now. And anyone who thinks it ever was should be slapped upside the back of the head. Come on, Vega is no threat to nVidia, and the 1080 Ti is the king among cards these days. That won't be changing until Volta happens, and it will, a year from now. We will hear a lot of Volta talk come next spring.

You may not see Volta release until AMD's next generation. What? With a new fabrication process and new, faster GDDR5X memory chips available now, there is absolutely nothing to stop Huang from calling the fab, ordering a shrink of Pascal, and releasing a whole boatload of GPUs based on the new process, leaving the ready-to-ship Voltas sitting in a warehouse until he needs them in the channel. He could even bring a little new tech to Pascal, like stacked memory or HBM2... WTF, AMD?

You read it here first! 1090/1085/1075, anybody?
 
This has been the AMD mantra since FX: as soon as more games utilize more cores...

The most optimistic AMD fan can hope that in 3-5 years more than half of games will be able to take advantage of those Ryzen cores. By then we'll be 2-3 generations down the road.

That's nonsense. First of all, 99% of games from 2015 onwards utilize 4 cores at the very minimum. Most of them can actually use more. I don't know whether that's gonna make the 1700 faster than the 7700K, but it certainly makes the R5 1600 **** on the i5s.

Also, what was said back then about the FX was proven to be true. It's just that people like you, who compare apples to oranges, don't realize it. Of course, if you compare the FX, a 2012 processor, with a 2017 processor, it's gonna lose even though it has more cores. But for an apples-to-apples comparison, wanna try comparing an i3 3120 to an FX 6300/6350 on today's games?
 
That's nonsense. First of all, 99% of games from 2015 onwards utilize 4 cores at the very minimum. Most of them can actually use more. I don't know whether that's gonna make the 1700 faster than the 7700K, but it certainly makes the R5 1600 **** on the i5s.

You do NOT know because you choose to hide from the facts. Facts show the 7700K is faster than any of the R7s for games, period. And quit with the hyperbole. Here are the benches for the i5 vs R5; see:
http://www.gamersnexus.net/hwreviews/2875-amd-r5-1600x-1500x-review-fading-i5-argument/page-4

And the biggest competitor for the i5 is not the overpriced R5, but rather something like the R3, when you can get the 1300X for $100: see the $30 off the 1300X at Microcenter with a mobo purchase (btw Newegg is not selling the same mobo for $30 less, so it is a real discount):
http://www.microcenter.com/search/search_results.aspx?Ntt=1300x
Anything less than the top-line game performance of the 7700K has no real justification; it's just muddling in the middle.

Also, what was said back then about the FX was proven to be true. It's just that people like you, who compare apples to oranges, don't realize it. Of course, if you compare the FX, a 2012 processor, with a 2017 processor, it's gonna lose even though it has more cores. But for an apples-to-apples comparison, wanna try comparing an i3 3120 to an FX 6300/6350 on today's games?

Strawman fits your name all right. No one is comparing the FX to 2017 and expecting 8 cores of that pile of _______ to win. This is a bogus strawman argument and you know it. And it is also a strawman to compare an old i3 to the 6-core FX with today's games. Anyone that got a cheap i3 for less money than the FX would have had money saved to replace their CPU, or has already long since upgraded. There is a reason for cheap dual-cores: they are cheap, disposable, and replaceable. It is NOT the apples-to-apples comparison that you imagined it to be.

While it is true that over time you'd get more out of more cores because of software optimization, this is seeing the tree and missing the forest. Whatever gains you get from software optimization that takes longer than 3 months after release are really not gains relative to the rest of the industry. The bulk of this kind of software optimization only comes 2 years or more later, as a by-product of next-gen tech. By then the gains are nowhere close to enough to match the actual hardware upgrades in the technology itself. Take your example of the FX-6350, released in 2013: by 2015, when they'd had plenty of time to improve software, any gains there still left it far behind the likes of the i3-6300, see:
http://cpu.userbenchmark.com/Compare/Intel-Core-i3-6300-vs-AMD-FX-6350-Six-Core/3536vsm713
Or any Skylake i3/i5, etc. It is barely on par with the Skylake Pentiums; see:
http://cpu.userbenchmark.com/Compare/Intel-Pentium-G4520-vs-AMD-FX-6350-Six-Core/3537vsm713

So this claim to wait for software optimization is just cheap marketing words to lull the gullible into paying more for stuff now. How about AMD takes a 20% deposit now and takes the remaining 80% later, when they actually deliver the gains? That is a non-starter for AMD, so well, damn, why should we be taken for a ride by them over more cores/threads and future optimizations? I'll buy what is good bang-for-the-buck now, and buy later what is the best bang-for-the-buck later.

It is too bad AMD has chosen to price themselves way out of what could even qualify as good bang-for-the-buck, with the only exception being the R3 with the Microcenter discounts.

R7 should be at max $250
R5 should be at max $150
R3 should be at max $100

BTW, Ryzen will be bottlenecking your GPU. Why is it that AMD is testing their Vegas with the 7700K? It is already well established that Ryzens can't keep up with the GTX 1080 Ti at 1440p. See:
http://www.legitreviews.com/cpu-bot...-on-amd-ryzen-versus-intel-kaby-lake_192585/4

That means all Ryzens must be priced with replacement in mind, so that you can still make use of the AM4 platform, AMD's only saving grace.

Vega 56 needs to be $300 max
Vega 64 needs to be $400 max

Anything more and AMD is way overpriced and underperforming. No one should be paying more now for potential future gains when that money can be saved to buy the next-gen product for real gains later. Wishful thinking does NOT save you real money.
 
That's nonsense. First of all, 99% of games from 2015 onwards utilize 4 cores at the very minimum. Most of them can actually use more. I don't know whether that's gonna make the 1700 faster than the 7700K, but it certainly makes the R5 1600 **** on the i5s.

Also, what was said back then about the FX was proven to be true. It's just that people like you, who compare apples to oranges, don't realize it. Of course, if you compare the FX, a 2012 processor, with a 2017 processor, it's gonna lose even though it has more cores. But for an apples-to-apples comparison, wanna try comparing an i3 3120 to an FX 6300/6350 on today's games?
I didn't compare FX chips to anything in 2017. I spoke to how many people keep repeating that "moar cores" will mean better performance, because they buy processors with more cores.
 
I didn't compare FX chips to anything in 2017. I spoke to how many people keep repeating that "moar cores" will mean better performance, because they buy processors with more cores.

This reminds me of the joke at work: if product managers want a baby in one month, they'll get 9 women to do it. Yep, moar cores. LOL.
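The joke has a kernel of real math in it: Amdahl's law, which says that the serial part of a workload caps the speedup no matter how many cores you throw at it. A minimal sketch (the 0.6 parallel fraction below is an arbitrary illustration, not a measured figure for any real game):

```python
# Amdahl's law: with a fraction p of the workload parallelizable, n cores
# give speedup 1 / ((1 - p) + p / n), capped at 1 / (1 - p) regardless of
# how many cores you add. So "moar cores" only helps as far as p allows.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

for cores in (1, 2, 4, 8, 16):
    print(f"{cores:2d} cores -> {amdahl_speedup(0.6, cores):.2f}x")
# even infinite cores would top out at 1 / (1 - 0.6) = 2.5x
```

Which is exactly the baby-in-one-month problem: the nine months of serial work don't parallelize.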
 
You do NOT know because you choose to hide from the facts. Facts show the 7700K is faster than any of the R7s for games, period. And quit with the hyperbole. Here are the benches for the i5 vs R5; see:
http://www.gamersnexus.net/hwreviews/2875-amd-r5-1600x-1500x-review-fading-i5-argument/page-4

I agree, the i7 is faster than the R7 in gaming. I never suggested otherwise. What are you talking about?

Also, yeah, that's a release-date benchmark, you know. Actually, the fact that you are commenting on TechSpot while linking a benchmark from a different site says all that needs to be said. You are going to link whatever benchmark is more favorable to your position, regardless of whether or not it's actually relevant to the discussion. And I'm sorry to say it, but a launch-date benchmark isn't really relevant.

But with that said, even in that benchmark, with low RAM frequency (2933) and not many optimization patches out for Ryzen, an R5 1600 that costs less, comes with a decent cooler, can OC to 3.8GHz with said cooler, and sits in a cheaper mobo is neck and neck with a more expensive i5 that has no cooler and requires a more expensive mobo. There isn't even a comparison here; the R5 1600 murders the i5 even in just gaming, not to mention anything else. Now, if you actually looked at more recent benchmarks, like the ones from Digital Foundry or TechSpot, the i5 is a ****ing joke. In the DF Crysis 3 run it was hitting 52 fps while a stock 1600 was at 110+.

And the biggest competitor for the i5 is not the overpriced R5, but rather something like the R3, when you can get the 1300X for $100: see the $30 off the 1300X at Microcenter with a mobo purchase (btw Newegg is not selling the same mobo for $30 less, so it is a real discount):
http://www.microcenter.com/search/search_results.aspx?Ntt=1300x
Anything less than the top-line game performance of the 7700K has no real justification; it's just muddling in the middle.

That's complete nonsense. Do you have 4-way SLI 1080 Tis? No? 4000MHz RAM? Then you are muddling in the middle, aren't you? And I'm sorry, but the 1300X can't max out a 1070 or a 1080, so that's where the 1600 comes in. I'd much rather have a 1600 + a 1080/1080 Ti than a 7700K and a 1070/1080. The 7700K is the best in gaming, but it doesn't really make much sense to buy it. You are better off spending the extra 200€ you'll save by going for the 1600 on a better GPU or a better monitor than on the 7700K.


Strawman fits your name all right. No one is comparing the FX to 2017 and expecting 8 cores of that pile of _______ to win. This is a bogus strawman argument and you know it. And it is also a strawman to compare an old i3 to the 6-core FX with today's games. Anyone that got a cheap i3 for less money than the FX would have had money saved to replace their CPU, or has already long since upgraded. There is a reason for cheap dual-cores: they are cheap, disposable, and replaceable. It is NOT the apples-to-apples comparison that you imagined it to be.

And ignorance should be your name, buddy. There is a reason I compared the 6-core FX to the 3rd-gen i3: it's because they had the same price, you brainiac. A 5-minute Google search would tell you that. The 6300/6350 launched at 120€ and 140€ respectively, and the i3s launched from 109€ to 139€ depending on the frequency. So the argument that more cores would be better for the future was proven to be absolutely true. It's just that ignorant people like you don't realize it, because they are comparing modern-day i3s to 6-year-old CPUs.

Your next paragraph proves EXACTLY what I just said. You are comparing them to modern CPUs!! That's not how the comparison goes, buddy. In order to test the claim "a multicore CPU will scale better in the future compared to a similarly priced CPU with fewer cores" you need to compare the CPUs in question. Those would be, for example, the FX 6300 and a 3rd-gen i3.

The same holds true today. In order to prove that the R5s with their multiple threads won't be more future-proof than the i5s, you'll need to compare the R5s with the CURRENT i5s 5 years from now. Instead, what you are doing is comparing current R5s to the i5s that will be released in 2021 and then pretending the original claim was false. I'm sorry, but you are wrong.

So this claim to wait for software optimization is just cheap marketing words to lull the gullible into paying more for stuff now. How about AMD takes a 20% deposit now and takes the remaining 80% later, when they actually deliver the gains? That is a non-starter for AMD, so well, damn, why should we be taken for a ride by them over more cores/threads and future optimizations? I'll buy what is good bang-for-the-buck now, and buy later what is the best bang-for-the-buck later.

It's not cheap marketing when it's actually 100% true, as shown by the FX 6350 to i3 3120 comparison.

It is too bad AMD has chosen to price themselves way out of what could even qualify as good bang-for-the-buck, with the only exception being the R3 with the Microcenter discounts.

R7 should be at max $250
R5 should be at max $150
R3 should be at max $100

And that is the dumbest **** I've heard. Ryzen is overpriced? LOL
Even if you only talk about gaming, your claim is still absolutely false. Now, if you actually bring in all the other applications besides gaming, your claim can only be taken as trolling, comedy, or parody. That was ****ing stupid. The 1600 with its platform will cost you about ~100€ less than the i5 7600K, and they are tied in gaming (except in some cases where the i5 gets absolutely demolished, like Crysis 3 and BF1 64-player multiplayer). Actually, the funny thing is that the i5 is faster in gaming than Ryzen only when it doesn't really matter. Sure, it gets 140 fps in BF1 single player instead of Ryzen's 130, but does it matter? Not really. But when it comes to 64-player multiplayer or Crysis 3, it dives into the 50s, while Ryzen stays calm and collected at 100+.

BTW, Ryzen will be bottlenecking your GPU. Why is it that AMD is testing their Vegas with the 7700K? It is already well established that Ryzens can't keep up with the GTX 1080 Ti at 1440p. See:
http://www.legitreviews.com/cpu-bot...-on-amd-ryzen-versus-intel-kaby-lake_192585/4

Once again, you fail to amuse me. Completely stupid benchmarks. Only 2 of those games are graphically demanding (Deus Ex and Ghosts), and they are pretty close. Also, you failed to mention that the i5 will bottleneck it too, will it not? You are just changing your points halfway through. If an R5 1600 bottlenecks a 1080 Ti, then so does an i5, period.


That means all Ryzens must be priced with replacement in mind, so that you can still make use of the AM4 platform, AMD's only saving grace.

And the same applies to the i5s, only there is no use for the Z270 platform anyway :).
Vega 56 needs to be $300 max
Vega 64 needs to be $400 max

And once again, you are full of ****. The Vega 56 should cost as much as a 1070, seeing how it has similar performance. I would buy the 1070 in that case, since it consumes less power, but that's about it. $300 is how much the 1060 costs, and the Vega 56 walks all over it.

Anything more and AMD is way overpriced and underperforming. No one should be paying more now for potential future gains when that money can be saved to buy the next-gen product for real gains later. Wishful thinking does NOT save you real money.
Again, full of nonsense. You are not paying more now for the future. You are paying LESS than Intel for the same performance NOW, and even better performance in the future.
 
I didn't compare FX chips to anything in 2017. I spoke to how many people keep repeating that "moar cores" will mean better performance, because they buy processors with more cores.

And I told you that it's true: moar cores will mean better performance, as clearly shown by an FX 6300/6350 vs 3rd-gen i3 comparison. The FX 6300 demolishes the 3rd-gen i3 right now. Wanna guess why? Cause "moar cores" and games making use of them.
 
And I told you that it's true: moar cores will mean better performance, as clearly shown by an FX 6300/6350 vs 3rd-gen i3 comparison. The FX 6300 demolishes the 3rd-gen i3 right now. Wanna guess why? Cause "moar cores" and games making use of them.
Saying something doesn't make it true. BTW, the i3-3120 is a mobile processor, so since you're not producing the data to support your assertion, either you're intentionally misrepresenting your position or you made an error.
Of course, if you compare the FX, a 2012 processor, with a 2017 processor, it's gonna lose even though it has more cores. But for an apples-to-apples comparison, wanna try comparing an i3 3120 to an FX 6300/6350 on today's games?

Until you can produce some data, I'll share Tom's hierarchy chart (it's about 1 month behind). Be sure to note the FX-6300 being a tier below the i3-3220.
 