Nvidia Volta gaming GPUs are not in the 'foreseeable future'

I misspoke, I just didn't remember the particular i3 that cost the exact same price as the fx 6300 / 6350. Nevertheless, pick whichever 3rd gen i3 you want; the FX 6300 / 6350 outperforms it in modern games.

You sure like to obsess over strangely little things. You sure like looking at the trees and missing the forest. It doesn't matter that after 4 to 5 years the FX finally outperforms the i3 in modern games. The lower price of the i3 allowed people who bought one to move on from that i3 two years ago. And we all know the i3 is nowhere close to the best value from Intel, and never has been. Being able to beat the kid with learning disabilities in your class at tic-tac-toe 4 years later does not win you any prizes.

BTW the 1600 is still $10 more than the 7600k ($190 vs $180, both get a $30 discount on a mobo), see:
http://www.microcenter.com/product/..._AM4_Boxed_Processor_with_Wraith_Spire_Cooler
http://www.microcenter.com/product/472532/Core_i5-7600K_Kaby_Lake_380_GHz_LGA_1151_Boxed_Processor

AMD used to provide much better value and priced much lower. Recall the Athlon XP, or the K6-2. You saved $100 compared to the equivalent Intel while being on par in some applications, trailing others by 10%, and sometimes you could even get a win or two against Intel. I bought those AMD chips back in the day. Ryzen is way overpriced compared to that history AMD has had.
 
Again, you are being dishonest. THIS was the link YOU posted, and they are neck and neck:
https://www.techspot.com/review/1433-intel-core-i9-core-i7-skylake-x/page3.html

You are calling me dishonest when you are the one who put up this chart. See your own posts
#69:
https://www.techspot.com/community/...foreseeable-future.238979/page-3#post-1624910
And
#116:
https://www.techspot.com/community/topics/amd-radeon-rx-vega-56-review.238937/page-5
And you linked:
https://techspot-static-xjzaqowzxaoif5.stackpathdns.com/articles-info/1450/bench/Average.png

And now you try to misrepresent things by claiming I linked the whole article with:
https://www.techspot.com/review/1433-intel-core-i9-core-i7-skylake-x/page3.html

You really work hard at lying. The dishonest one is you. The evidence is all here for everyone to see.

The R5 1600 outperforms the i5 in gaming when it actually matters and absolutely destroys it in everything else.

Says the one who has blatantly lied, as can be seen in these posts, and all this from just cherry-picking one scene in Crysis. This is what objective people would call over-exaggeration.
 
You sure like looking at the trees and missing the forest. It doesn't matter that after 4 to 5 years the FX finally outperforms the i3 in modern games.

First of all, it's 3 years, not 4 or 5. And yes, it does matter, since that was the whole argument to begin with!! You sure are completely off point here.

The lower price of the i3 allowed people who bought one to move on from that i3 two years ago.

Too bad they cost the exact same price as the 6300/6350. You do realize they were all around the 120-140€ mark back in 2013, right?

And we all know the i3 is nowhere close to the best value from Intel, and never has been. Being able to beat the kid with learning disabilities in your class at tic-tac-toe 4 years later does not win you any prizes.

It doesn't matter. The argument was that the "moar cores" claim that the AMD fans used back in 2012-2013 didn't actually hold true, when in fact it has.


I don't care about your USA prices (that's the argument you used, wasn't it?). Here in the EU the 7600k is 10-15€ more expensive. Also, the 1600 actually ends up being cheaper even in the USA. It has a stock cooler and it overclocks on a B350 motherboard. Intel really needs to lower their prices man

AMD used to provide much better value and priced much lower.

And it still does. The R5 1600 obliterates the higher-priced 7700k in all workloads. In gaming it offers a better experience than an i5 7600k. Bottom line, there is no game that you can't enjoy with an R5 1600. There are games that you can't enjoy with the 7600k. Therefore, the R5 1600 offers a better gaming experience and much more horsepower. Sorry, the i5 isn't even close.

Ryzen is way overpriced compared to that history AMD has had.
Maybe it is overpriced compared to its history. But it certainly isn't overpriced compared to the **** i5's.
 
You are calling me dishonest when you are the one who put up this chart. See your own posts

In #66 you asked me to check the rest of the data. I did, and I replied to you that they are neck and neck. This is the bench you referred to:
https://www.techspot.com/review/1433-intel-core-i9-core-i7-skylake-x/page3.html

So what the heck are you talking about?



Says the one who has blatantly lied, as can be seen in these posts, and all this from just cherry-picking one scene in Crysis. This is what objective people would call over-exaggeration.

I didn't lie, I never lie. And it wasn't one scene in Crysis, it's the whole freaking game. Check the DF benchmark. The same happens in other games as well, like AC / TR and BF1. Stop being dishonest.

You basically take benchmarks and only post links to the games where the i5 is in front, like you did with AnandTech. I'll ask you once again: why did you link 3 games from the AnandTech review and none of them included the Civ 6 bench??? Please, enlighten me.

Bottom line, the R5 offers a playable experience in all games. The 7600k doesn't. You can't play Crysis 3, you can't play Civ 6, you can't play Aots, you can't play BF1 64mp, since it drops and stutters like crazy. Heck, even in games where it performs better, like WD2, it stutters like crazy. If you buy an i5 over an R5 you are kinda crazy.
 
Bottom line, the R5 offers a playable experience in all games. The 7600k doesn't. You can't play Crysis 3, you can't play Civ 6, you can't play Aots, you can't play BF1 64mp, since it drops and stutters like crazy. Heck, even in games where it performs better, like WD2, it stutters like crazy. If you buy an i5 over an R5 you are kinda crazy.

Try harder at lying. There are millions of people who have built i5 machines, going back to even before Sandy Bridge. You are going to come here and lie and say those with a 6600k, 4590K, 3570K, 2500K, etc., cannot and did not play Crysis 3, Civ 6, Aots, BF1 64mp, etc. They did not stutter like crazy; this is a complete and blatant lie.

I do NOT dispute the fact that when streaming your gameplay, having extra cores helps to some degree, but recording video is the same as streaming. And we all know that when you divert CPU and disk resources you create more opportunities for those frame rate drops. In other words, your YouTube video test methods are distorting the results; the recording is deliberately creating the stutters you cry about. This is cherry-picking, deliberately stacking the deck against the i5. If you want to record video, you use a video camera or a separate hardware screen capture device. Your measurement methods should not distort your device under test. This is basic scientific process.
 
In post #75 in your own words you wrote:
it would be dishonest of me to compare the R5 1600 to the 7800x

So let's look at the evidence:

In post #69 you wrote:
That's 30 games, averages and minimums. R5 ties the 7800x, which btw, combined with the cooler used and the mobo needed to reach 4.7GHz, is 2 to 3 times the cost

And again in post #69 you wrote:
An R5 1600 performs on par with 7800x, which costs 400+ euro.

And on multiple occasions you even attached the chart below; see your own posts:
#69:
https://www.techspot.com/community/...foreseeable-future.238979/page-3#post-1624910
And
#116:
https://www.techspot.com/community/topics/amd-radeon-rx-vega-56-review.238937/page-5
And you linked:
https://techspot-static-xjzaqowzxaoif5.stackpathdns.com/articles-info/1450/bench/Average.png

What conclusion should we draw? And yet you go around insulting people as "dishonest" when you are the one who has been lying straight up for everyone to see.

BTW, we all know the 7800x is overpriced; being less overpriced is still overpriced.
 
.... Intel really needs to lower their prices man

Did I ever say otherwise? Intel needs to lower their prices, but AMD is making it so easy for Intel to get away with bloody murder. Why? Because AMD is overpriced.


Maybe it is overpriced compared to its history. ....

Glad you agree here, and it is NOT maybe, this is a fact. The R5s are overpriced, just not as badly as the R7s. Vega is overpriced. The R3s just happen to be the least overpriced of them all, but still higher than they should be. When AMD comes late to the market, being the second option, unable to score clear wins across the board in performance, AMD is in the unfortunate position of having to price significantly lower to be considered NOT overpriced. Priced on par with Intel is overpriced, because Intel is the definition of overpriced; they need to be 30% to 50% less. They have to give rational buyers a reason to pay for the transition costs, and to mitigate against things like the GPU being bottlenecked because the CPU can't keep up, like they already can't with the GTX 1080 Ti, and having to wait possibly 2 to 4 years for software optimizations to catch up because of moar cores. These costs are on AMD; they need to be the one to absorb them for their customers.
 
The argument was that the "moar cores" claim that the AMD fans used back in 2012-2013 didn't actually hold true, when in fact it has.

It wasn't true in 2012, nor was it in 2013. It only comes true too late for it to matter, when game developers make new games and adjust to the new tech landscape. If you can NOT get the benefit for the time frame that mattered, for the existing games that people wanted to play at that time, it is too little too late.

The AMD marketing malarkey about how it is forward-looking does NOT help with current games. We don't buy this stuff for future benefit by paying today's dollars now. It is all fine and good to be forward-looking, but as consumers, customers, and gamers we should not have to pay a premium for whatever that may mean. Especially since, in two years or less, there will very likely be even better bang-for-the-buck options that far exceed whatever "moar cores" now provide or will provide, and it is definitely unwise to pay more for "moar cores" because AMD's marketing malarkey said so. AMD has no choice but to give those "moar cores" away to win over customers, just like they gave away the 64-bit stuff to win over customers in the socket 939 era, a time when they actually claimed the performance crown, and even then they did not price higher than Intel.
 
Try harder at lying.
Sorry, not lying. You may not like the facts that I'm presenting but sadly, you can't change them
There are millions of people who have built i5 machines, going back to even before Sandy Bridge.
Yes there are.

You are going to come here and lie and say those with a 6600k, 4590K, 3570K, 2500K, etc., cannot and did not play Crysis 3, Civ 6, Aots, BF1 64mp, etc. They did not stutter like crazy; this is a complete and blatant lie.
Nope, this is a complete and blatant strawman. I said they don't offer a playable experience, not that they can't double-click the icon and run the game. They can run it, it's just not enjoyable. I mean, unless of course you enjoy drops to circa 60 fps. You can see the benchmarks yourself; in all those games I mentioned, the newest i5 is dropping under the 60s. So what exactly are you arguing about? It's a fact. But since you mentioned the 4590k, check out the 4690k, it bottlenecks a 1060 mind you!!



Sure, the 7600k OCed @ 4.8 can be ~okay in BF1 64mp, but not always. There are cases where it WILL drop, for example when 10-15 people are in the same place dropping bombs etcetera. It's not something that will happen every game, so a random benchmark run won't catch it, but if you play BF1 mp yourself you'll notice it.

And that's why I've said DF's benchmarks are among the best. They test the heaviest scenes in the games they are benchmarking, not some random **** part of the game where nothing happens, like most other benchmarking sites do. And that's exactly where the i5 was exposed compared to the R5, contrary to what the averages actually tell you.

I told you already but you either don't get it or are ignoring it for fanboyism's sake. The i5 is great when it doesn't matter. But when it comes down to your CPU doing the heavy lifting, it gets demolished. That was always the case; that's why there is an i7 just above it.

Sure, it won't affect your average gamer, who may not even notice it, but it's right there, so saying the i5 > R5 for gaming is just plain wrong.

I do NOT dispute the fact that when streaming your gameplay, having extra cores helps to some degree, but recording video is the same as streaming. And we all know that when you divert CPU and disk resources you create more opportunities for those frame rate drops.
No, recording is not the same as streaming. Streaming requires your CPU to process the video in order to lower the resolution / bitrate etcetera, depending on what service you are streaming to, which is not the case while recording. You can record at 1:1 resolution or even use your GPU for that job. And btw, if it's disk resources, then surely the Ryzens would suffer from that too, yet they don't.
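To make that distinction concrete, here's a rough sketch of the two pipelines written out as ffmpeg command lines (built in Python only so the steps can be commented; the gdigrab/NVENC capture setup, the bitrates, the file name, and the RTMP URL are illustrative assumptions, not a tested config):

# Sketch: local recording vs live streaming, expressed as ffmpeg commands.
# Assumes a Windows ffmpeg build with gdigrab screen capture and NVENC;
# file name, bitrates, and the RTMP URL are placeholders.

capture = ["ffmpeg", "-f", "gdigrab", "-framerate", "60", "-i", "desktop"]

# Recording: keep native resolution and hand H.264 encoding to the GPU
# encoder (NVENC), leaving the CPU cores free for the game.
record = capture + ["-c:v", "h264_nvenc", "-b:v", "50M", "gameplay.mp4"]

# Streaming: re-encode on the CPU (libx264), downscale and cap the bitrate
# to what the streaming service accepts, then push it over RTMP.
stream = capture + [
    "-c:v", "libx264", "-preset", "veryfast",
    "-vf", "scale=1280:720",
    "-b:v", "6M", "-maxrate", "6M", "-bufsize", "12M",
    "-f", "flv", "rtmp://example.com/live/STREAM_KEY",
]

print(" ".join(record))
print(" ".join(stream))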
 
They have to give rational buyers a reason to pay for the transition costs, and to mitigate against things like the GPU being bottlenecked because the CPU can't keep up, like they already can't with the GTX 1080 Ti, and having to wait possibly 2 to 4 years for software optimizations to catch up because of moar cores. These costs are on AMD; they need to be the one to absorb them for their customers.

Again with the nonsense. What the heck do you mean that the GPU is bottlenecked? Are you just trolling right now or do you actually believe that nonsense you are spouting? You do realize that by the criteria you have stated, EVERY CPU bottlenecks almost any GPU? You do realize that the 7700k also bottlenecks the 1080ti, right? AOTS is a clear example. Civ is another one. BF1 multiplayer is another one. Shall I go on? I can find COUNTLESS games where a 7700k bottlenecks a 1080ti even at 720p.

The term bottlenecking is used when a CPU fails to max out a GPU in GPU-bound scenarios. If it's a CPU-bound scenario then OF COURSE the GPU is going to be bottlenecked. Case in point: Dota 2, EVE Online, CS:GO. A 1080ti is absolutely bottlenecked by a 7700k in these games. Why don't you keep repeating that the 7700k bottlenecks a 1080ti, yet you do that for the Ryzen? I wonder why.

Even if your point were true, and it's not, are you pretending that the 2080ti won't get bottlenecked by a 7700k at 1080p? That's, once again, bullshit.

Also, I asked you and you didn't answer. Why did none of the links to the AnandTech benchmark include Civilization 6, where the i5 gets demolished? Please answer that.
 
It wasn't true in 2012, nor was it in 2013. It only comes true too late for it to matter, when game developers make new games and adjust to the new tech landscape. If you can NOT get the benefit for the time frame that mattered, for the existing games that people wanted to play at that time, it is too little too late.

Bullshit as usual. I didn't say whether it was worth it or not, I merely suggested that the argument was true. In fact, it was. Whether YOU personally deem it to be "too little too late" is completely irrelevant. I'm sure there are people out there still gaming on their FX 6350. Actually, I have a friend who does. He would be opting for an upgrade had he gone for the 3rd gen i3, or the Pentium G3258, which back in the day "crushed" the FX 8350. Now those are all gone, yet the FX 8350 still holds up.
 
In post #75 in your own words you wrote:

So let's look at the evidence:
In post #69 you wrote:
And again in post #69 you wrote:
And on multiple occasions you even attached the chart below; see your own posts:
#69:
https://www.techspot.com/community/...foreseeable-future.238979/page-3#post-1624910
And
#116:
https://www.techspot.com/community/topics/amd-radeon-rx-vega-56-review.238937/page-5
And you linked:
https://techspot-static-xjzaqowzxaoif5.stackpathdns.com/articles-info/1450/bench/Average.png

What conclusion should we draw? And yet you go around insulting people as "dishonest" when you are the one who has been lying straight up for everyone to see.
Are you truly that daft? You do realize I'm not using it as an argument to conclude that Intel is overpriced, right? WTF is wrong with you? Is it low IQ or are you just trolling me?
 
Nope, this is a complete and blatant strawman. I said they don't offer a playable experience, not that they can't double-click the icon and run the game. They can run it, it's just not enjoyable.

You said they can't play in post #79:
"You can't play Crysis 3, you can't play Civ 6, you can't play Aots, you can't play BF1 64mp." How is "can't play", which you repeated multiple times, the same as "not enjoyable"?

Furthermore, to say all those millions of i5 users have a "not enjoyable" experience is also an utter lie. You do NOT speak for them; they play those games just fine and they don't encounter the stutters.


No, recording is not the same as streaming. Streaming requires your CPU to process the video in order to lower the resolution / bitrate etcetera depending on what service you are streaming too, which is not the case while recording. You can record 1/1 resolution or even use your GPU for that job. And btw, if it's disk resources then surely the Ryzens will suffer from that too, yet they don't.

You would know, wouldn't you. But you really don't. Tell me, which video is stored in raw pixel format? Your disks/SSDs won't have enough space to hold that stuff. Simple math here:
At 1080p, that's 1920 x 1080 x 32 bits x 144 frames per second = 9,555,148,800 bits/sec, or roughly 10 Gbps. That's about 1.2 GB per second of disk space; round it down to 1 GB for easy math. Your 1 TB disk is 1000 GB, or about 1000 seconds of game time, roughly 16 minutes. At 4K, you'd get about 4 minutes before your 1 TB fills up at ~40 Gbps.
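For anyone who wants to double-check that arithmetic, here is a minimal Python sketch; the resolution, bit depth, frame rate, and the 1 TB drive are the same assumptions as in the paragraph above, not measured numbers (the paragraph rounds 1.2 GB/s down to 1 GB/s, which is why it lands on ~16 minutes):

# Back-of-the-envelope bandwidth for uncompressed game capture.
def raw_capture_bps(width, height, bits_per_pixel=32, fps=144):
    """Bits per second needed to store every frame uncompressed."""
    return width * height * bits_per_pixel * fps

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    bps = raw_capture_bps(w, h)
    gb_per_sec = bps / 8 / 1e9               # decimal GB written per second
    minutes_per_tb = 1000 / gb_per_sec / 60  # time to fill a 1 TB drive
    print(f"{name}: {bps / 1e9:.1f} Gbps, {gb_per_sec:.2f} GB/s, "
          f"~{minutes_per_tb:.1f} min to fill a 1 TB drive")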

You want to tell me this stuff is not going through the CPU and/or GPU codecs to compress it down to MP4s/AVIs, etc.? LOL. It is well known that rotational disks can't keep up with 10 Gbps writes. Even SSDs can't sustain 10 Gbps writes for very long. Video recording is the same as streaming; it's just that instead of a file handle to a storage location on disk, it is a network socket to some network storage location. We already know you are full of _________, but now we have proof positive that you do NOT know what the heck you are talking about.
 
Are you truly that daft? You do realize I'm not using it as an argument to conclude that Intel is overpriced, right? WTF is wrong with you? Is it low IQ or are you just trolling me?

Fact is, Intel is OVERPRICED. NO disputing it. AMD pricing on par with Intel makes them OVERPRICED too. We all know AMD did NOT use to do this, and they can return to their roots and provide unrivaled value.
 
You said they can't play in post #79:
"You can't play Crysis 3, you can't play Civ 6, you can't play Aots, you can't play BF1 64mp." How is "can't play", which you repeated multiple times, the same as "not enjoyable"?

If only you just stopped cherry-picking. Read the sentence just before the one you decided to quote. What does it say? Here, let me quote it for you:

"Bottom line, the R5 offers playable experience in all games"

That's exactly what I meant.

Furthermore, to say all those millions of i5 users have a "not enjoyable" experience is also an utter lie.
No, it's not. For me it's not an enjoyable experience. I even put out qualifiers. If someone enjoys drops to under 60, then sure, he is gonna enjoy it.

You do NOT speak for them; they play those games just fine and they don't encounter the stutters.
They do, it's a fact. All metrics show it: the i5 stutters in CPU-intensive games. Be it Crysis 3, Watch Dogs 2, or BF1 mp, it's there. Sorry you don't like it.


You would know, wouldn't you. But you really don't. Tell me, which video is stored in raw pixel format? Your disks/SSDs won't have enough space to hold that stuff. Simple math here:
At 1080p, that's 1920 x 1080 x 32 bits x 144 frames per second = 9,555,148,800 bits/sec, or roughly 10 Gbps. That's about 1.2 GB per second of disk space; round it down to 1 GB for easy math. Your 1 TB disk is 1000 GB, or about 1000 seconds of game time, roughly 16 minutes. At 4K, you'd get about 4 minutes before your 1 TB fills up at ~40 Gbps.

What? You do realize you don't have to save the video at 144 fps, right? Why would you with an i5, when you don't even get that many fps in the first place, lol!

You want to tell me this stuff is not going through the CPU and/or GPU codecs to compress it down to MP4s/AVIs, etc.? LOL. It is well known that rotational disks can't keep up with 10 Gbps writes. Even SSDs can't sustain 10 Gbps writes for very long. Video recording is the same as streaming; it's just that instead of a file handle to a storage location on disk, it is a network socket to some network storage location. We already know you are full of _________, but now we have proof positive that you do NOT know what the heck you are talking about.

Yes, you are right, because you absolutely have to write and play from the same disk. I'm convinced, you are just trolling. No way you are that dumb. I just don't believe it.
 
... Whether YOU personally deem it to be "too little too late" is completely irrelevant. ....

Too little too late is absolutely relevant. Time and money are always linked. Any sane person can understand this, and it is NOT just me who can see that the marketing malarkey about "moar cores" is just smoke and mirrors. Everyone knows that Patton said:
"A good plan, violently executed now, is better than a perfect plan next week."
You can't wait till later, when later has better bang-for-the-buck solutions, and you definitely do NOT pay more now for that later. Why do you think every gamer worth their salt knows "zerging" works great? The same applies to real life and getting the best bang for the buck.
 
Fact is, Intel is OVERPRICED. NO disputing it. AMD pricing on par with Intel makes them OVERPRICED too.

But they are not pricing on par with Intel. They are offering BETTER products for the same price. The R5 1600, for example, obliterates even the i7 7700k in workstation applications, offers a better gaming experience than the i5 7600k, comes with a cooler, can work on a cheap mobo, and has long-term support for the socket. No arguments here, it's absolutely a steal for the price.
 
You can't wait till later, when later has better bang-for-the-buck solutions, and you definitely do NOT pay more now for that later. Why do you think every gamer worth their salt knows "zerging" works great? The same applies to real life and getting the best bang for the buck.

If you plan to keep your CPU for a long time, I don't see how buying the best CPU for today is actually the better choice. I'm sure quite a lot of people would be cursing right now for buying a 3rd gen i3 or a Pentium G3258 instead of an FX 6300 / 6350. Especially considering that back in 2012 they both pretty much got the job done; they didn't lose anything going for the FX instead of an i3. Sure, the i3 was faster when benchmarked with a 780 Ti or something, but nobody would pair those two together. So what exactly would have been wrong back in 2012 with going for the FX instead? Absolutely nothing.

Also, you keep repeating that you pay more, but you don't. The top i3 cost 139€ and the FX 6350 was 140€. You didn't pay more.

Since you still have an i5 2500k you should know that already. People keep their CPUs for a long, long time. Personally, I don't; I change every 6 months to a year, but that's not the average.
 
What? You do realize you don't have to save the video at 144 fps, right? Why would you with an i5, when you don't even get that many fps in the first place, lol!

So you are admitting these videos are well doctored, then? If they don't record the original source material at 144 fps, how are we to believe that the video is anywhere close to being a reliable representation of the actual performance? Those YouTube videos are never representative of what you can actually see on an actual system in person.

Yes, you are right, because you absolutely have to write and play from the same disk. I'm convinced, you are just trolling. No way you are that dumb. I just don't believe it.

Same disk, different disk, it doesn't matter; show me a video capture that is saving the raw pixels uncompressed. If you have a 40 Gbps link to some remote network storage, you might be able to sustain that. Show me that these YouTubers are actually NOT compressing their videos in real time. FRAPS, OBS, etc. are all using compression of some sort in real time, even if it is not the lossy H.264 stuff, no different from the streamers.

That is why reputable sites, when they benchmark, don't record the videos on the same machine at the same time. They collect the data and present the benchmark data.
 
But they are not pricing on par with Intel. They are offering BETTER products for the same price. The R5 1600, for example, obliterates even the i7 7700k in workstation applications, offers a better gaming experience than the i5 7600k, comes with a cooler, can work on a cheap mobo, and has long-term support for the socket. No arguments here, it's absolutely a steal for the price.

We all get it, you love your R5 1600. But don't lie to people that paying 50% more for a 5% gain over the 1300x is a good way to spend money.
 
NO surprise here!
Reminds me of an old Indian friend, that's been everywhere and done everything.

Binder Dundat!

8800 GTS 320/640 MB to 8800 GTS 512 to 9800 GX2
8800 GT to 9800 GT
8800 GTX to 9800 GTX+ to GTS 250
8600 GT/GTS to 9600 GT to GT 240
I think the 9800 GX2 was a pair of modded 9800 GTs on the same PCB.

I had most of them, and still have a pair of the 9800 GX2, and a 9800 GT still like new,
then 3 GTX 280s, still running; they need to be cleaned like my GTX 480 just was.

ER UM, NO, prices did not drop; the cards just dropped a rung on the ladder and were replaced by a new top gun. And the milking went on & on &...........

Did Nvidia ever milk that G92 chip, huh?

WTF did you do that for, lol. Thanks for the memories. Not sure I wanna go through that again.
here we go!
GTX 1080 to GTX 1080+ to GTX 1090 or to 1080 GX2
GTX 1070 to GTX 1075
and on and on..

AMD is guilty of that shite as well. I have a bunch of ATI GPUs too, some rebrands.

I actually worked for Tiger Direct in those days. I still remember nVidia's excuse for re-branding the 8800GTX to the 9800GT as a different card: "Oh, the 8800 series is only PCI-Express v1.0 while the 9800 series is PCI-Express v2.0," as if that made much difference. People ate it up, though, because until ATi *****-slapped the nVidia GeForce GTX 260 with the Radeon HD 4870 at $150 less, all ATi had was the HD 3870, and that card was not the least bit competitive compared to even the 8800 series. I was so glad to get my hands on the HD 4870 because I had already learned enough about Intel and nVidia from working there to know that I wanted no part of them.
 
If you plan to keep your CPU for a long time, I don't see how buying the best CPU for today is actually the better choice. ....

Why limit yourself to "plan to keep" and "for a long time"? What's the point of the AM4 socket being kept around if you can't get better CPUs later to take advantage of it? Why limit your mentality with such nonsensical limitations? Buy the best bang-for-the-buck now and save the difference for the best bang-for-the-buck later.
 
So you are admitting these videos are well doctored, then? If they don't record the original source material at 144 fps, how are we to believe that the video is anywhere close to being a reliable representation of the actual performance? Those YouTube videos are never representative of what you can actually see on an actual system in person.

What are you freaking smoking? Even if it was recorded at 144 fps, you do realize YouTube only plays videos at 60, right? And I don't care about the videos; I'm just looking at the fps graph and the GPU usage graphs, that's all that actually matters.



That is why reputable sites, when they benchmark, don't record the videos on the same machine at the same time. They collect the data and present the benchmark data.
DF has a capture device, and the i5 7600k still gets demolished in CPU-heavy scenarios. Here, just like this one:


Here is what's happening. Throughout the whole run, not ONCE did the 7600k manage to push the 1060 to 99% usage! Which means a 1060 was bottlenecked by an i5 7600k. On the other hand, not ONCE did the 1060 on the R5 drop under 99%!! That's because, as I've said, and for some weird reason you don't want to accept, the i5 shits the bed when the CPU has to do the heavy lifting. That's a fact, sorry. R5 > i5
 
We all get it, you love your R5 1600. But don't lie to people that paying 50% more for a 5% gain over the 1300x is a good way to spend money.
The 1300x can't max out a 1070 or a 1080. So if you plan to buy one of those cards, the 1300x is a no-go. On the other hand, if you want to get a 1060, sure, the 1300x is fine. Where the heck did I say otherwise? Why are you putting words in my mouth?

And once again, what you are comparing is just completely dumb. Of course a CPU that's twice as expensive won't perform twice as fast; that's always the case no matter the company. A car that's twice as expensive won't go twice as fast, etcetera. If this is your idea of value for money, then the cheapest processor you can find is the best value and everything else is overpriced. That's, once again, bullshit.

The R5 1600 is an absolutely fantastic deal, every reviewer on planet Earth agrees with it.
 
...
Since you still have an i5 2500k you should know that already. People keep their CPUs for a long, long time. ...

Want to know why it was only recently retired from main/primary machine duty for a 7700k mini-ITX build? Because Intel has had zero competition all this time from AMD, and Intel changes their damn sockets way too much.

But I have more than a few machines; I am replacing a processor in one of them at least once every 2 years. Like the 1300x that just replaced the FX-8320, one of the worst _expletive goes here_ from AMD ever; that one lasted 3 years. I don't trust AMD for any CPU for the long run right now. At least my old i7-2600K and i5-2500K can still perform respectably today. My Haswell non-K i5, with no overclock, is still managing to bench better than the 1300x. Last year I got a Haswell i3 for my SFF mini cube, and that replaced the Athlon X2 Toledo SFF mini cube for the media center. There is always something to replace. It does NOT make sense to go all-in on any single build. There is always a place to reuse the older CPU, unless you got junk like the FX-8320.
 