Ryzen 7 1700 vs. Core i7-7820X: 8-Core Royal Rumble

Nothing has changed.
If you're a gamer you're going to go with Intel nine times out of ten.
If you're a megatasker and use a lot of MT apps, then you're going to go with Ryzen if you're on a budget.

I have a serious problem with the idea of buying a part simply because it's close in performance and a bit cheaper.

Would I buy an Intel HEDT part over Ryzen? Probably not, but that's because I have no use for that performance. Would I buy Ryzen if I was gaming higher than 1080p? More than likely, but right now with my current setup, Ryzen does nothing for me.
 
And what benchmarks are you looking at?
blah blah blah...

There are multiple sources for information btw. See:

https://www.hardocp.com/article/2017/05/26/definitive_amd_ryzen_7_realworld_gaming_guide/13
And this is NOT me saying this:
"
Overall, the Intel Kaby Lake 7700K CPU at 5GHz Z270 system provided the highest performance while gaming. Didn’t matter if it was single-GPU, multi-GPU, 1080p, or 1440p, or 4K, the most wins (at least in terms of raw data) are with the 7700K at an overclocked 5GHz.

Overall, the AMD Ryzen 7 1700X at an overclocked 4GHz provided the same performance and gameplay experience as the Intel 2600K on Z68 at 4.5GHz. It was most competitive with the 2600K CPU with both overclocked to the highest levels.

In terms of gameplay experience we felt the 2600K and Ryzen CPUs "felt" the same while gaming in single-GPU at any resolution. We "felt" the 7700K at 5GHz had an experience advantage at all resolutions, and especially with multi-GPU CrossFire.
"

The GPU bottleneck is real. Denial and self-delusion are your choice, but I would have to be stupid to fool myself that way. And what is so wrong in asking AMD to lower their prices so Intel will follow suit and lower theirs? I don't build server farms, and I have no immediate use for $800+ CPUs; beating way-overpriced CPUs with slightly less overpriced stuff is still overpriced.
 
I see a lot of text, a lot of speech...& I see inconsistent testing metrics. If you want to test CPU performance at 1080p, you don't use a GPU that's designed for 1080p (i.e. a GTX 1060)...you use a GPU that's designed to handle heavier resolutions like 1440p & 4K (i.e. a GTX 1080 Ti), to eliminate any potential limitations from the GPU.

I especially find those tests strange when they only show overclocked results for the chips. Even among gamers, the majority do not overclock their machines. We're talking performance comparisons here, after all. Comparing how well a NOS-equipped Ford Mustang performs against a NOS-equipped Dodge Charger may be interesting, but without the baseline comparison (i.e. non-NOS-equipped) you have no idea if the NOS even added anything. Same with overclocking: there have been a number of games benchmarked here on Techspot where their top-spot Core i7 saw little to no performance gain even when OC'd well above its stock clock. So when I don't see any testing done at stock clocks (or even at low-level overclocks), & I don't see any results showing how much effect the overclocking has, it's hard to take them at face value.

Especially when the results themselves end up being so close. Sorry, but when the spread from "first place" to "last place" in your testing is only a 5-10% gap, & especially if the gap is ~5 FPS, it's really hard to claim that any one particular CPU is an "undisputed" winner. And considering that those Kaby Lake chips are 5 generations newer than the Sandy Bridge chips, but even when overclocked to 10% faster speeds can't manage a consistent 10% margin over them... exactly what kind of improvement is Kaby Lake providing over a 6-year-old product? Shaving 4W off the TDP? That's really impressive...& yes, you'd hear the sarcasm if we were speaking face-to-face.
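
To put that overclocking question in numbers: a quick sanity check is to compare the relative clock increase against the relative FPS increase it actually produced. A minimal sketch of that arithmetic, using made-up figures rather than anything from a review:

[code]
# Rough "scaling efficiency" check: how much of a clock-speed increase
# actually shows up as extra FPS? All numbers below are made up for illustration.

stock_clock_ghz = 4.2    # hypothetical stock clock
oc_clock_ghz = 5.0       # hypothetical overclock
fps_stock = 120.0        # hypothetical average FPS at stock
fps_oc = 128.0           # hypothetical average FPS when overclocked

clock_gain = (oc_clock_ghz - stock_clock_ghz) / stock_clock_ghz  # ~19%
fps_gain = (fps_oc - fps_stock) / fps_stock                      # ~6.7%

# Well below 1.0 means the game is mostly limited by something other than
# CPU clock speed (GPU, memory, engine), so the overclock adds little.
scaling_efficiency = fps_gain / clock_gain

print(f"Clock gain: {clock_gain:.1%}, FPS gain: {fps_gain:.1%}, "
      f"scaling efficiency: {scaling_efficiency:.2f}")
[/code]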

I also find the tests suspect, since they don't seem to match other benchmarks. And no, I'm not talking about Techspot's Ryzen benchmarks (or Tom's Hardware's, or any other site's)...I'm talking about the GPU benchmarks.
-- Take Battlefield 1, for example. The only place where their Intel Core i7 results seemed to match Techspot's benchmarks was the 4K test...except that HardOCP's GTX 1080 Ti was apparently running a lot slower, as its performance only matched Techspot's non-Ti model (https://www.techspot.com/review/1267-battlefield-1-benchmarks/page3.html); Techspot's GTX 1080 Ti running on a slightly slower i7-7700K (4.9GHz vs. 5GHz) was 12% faster at 4K (https://www.techspot.com/review/1352-nvidia-geforce-gtx-1080-ti/).
-- Same thing at 1440p. HardOCP's GTX 1080 (paired with their OC'd 5GHz i7-7700K) ran much slower than Techspot's GTX 1080 (paired with an i7-6700K that was only running at 4.5GHz). About 17% slower, to be exact (https://www.techspot.com/review/1267-battlefield-1-benchmarks/page2.html). And HardOCP's GTX 1060 was also running significantly slower (about 13% slower), as was their RX 480 (about 16% slower).
-- It's even worse in DOOM. There's a 37% difference in performance at 1440p between HardOCP's & Techspot's benchmarks (& again, Techspot shows much better performance available). Their GTX 1060 performance was just as bad, running 40% slower than Techspot's.
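
For reference, gaps like those are just the relative difference between the two sites' average FPS for the same test. A minimal sketch of the arithmetic, with placeholder numbers rather than the actual HardOCP/Techspot data:

[code]
# Relative difference between two sites' average-FPS results for the same
# game/GPU/CPU combo. The FPS values are placeholders, not the real data.

def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster fps_a is than fps_b, as a percentage."""
    return (fps_a - fps_b) / fps_b * 100.0

techspot_fps = 75.0  # placeholder value
hardocp_fps = 67.0   # placeholder value

print(f"Techspot's result is {percent_faster(techspot_fps, hardocp_fps):.1f}% faster")
[/code]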

Now, maybe there's something about the settings that explains why HardOCP's performance numbers are so significantly lower than other benchmarks...but since they hardly gave any details about their benchmark system (unlike other sites, like Techspot...), it's harder to take their results at face value.
 
I see a lot of text, a lot of speech...& I see inconsistent testing metrics. If you want to test CPU performance at 1080p, you don't use a GPU that's designed for 1080p (i.e. a GTX 1060)...you use a GPU that's designed to handle heavier resolutions like 1440p & 4K (i.e. a GTX 1080 Ti), to eliminate any potential limitations from the GPU....

Blah Blah Blah

You obviously have reading comprehension problems or selective memory. This is just one example data set randomly picked:
https://www.hardocp.com/article/2017/05/26/definitive_amd_ryzen_7_realworld_gaming_guide/10

See, the GTX 1080 Ti is clearly listed.
 
As for pricing...why should AMD worry about reducing their prices?
-- Ryzen 3 beats the i3-7350K in performance and is much cheaper (running $39USD cheaper for the 1200 & $19 cheaper for the 1300X), & is right behind the i5-7400 in performance while being significantly cheaper ($46-66USD cheaper)
-- Ryzen 5 is roughly between the i5-7500 & i5-7600K in performance. The 4C/8T models are cheaper than the i5-7500 ($11USD for the 1500X, $31USD for the 1400), & even the 6C/12T 1600 is cheaper than the i5-7600K ($27USD cheaper for the 1600, $48USD cheaper for the 1500X, & $68USD cheaper for the 1400).
-- Ryzen 7 may lag slightly behind the i7-7700K, but you can save something like $34USD by opting for the 1700 model (which overclocks about as far as the 1800X can reach).
-- None of those prices include the savings between the price of a B350 board (minimum to overclock a Ryzen CPU) & a Z270 board (required to overclock a Kaby Lake CPU). If you go on the cheap end for both, sure, it's only ~$20USD...but on the other end, you might see the savings go up to $50USD or more. At that point, combined with the CPU savings, you're going to have a chance to improve other hardware in your system: better/more storage, faster RAM, maybe a better GPU, etc.

AMD doesn't have to have rock-bottom prices. They just have to have a big enough price difference to match any performance gap (real or imagined)...& they're doing that right now; on the budget gaming side they're beating Intel in both performance and price.
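
To make that concrete, here's a rough tally of the platform savings using the deltas quoted above (the R5 1600 vs. i5-7600K CPU gap plus the B350-vs-Z270 board gap); the figures are the approximate ones from this post, not a fresh price check:

[code]
# Rough total-platform savings: CPU price gap plus motherboard price gap.
# Dollar figures are the approximate USD deltas quoted above, not a price check.

cpu_savings_1600_vs_7600k = 27   # R5 1600 vs. i5-7600K, from the list above
board_gap_low = 20               # cheap B350 vs. cheap Z270
board_gap_high = 50              # nicer boards, bigger gap

low_total = cpu_savings_1600_vs_7600k + board_gap_low
high_total = cpu_savings_1600_vs_7600k + board_gap_high

# That margin is what could go toward more storage, faster RAM, or a GPU bump.
print(f"Estimated platform savings: ${low_total} to ${high_total}")
[/code]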
 
[QUOTE="GreenNova343, post: 1622357, member: 386202"

I especially find those tests strange when they only show overclocked results for the chips. Even among gamers, the majority do not overclock their machines. ...[/QUOTE]

You have hard data to prove that? LOL. I was running a 2600K as my main until recently, and I still use it as my backup/secondary machine. I didn't start out overclocking it, but I do now. Actually, most gamers I know do overclock at some point, primarily to buy some time and delay that upgrade to get better pricing on the new stuff.
 
You obviously have reading comprehension problems or selective memory. This is just one example data set randomly picked:
https://www.hardocp.com/article/2017/05/26/definitive_amd_ryzen_7_realworld_gaming_guide/10

See, the GTX 1080 Ti is clearly listed.

Only at 4K; they tested 1440p with the non-TI, & 1080p was with the GTX 1060. But I guess you wanted to ignore that one, since it showed the Ryzen system beating the Sandy Bridge & matching the Kaby Lake.

It also further reinforces my argument about how Kaby Lake shows regression in performance. Again, we're talking about comparing a 2nd-gen i7 to a 7th-gen i7, with the latter clocked ~10% faster (5GHz vs. 4.5GHz; technically an 11.1% difference, but close enough for this purpose)...yet the best it managed in The Division was just under 5% better at 1440p? It couldn't even improve 2% over the Sandy Bridge at 1080p, & couldn't even get half a percent improvement at 4K? Again, you're basically cherry-picking the results that make a single argument look good, while refusing to look at the overall picture (let alone any results that contradict your argument), & refusing to consider any other argument that casts a negative light on Intel.
 
...
-- Ryzen 5 is roughly between the i5-7500 & i5-7600K in performance. The 4C/8T models are cheaper than the i5-7500 ($11USD for the 1500X, $31USD for the 1400), & even the 6C/12T 1600 is cheaper than the i5-7600K ($27USD cheaper for the 1600, $48USD cheaper for the 1500X, & $68USD cheaper for the 1400).
...

The Intel i3 is, was, and has always been a lame duck. Saying you're faster than a guy without legs doesn't win you any awards, nor does pricing that beats the i3 mean very much. AMD needs the R3 to replace their own old, obsolete FX Piledrivers, Visheras, etc.

And I do NOT know where you get your R5 vs. i5 price gaps. See:
http://www.microcenter.com/search/s...4294966995+4294845179+4294866854&myStore=true

The 1600X is $20 more than the 7600K @ $200.
The 1600 is only on par with the 7600 @ $190.
The 1500X and 1400 are obviously too expensive for the performance they provide compared to the 1300X. There is nothing to be gained from messing around with any CPU at the $150 price point right now. That is why I've said AMD could be the clear winner if they priced the 1600X at $150.
 


Only at 4K;........ couldn't even get half a percent improvement at 4K? Again, you're basically cherry-picking the results that make a single argument look good,....blah blah blah.

The simple fact that you obsess about 4K means you do NOT understand what you need to look at to see the CPU being the bottleneck for the GPU. 4K performance is GPU limited, not CPU limited, so of course you'd see roughly the same result.
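
The underlying point can be sketched with a toy model: the frame rate you observe is roughly capped by whichever of the CPU or GPU runs out of headroom first, so at 4K (where the GPU cap is lowest) two very different CPUs can post nearly identical numbers. A minimal sketch with invented caps, purely to illustrate the min() behaviour, not real benchmark data:

[code]
# Toy bottleneck model: observed FPS ~= min(CPU frame-rate cap, GPU frame-rate cap).
# All caps below are invented for illustration; they are not benchmark results.

cpu_caps = {"CPU A": 170, "CPU B": 130}  # frames/s each CPU can feed the GPU

# The GPU's cap falls as resolution rises; a future, faster GPU raises every cap.
gpu_caps_today = {"1080p": 200, "1440p": 140, "4K": 70}
gpu_caps_future = {res: cap * 2 for res, cap in gpu_caps_today.items()}  # hypothetical 2x GPU

for label, gpu_caps in (("today's GPU", gpu_caps_today), ("2x-faster GPU", gpu_caps_future)):
    print(label)
    for res, gpu_cap in gpu_caps.items():
        row = ", ".join(f"{cpu}: {min(cpu_cap, gpu_cap)} fps"
                        for cpu, cpu_cap in cpu_caps.items())
        print(f"  {res}: {row}")

# With today's GPU, both CPUs read ~70 fps at 4K (GPU-limited); with a much
# faster GPU, the CPU caps reappear even at the higher resolutions.
[/code]

Real games don't follow a hard min() of course, but it shows why 1080p exposes CPU differences that 4K hides.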

And it is not cherry-picking results. Here is more data from other sources:
http://www.legitreviews.com/cpu-bot...-on-amd-ryzen-versus-intel-kaby-lake_192585/4

Deny it all you want, the bottleneck is real. Read this from Techspot so you can be enlightened:
https://www.techspot.com/news/68407...ottlenecking-cpu-gaming-benchmarks-using.html

Because if you believe the GPU bottlenecks the performance, then an FX-8370 would be just as good as your R7 1800X. Going on about the 1080p resolution with the 1060 is actually you cherry-picking the results.
 
Wow, these new Intel Core X chips are quite a disappointment. I wouldn't buy one. But then again I wouldn't buy Ryzen either. The 7700K and the 7600K are the best chips out now, I reckon. Well, unless you happen to be among the less than 0.1% of users who will actually see more benefit from more cores than from better IPC.
 
Wow, these new Intel Core X chips are quite a disappointment. I wouldn't buy one. But then again I wouldn't buy Ryzen either. The 7700K and the 7600K are the best chips out now, I reckon. Well, unless you happen to be among the less than 0.1% of users who will actually see more benefit from more cores than from better IPC.

Well, as you can read from this article linked previously: https://www.techspot.com/news/68407...ottlenecking-cpu-gaming-benchmarks-using.html

For the longest time myself and other respected tech reviewers claimed that all PC gamers need is a Core i5 processor as you reach a point of diminishing returns with a Core i7 when gaming. This was true about a year ago and there wasn't much evidence that suggested otherwise. Of course, we often noted that things would no doubt change in the future, we just didn't know when that change would happen.

A year later things have changed. A few games are indeed more demanding, but the biggest change is seen on the GPUs. The GTX 980 and Fury graphics cards were considered real weapons a year ago, but today they can be considered mid-range with graphics chips such as the Titan XP and soon to be released GTX 1080 Ti delivering over twice as much performance in many cases.

This is the problem with almost every tech editor. Unless they get clear evidence about something, they just ignore it. In that article, Steve basically admits he screwed up his recommendations very badly by saying "get an i5 for gaming". Just a year after, the recommendation is suddenly an i7 "because faster graphics cards came out". Well, the fact that the transition from 28nm tech to 14/16nm tech allows much faster cards was no surprise at all.

The same problem comes to mind with Samsung's 840 and 840 Evo. Wise people, including me, asked why pay more for a TLC drive (840/840 Evo) than for a more reliable MLC drive. Well, back then there was no evidence that Samsung's TLC was unreliable. It was.

This shares the same kind of problem. I say that Ryzen 7 is a much better choice for gamers than any current quad core if we look X amount of time forward from today. The only problem is that there is no evidence for that right now. So I expect that after X time Steve will write that "X time ago there was no evidence that Ryzen 7 was a better choice than an i5/i7 quad core, but things have changed".

That X might well be a year or a little more.

This time I won't even bother to write my usual opinions about benchmarks, GPU bottlenecking, etc.
 
Wow, these new Intel Core X chips are quite a disappointment. I wouldn't buy one. But then again I wouldn't buy Ryzen either. The 7700K and the 7600K are the best chips out now, I reckon. Well, unless you happen to be among the less than 0.1% of users who will actually see more benefit from more cores than from better IPC.

If you buy an i5 in 2017 you are dumb, period. The i7 has double the threads for 40% more money, and the i3 has the same number of threads for half the money.

i5s are reserved for those who didn't get the memo when BF4 came out.
 
... Steve basically admits he screwed up his recommendations very badly by saying "get an i5 for gaming". Just a year after, the recommendation is suddenly an i7 "because faster graphics cards came out". .....

In what way did Steve screw up? The i5 recommendations were on every site, everywhere, for everyone to see; it was always about the best bang for the buck based on the gaming FPS metric relative to the competition at the time. And until the R3 came out, there was not a CPU cheaper than the 7600K that could provide better gaming value. The R5s are not better performing for games and they cost too much relative to the i5. The R3s could still use another $30 reduction so that the 1300X is $100, to eliminate the FX-PileOf_____Drivers, Visheras, etc. from the market. The R3 is also further proof that the R5/R7 are way overpriced. See:
http://cpu.userbenchmark.com/Compare/AMD-Ryzen-7-1800X-vs-AMD-Ryzen-3-1300X/3916vs3930

At current market prices, the R3 is effectively 85% as fast and yet the 1800X is at 3x the price. This demonstrates AMD's wishful thinking (or deceit, depending on how people want to look at this), trying to milk the AMD fanboys for donations. Intel is not known for charity, but the price premium of the 7700K @ $300 vs. the 7600K @ $200 is only 1.5x. And compared to the obviously poor value of an i3-7350K, you get a 3x price multiplier vs. the 7700K, but the i3 is only 66% as fast as a 7700K. See:
http://cpu.userbenchmark.com/Compare/Intel-Core-i7-7700K-vs-Intel-Core-i3-7350K/3647vs3889
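
For what it's worth, that value argument reduces to performance per dollar. A tiny calculation using only the ratios quoted in this post (85% of the speed at roughly a third of the price), not any independent benchmark:

[code]
# Performance per dollar using only the ratios claimed in this thread:
# "the R3 is ~85% as fast" while "the 1800X is at 3x the price".
# These are the poster's figures, not independent measurements.

r3_perf, r3_price = 0.85, 1.0   # 1300X, normalized
r7_perf, r7_price = 1.00, 3.0   # 1800X, ~3x the 1300X's price

r3_value = r3_perf / r3_price   # ~0.85
r7_value = r7_perf / r7_price   # ~0.33

print(f"R3 1300X perf/price: {r3_value:.2f}")
print(f"R7 1800X perf/price: {r7_value:.2f}")
print(f"By these figures the R3 delivers ~{r3_value / r7_value:.1f}x the performance per dollar")
[/code]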

Steve has not messed up in any way, shape, or form by recommending the i5. It was already August before the R3 finally became available; even for 2017, the i5 was your best gaming value for two-thirds of the year. The super budget build now should be something like an R3 + GTX 1060 (or $200 equivalent or better), good for all current 1080p gaming tasks.
 
In what way did Steve screw up? The i5 recommendations were on every site, everywhere, for everyone to see... The R3 is also further proof that the R5/R7 are way overpriced. See:
http://cpu.userbenchmark.com/Compare/AMD-Ryzen-7-1800X-vs-AMD-Ryzen-3-1300X/3916vs3930

Then every site screwed up. Most people buy desktops for many years. So if the CPU recommendation radically changes in one year's time when nothing unexpected happens, those sites simply had bad recommendations.

In what way are the R5/R7 overpriced? They're much better choices for gaming than any i5 and offer a much better overall platform.

Don't even bother to look at Userbenchmark.com. That benchmark simply sucks.

At current market prices, the R3 is effectively 85% as fast and yet the 1800X is at 3x the price. ... The price premium of the 7700K @ $300 vs. the 7600K @ $200 is only 1.5x. And compared to the obviously poor value of an i3-7350K, you get a 3x price multiplier vs. the 7700K, but the i3 is only 66% as fast as a 7700K. See:
http://cpu.userbenchmark.com/Compare/Intel-Core-i7-7700K-vs-Intel-Core-i3-7350K/3647vs3889

Eh? The 7600K is a quad core and the 7700K is a quad core, so the price premium should be minimal. The R3 is a quad core and the R7 is an octa core, so a 3x price premium is very little. So the price premium for the 7700K is much larger than for the 1800X.

Again, using some crapbenchmark.com results to say an octa core is overpriced vs. a quad core "(y)"

Steve has not messed up in any way, shape, or form by recommending the i5. It was already August before the R3 finally became available; even for 2017, the i5 was your best gaming value for two-thirds of the year. ...

Read again what Steve actually wrote. He admitted that today the i7 is better than the i5, and a year before the i5 was the better choice. And because the reason for this was "much better graphics cards" (no surprise to anyone), we can say the recommendations were pretty badly screwed up.
 
...
Don't even bother to look at Userbenchmark.com. That benchmark simply sucks.

.... blah
Again, using some crapbenchmark.com results to say an octa core is overpriced vs. a quad core "(y)"
... blah.

No one got screwed saving $100 on an i5. They still don't bottleneck your video card anywhere close to as hard as Ryzens will. Sure, diss Userbenchmark, when over 20 million users have provided good, valid crowd-sourced data. The person cherry-picking stats is you.

Your extra cores are in park and idling when games are running. They don't help games in any significant manner. Paying extra for stuff you are not using is the epitome of being conned, just like the supersized McDonald's meal is not actually the best value for most people. As someone famously said, "we don't play Cinebench": extra cores are nice, but not nice enough to pay so much more for them, and they do NOT help games. Remember back when AMD brought out 64-bit CPUs: did they charge us extra for the extra 32 bits on those Socket 939 Athlons? They did NOT; they priced competitively against Intel. But now we are supposed to pay more for their slower cores just because they are shoving them at us. LOL! Either provide an unrivaled performance win across the board, or provide unrivaled value by pricing significantly less than your competition. Failing to do either makes it overpriced.
 
No one got screwed saving $100 on an i5. They still don't bottleneck your video card anywhere close to as hard as Ryzens will. Sure, diss Userbenchmark, when over 20 million users have provided good, valid crowd-sourced data. The person cherry-picking stats is you.

Basically, when buying a CPU last year for today, the i7 was the better choice, but the article stated otherwise.

What these tech sites do well is benchmarking. Do you really believe 20 million users know anything about benchmarking? No background tasks, clean Windows install, latest drivers, etc.? I don't.

Your extra cores are in park and idling when games are running. They don't help games in any significant manner. Paying extra for stuff you are not using is the epitome of being conned... Either provide an unrivaled performance win across the board, or provide unrivaled value by pricing significantly less than your competition. Failing to do either makes it overpriced.

What's your source for that "extra cores don't help games"? No, do not offer any benchmarks; offer real-world situations. I doubt you have any.

Also, comparing octa cores against quad cores price-wise simply does not work. There are good reasons why octa cores are much more expensive than quad cores.

About pricing: right now the Ryzen 5 1600X is around $70 cheaper than the i7-7700K, and that Ryzen 5 is a much better choice for a gaming machine. Some may disagree because of benchmarks and "no evidence", but the quote I posted applies to this situation very well with a few adjustments:

For the longest time myself and other respected tech reviewers claimed that all PC gamers need is a Core i7 processor as you reach a point of diminishing returns with a Ryzen 5 hexa core when gaming. This was true about a year ago and there wasn't much evidence that suggested otherwise. Of course, we often noted that things would no doubt change in the future, we just didn't know when that change would happen.

A year later things have changed.

They claim now that a quad-core i7 is better for games than the Ryzen 5 1600X, but a year later they will say things have changed. It always happens this way, as I and many others know these things much better than any hardware site author who relies only on benchmarks.
 
....

About pricing: right now the Ryzen 5 1600X is around $70 cheaper than the i7-7700K, and that Ryzen 5 is a much better choice for a gaming machine. Some may disagree because of benchmarks and "no evidence", but the quote I posted applies to this situation very well with a few adjustments:

....

They claim now that a quad-core i7 is better for games than the Ryzen 5 1600X, but a year later they will say things have changed. It always happens this way, as I and many others know these things much better than any hardware site author who relies only on benchmarks.

The R5 has to be more than $70 cheaper to even justify any consideration. If it is not the fastest, it needs to be much cheaper, and the R3 makes the R5 pricing completely unjustifiable. Deny the benchmarks all you want, but here is more data, and the 7700K is on top for gaming, full stop. More cores are proven to be of no help from these well-known facts alone:

http://www.gamersnexus.net/hwreviews/2875-amd-r5-1600x-1500x-review-fading-i5-argument/page-4

The 1600X is not faster than the i5-7600K either, so AMD must make up for it with price. But with the introduction of the R3, there are really no cost/performance gains to be gotten, or justifications for any CPU slower than the 7700K and more expensive than the R3s. The muddled middle now is just a murky way to waste more money.

And you claim to know the future, but you don't, and the GPU bottleneck timebomb hiding in the R3, R5, and R7 is going to bite in a couple of years from now, just like how my FX-8320 is bottlenecking my GTX 970 even though it is newer than my i7-2600K. The 1300X would be better off $30 cheaper; then it would be like the $100 Athlon XP back in the day, where you would NOT mind replacing it in a year or two, just like how I replaced those Socket A chips, the 1800+ with the 2400+ and then the 2800+, because you maximize the value of the platform and never have to spend more than $100 on the CPU, and that pairs well with GPU upgrades. With the 1600X being $20 more than the 7600K at well over $220, this upgrade path is a complete non-starter. The facts and reality do NOT match your delusions.
 
The R5 has to be more than $70 cheaper to even justify any consideration. ... The 7700K is on top for gaming, full stop. More cores are proven to be of no help from these well-known facts alone:

http://www.gamersnexus.net/hwreviews/2875-amd-r5-1600x-1500x-review-fading-i5-argument/page-4

The 1600X is not faster than the i5-7600K either, so AMD must make up for it with price. ... The muddled middle now is just a murky way to waste more money.

Again, you are only looking at benchmarks and ignoring the fact that benchmarks are a different thing from real-world gaming. The i7-7700K is not enough for today's games, because even while benchmarking, CPU utilization is around 95% in heavy games. So in reality the R5 1600X is much better for gaming. No matter what the benchmarks say, that's a clear fact.

And you claim to know the future, but you don't, and the GPU bottleneck timebomb hiding in the R3, R5, and R7 is going to bite in a couple of years from now, just like how my FX-8320 is bottlenecking my GTX 970 even though it is newer than my i7-2600K. ... The facts and reality do NOT match your delusions.

Wtf is this "future GPU bottleneck" for the R7? There won't be a major GPU bottleneck for the 1800X for the next three years, guaranteed. As I already said, even the 7700K's CPU utilization is around 95% in heavy benchmark situations, so essentially all quad cores are useless for gaming in the near future. That's why the R5 1600X is a much better choice for a gaming CPU, assuming the same CPU is used next year too. You won't see this in benchmarks today, but that's why I know things better. I also look under the hood.
 
Again, you are only looking at benchmarks and ignoring the fact...
blah blah blah

The benchmarks are the real facts. The others are "alternative facts". The benchmarks already show Ryzen bottlenecking a GTX 1080 Ti at 1440p. As CPUs get faster, the bottleneck will move to 4K. If you are happy with 4K at roughly 60fps or less, then of course there is no issue; the GPU will bottleneck it all, you'll never notice the CPU limitations, and you can be happy with this for well beyond 3 years. Heck, you'd be happy to do this with an FX-9590, and it has 8 cores too.
 
The benchmarks are the real facts. The others are "alternative facts". The benchmarks already show Ryzen bottlenecking a GTX 1080 Ti at 1440p. As CPUs get faster, the bottleneck will move to 4K. ...

Do you play benchmarks? I don't. If you do, good luck then, but stop shouting benchmarks = games, because they are not. Also, benchmarks are very far from so-called real facts. Anyone who understands these things knows it.

If Ryzen is bottlenecking at 1440p, then the settings are low and the GPU is not doing its full job, very simple. Where did you get the idea that CPUs are getting faster, and why would that mean Ryzen will bottleneck 4K? That makes no sense.

A GPU bottleneck offers the best image quality, so that's what should be targeted.
 
Do you play benchmarks? I don't. ...
Where did you get the idea that CPUs are getting faster, and why would that mean Ryzen will bottleneck 4K? That makes no sense.

A GPU bottleneck offers the best image quality, so that's what should be targeted.

Obviously you didn't read:
https://www.techspot.com/news/68407...ottlenecking-cpu-gaming-benchmarks-using.html

And didn't see this either:
http://www.legitreviews.com/cpu-bot...ed-on-amd-ryzen-versus-intel-kaby-lake_192585
And a random sample of the stats included:
http://www.legitreviews.com/cpu-bot...-on-amd-ryzen-versus-intel-kaby-lake_192585/5

Ryzen is bottlenecking the GPU and limiting the GTX 1080 Ti already at 1440p. And it is about GPUs getting faster leading to the CPU being the bottleneck. The GPU bottlenecks are at 4K right now, short of multi-GPU setups. You insist on misreading the statements and misrepresenting facts. And of course no one just plays benchmarks, but benchmarks are the shared facts, the truth that people can discuss. If you have no benchmarks, then you are just making up opinions and perhaps even lies.
 
Obviously you didn't read:
https://www.techspot.com/news/68407...ottlenecking-cpu-gaming-benchmarks-using.html

And didn't see this either:
http://www.legitreviews.com/cpu-bot...ed-on-amd-ryzen-versus-intel-kaby-lake_192585
And a random sample of the stats included:
http://www.legitreviews.com/cpu-bot...-on-amd-ryzen-versus-intel-kaby-lake_192585/5

Ryzen is bottlenecking the GPU and limiting the GTX 1080 Ti already at 1440p. And it is about GPUs getting faster leading to the CPU being the bottleneck. ...

The first article does not contain any Ryzen 7 benchmarks at 1440p.

As for that other article, as I said, you probably should learn to READ articles instead of just looking at benchmarks.

For example, Thief is played on Normal quality (duh). For GTA V:

In Grand Theft Auto V we set the game to run with no MSAA with 16x AF and high image quality settings as we didn’t want the GPU to bottleneck

Deus Ex:

We picked to run just ‘Medium’ image quality settings due to how tough this game title is to render and we feel that most gamers will use this setting.

You talk about a GPU bottleneck while the article states they are avoiding a GPU bottleneck. LMAO :p

So again, you lost because you only looked at the benchmarks and forgot to read the text.

Here are more benchmarks you do NOT like. This one even did special benchmarks with streaming:
https://lanoc.org/review/cpus/7478-intel-i7-7700k-vs-amd-ryzen-r7-1700-in-gaming?showall=1

Still, Ryzen can't get any sort of clear win; those extra cores are not being utilized in any meaningful way. Therefore paying more for them is no different than paying more for a bigger shirt that now fits even worse.

They actually played Doom, which uses the future API Vulkan, and the Ryzen 7 1700 was as fast as the i7-7700K. So going forward even the Ryzen 7 1700 is as good as the i7-7700K, which makes a CPU like the Ryzen 5 1600X a much better choice than the i7-7700K for the future.
 
.....

You talk about a GPU bottleneck while the article states they are avoiding a GPU bottleneck. LMAO :p
...

You need to avoid the GPU bottleneck to see the CPU bottlenecking the GPU. You sure like to mince words, read between the lines, and create your own story. Go on denying the benchmark results that everyone can see. The trend is obvious: Ryzen bottlenecks at 1080p very heavily and very obviously, because there is no GPU bottleneck. At 1440p, the GPU limitation is more obvious, but not dominant enough to hide the CPU bottleneck. Then at 4K, the GPU bottleneck is all you get to measure, and it allows people like you to be blissfully ignorant about the CPU bottleneck and pretend it doesn't exist.

Roll forward two years, when the GTX 1080 Ti becomes a midrange GPU and you get a substantially faster GPU. Guess what: the bottleneck at 4K is not going to come from the GPU anymore, it will be imposed by the CPU, and Ryzen specifically will do worse for it. This is fact; you can SLI GTX 1080 Tis now and take a peek at that future.
 