@Steve @Julio Franco will you do a simple test to verify the results that some are getting of the Windows FCU improving FPS by quite a big percentage? Gamers Nexus did CPU-bound tests but not the GPU ones (yet).
If you want to talk about 7600K vs Ryzen, you're 6 months too late and on the wrong thread...
Again, you seem severely confused. If you're looking back 3 years now, the FX-8350 vs i5-4670K comparison does indeed show that the latter did better on new cards.
"All I'm trying to do is prove that the claim is wrong. The claim is: 'if a CPU does better at 720p in games than another CPU, then the first one will do better in future games as well.' The claim mentions nothing about CPU generation or anything else. Therefore my comparison (R5 1600 vs 7600K), which proves the claim wrong, is absolutely fine."

If you're looking forward to the future, the obvious comparison you're trying your hardest to pretend doesn't exist is i5-8600K vs R5 1600X. Which, yet again, IS THE WHOLE POINT OF THE ARTICLE.
"Nope, nobody said that."

Lot of people saying that Ryzen won? Baffling.

"Did you read the same posts?"

Did they read the same tests?

"No it's not indicative of future performance. It's only indicative of performance in CURRENT games with future graphics cards."

Especially when you look at 720p tests - something that can actually differentiate between these CPUs and indicate future performance.
So, instead of caring about the facts, you care about winning an e-argument. Great. Now tell me, since you bothered to go to every review, how exactly do you think you proved me wrong? You know there are like a trillion websites out there that show the opposite results? You know why? Because averages are entirely based on the games you are testing. I can take 10 games from 2012 and show that the 7600K is 30% faster than the 1600. I can also take some other 10 games (like BF1 multiplayer / Civ 6 / Crysis 3) and show that the 1600 is 30% faster than the 7600K. The fact is, the more modern a game is, the faster the 1600 gets. The heavier a scene, the more the 7600K plummets. Which was my point all freaking along: the Ryzens do better in more recent games!
Do you think that the i5's do better the more recent a game is, or are you just cherrypicking to win an e-argument?
PS1. Calling me a fanboy isn't an argument, sorry. I can call you a fanboy back and it gets us nowhere. But I surely must be doing a really lousy job at being a fanboy, because I'm suggesting people buy an 8400 instead of a 1600! Go figure.
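The point about averages depending entirely on the game list can be sketched with a quick calculation. This is a minimal Python sketch with made-up fps numbers (not taken from any review) purely to show the arithmetic:

```python
# Hypothetical numbers only -- illustrating how the game list drives the "average".
fps = {
    # game: (7600K fps, R5 1600 fps) -- made-up figures, not real benchmarks
    "older_title_a": (160, 120),
    "older_title_b": (150, 115),
    "modern_title_a": (90, 115),
    "modern_title_b": (85, 110),
}

def average_lead(games):
    """Average % lead of the 7600K over the 1600 across the chosen games."""
    leads = [(fps[g][0] / fps[g][1] - 1) * 100 for g in games]
    return sum(leads) / len(leads)

old_games = ["older_title_a", "older_title_b"]
new_games = ["modern_title_a", "modern_title_b"]

print(f"7600K lead on older titles:  {average_lead(old_games):+.1f}%")
print(f"7600K lead on modern titles: {average_lead(new_games):+.1f}%")
```

With these invented numbers, the same two CPUs "win" or "lose" by about 30% depending purely on which subset of games goes into the average.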
YouTube videos generally offer a much better indication of real gaming than benchmark runs. When running a benchmark, you don't actually play the game. That's a big difference, and it explains why YouTube is indeed a much better source than any benchmark result found on a "professional website".
"The only thing you've "proven" is that you're still incapable of understanding why 720p benchmarks exist. Likewise, "lack of proof of the future is proof of a negative" is a logical fallacy, not some 'clever' argument."

I want to talk about the claim that 720p benchmarks are indicative of future performance. I proved it wrong simply by comparing two CPUs. All I'm trying to do is prove that the claim is wrong. The claim is: "if a CPU does better at 720p in games than another CPU, then the first one will do better in future games as well."
Also, 720p is the best indication of future performance out of all the tests. Testing in that fashion puts the “bottleneck” on the CPU. Using the logic that a faster chip today will be a faster chip in the future you can come to the conclusion that the faster chips today will last longer. Or are you suggesting that the slower chips at 720p today will actually end up being faster in future titles?
Yea, not one single person believes that other than those who wear AMD PJs to bed, but nice try: YouTube fanboys are better reviewers than an entire slate of professional review websites.
"Except it has been happening."

Right along with a Half Life 3 release...

"Over the next year, we'll see games that will utilize multiple cores for various purposes other than simply running the game engine."
C'mon man, the 'more cores and future-proof' angle again?
Maybe games will use more than 4-8 cores/threads in the next few years; sure would be nice. Good luck convincing people it's going to happen.
"Where can you find a Ryzen 5 1600 for $169.99?"

Back in real life, where people use 1080p, 2560x1080, and 2560x1440 FreeSync monitors, excellent AM4 motherboards were 20% off at Newegg (only $60 shipped for ASRock's excellent AB350M Pro4) and the Ryzen 5 1600 was $169.99. But let's go ahead and do everything we can to pretend Intel is still relevant in price/performance...
"Nice forgetting that OC'ing isn't guaranteed and costs additional cash for a cooler."

Sure, you can speculate based on past and current results, and that's exactly what I did. The 7600K wipes the floor with the R5 1600X in older games. In current games, as you've said, they are close, with a slight advantage to the R5 1600X in modern engines. So the argument about 720p / CPU overhead / future gaming performance is wrong.
There are benches on YouTube.
OK, benches on YouTube, discussion is over. No more to see here... but let's just check one or two or more professional sites before we take AMDfanboy1 living in mom's basement as the end-all, be-all on benches.
Does Tom's Hardware agree? Nope. The further right you go, the better the processor performs, and notice the gap widens once you OC the 7600K compared to an OC'd 1600.
Does Anandtech agree? Nope
I have $250, What Should I Get – the Core i5 7600/7600K or the Ryzen 5 1600X?
Platform wise, the Intel side can offer more features on Z270 over AM4, however AMD would point to the lower platform cost of B350 that could be invested elsewhere in a system.
On performance, for anyone wanting to do intense CPU work, the Ryzen gets a nod here. Twelve threads are hard to miss at this price point. For more punchy work, you need a high frequency i5 to take advantage of the IPC differences that Intel has.
For gaming, our DX12 titles show a plus for AMD in any CPU limited scenario, such as Civilization or Rise of the Tomb Raider in certain scenes. For e-Sports, and most games based on DX9 or DX11, the Intel CPU is still a win here.
https://www.anandtech.com/show/1124...x-vs-core-i5-review-twelve-threads-vs-four/17
Hey, maybe TechPowerUp agrees? Nope.
Surely PC Gamer (aka MaximumPC) agrees? Nope.
What about TechSpot, the very site you are posting on? Nope.
So, other than every professional website out there not agreeing with you, you are correct because of a "YouTube benchmark".
I love reading fanboy hissy fits in the morning, they read like...victory.
The only thing you've "proven" is that you're still incapable of understanding why 720p benchmarks exist. Likewise, "lack of proof of the future is proof of a negative" is a logical fallacy, not some 'clever' argument.
The real claim (observation, rather) is: "For over 20 years, people have benchmarked at lower-than-normal play resolutions to eliminate GPU bottlenecks. By doing this, the difference in fps between that and the normal resolution gives an idea of how much overhead you have when upgrading the GPU but keeping the same CPU 1-2 years later."
So yes, that IS being indicative of future performance. That's the whole point and sole reason to do it, and why people have been doing it for 20 years, going back to 640x480 benchmarks when 800x600-1024x768 CRTs were the norm. It's perfectly valuable data that shows how much more a CPU can do in the future without a GPU bottleneck (which exists even at 1080p). The wider the 720p vs 1080p gap, the more headroom; conversely, a game which is hitting, say, 112fps at both resolutions is already CPU-bottlenecked (so a GPU upgrade will add nothing). It is absolutely "indicative of future performance" on the same CPU as GPUs improve. That's not even a "claim", but a simple observation of reality...
As for threading in future games, it'll continue to be the same mixed bag we've always had. At one extreme you've got Crysis 3, the other extreme Starcraft 2, and most lying somewhere in the middle (BF1, Overwatch, etc). As explained previously, this isn't going to change any time soon or continuously magically scale every 3 years for the simple reason that AAA cross-platform games are ultimately designed for and constrained by consoles (6-7 usable out of 8x Jaguar cores). That's why there's a far bigger jump from 4C to 6C but little from 6C/12T 1600X vs 8C/16T 1800X or 16C/32T 1950X. It's why in Youtube vids, games that "use" 12 threads often have as little as 4-5% utilization on half the cores. This isn't going to continue to scale every 3 years until you see 16 core games consoles. And yes, that's a future prediction you absolutely can "take to the bank" because no game dev in their right mind is going to throw away 80-90% of their 2019 AAA sales just because someone bought a Threadripper and wants it to become the minimum new standard. That's not how the real world works at all regardless of enthusiast PC hardware.
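The 720p-headroom reasoning above can be sketched as a quick calculation. The fps figures below are hypothetical, chosen only to illustrate the two cases (a wide gap vs already CPU-bound):

```python
# Hypothetical fps numbers -- sketching the "720p gap = CPU headroom" idea.
# If a game runs much faster at 720p than at 1080p, the CPU has headroom left
# for a future, faster GPU; if the two match, the CPU is already the cap.
games = {
    # game: (fps_720p, fps_1080p) -- made-up figures
    "game_a": (180, 120),
    "game_b": (112, 112),
}

for name, (fps_720, fps_1080) in games.items():
    headroom = (fps_720 / fps_1080 - 1) * 100
    status = ("CPU-bound already; a GPU upgrade adds nothing"
              if headroom < 5
              else f"{headroom:.0f}% headroom for a GPU upgrade")
    print(f"{name}: {status}")
```

Here game_a shows 50% headroom, while game_b matches the "112fps at both resolutions" example from the post: zero headroom, CPU-bottlenecked today.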
"Meh... maybe a few more than 5 years ago, nothing crazy."

Except it has been happening. Games are using more and more cores every year.
"Ryzen CPUs are the slowest CPUs in gaming."

It makes little sense to splash out extra cash for any Intel CPU over a Ryzen 5 1600/X if you are just gaming.
"It's going to change and here's good reason:"

Unfortunately, static marketing slides of "DX12 Profiling" are about as "useful" as static marketing slides of Mantle or AoTS scripted draw call saturation tests were at predicting +1300% increases in frame rates going from DX11 to DX12...
Nice forgetting that OC'ing isn't guaranteed and costs additional cash for a cooler.
Also, you are ignoring the fact that basically every Intel CPU needs an aftermarket cooler out of the box, vs AMD being able to run with the stock coolers (except for the models which do not come with them).
The FPS comparisons done here are pointless: Intel is pulling ~10% more FPS for 100% more price. This has been the case since Ryzen came out.
Nothing changed with the new release, except that ***** Intel fanboys have to buy new motherboards again.
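The price/performance arithmetic behind that claim can be sketched like this. The prices and fps below are assumed round numbers for illustration, not quoted benchmarks:

```python
# Hypothetical prices and fps -- illustrating the "~10% more FPS for 100% more
# price" claim above with made-up round numbers.
intel = {"price": 340, "fps": 110}
ryzen = {"price": 170, "fps": 100}

fps_premium = (intel["fps"] / ryzen["fps"] - 1) * 100        # extra fps, in %
price_premium = (intel["price"] / ryzen["price"] - 1) * 100  # extra cost, in %

print(f"~{fps_premium:.0f}% more fps for ~{price_premium:.0f}% more money")
print(f"fps per dollar: Intel {intel['fps'] / intel['price']:.2f}, "
      f"Ryzen {ryzen['fps'] / ryzen['price']:.2f}")
```

With these assumed figures, the cheaper chip delivers nearly twice the fps per dollar, which is the whole point being argued.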
Meh...maybe a few more then 5 years ago, nothing crazy.
The 7700K smokes everything in every game and it's 4C/8T.
My i7-930 from 7 years ago is 4C/8T too.
Unfortunately, static marketing slides of "DX12 Profiling" are about as "useful" as static marketing slides of Mantle or AoTS scripted draw call saturation tests were at predicting +1300% increases in frame-rates going from DX11 to DX12...
Just out of curiosity, do the people who say that 720p performance is not an indicator of future performance think that the chips which perform worse at 720p now will perform better than their competition at 720p in the future?
Because that is ridiculous.
Of course 720p testing isn't 100% accurate for the future. But it does show how fast a game can run when the limiting factor is the CPU. It is the best possible way of determining how fast a CPU can run games. Typically, if chip A is faster than chip B at any given task in year X, then chip A will be faster than chip B in year X+3. It's not difficult logic.
3) Current Intel owners don't need new motherboards; their chips are already faster than Ryzen. Future Intel owners need a new mobo.
Ryzen 2 will be faster than current Intel CPUs, and current Ryzen users won't need a new motherboard for that. Intel users, however, will need a new motherboard for CPUs faster than Ryzen 2.
Show me the benchmarks of Ryzen 2 beating an Intel processor...
"i7-8700K is faster than i7-7700K and it's 6C/12T..."

Not really. I just looked at the benches in this article, and the 7700K and 8700K are almost identical besides a few games.