Final Fantasy XV Mega CPU Battle

Thanks. Interesting article. I like that you included the A12-9800, but it would have been cool to test an FX-4350 too, just to see how an L3 cache affects performance when the number of cores is the same.
 
Thanks mate, I'll invest more time testing once the game comes out.
 
Well done, great test as always.

Those numbers, btw, are all over the place. The High quality results are what you'd expect, but the Lite quality results are a huge mystery. And what makes it a mystery is, frankly, the 8700K. I could accept that the game is perfectly multithreaded and so the 1600 and the 1800 are doing better than the 8400/8600K, but when even the 8700K is worse, something is wrong here.

Do you think it could have something to do with Spectre / Meltdown patches?
 
Thanks. Frankly I didn't expect to get a response to my wishful thinking. Much appreciated.

By then the new Ryzen APUs should be out, and that would add more interesting data points to the mix. (Are they on the way for review?)
 
It seems as though there is something wrong with the FFXV benchmark:

https://www.gamersnexus.net/game-be...ous-misleading-benchmark-tool-gameworks-tests

Not only are there problems with GameWorks, but there seem to be problems with culling as well.
^ This. From the article - "On our own, we’d discovered that HairWorks toggling (on/off) had performance impact in areas where no hair existed.... The benchmark is rendering creatures that use HairWorks even when they’re miles away from the character and the camera... Although we don't believe this to be intentional, the Final Fantasy XV benchmark is among the most misleading we’ve encountered in recent history. This is likely a result of restrictive development timelines and a resistance to delaying product launch and, ultimately, that developers see this as "just" a benchmark".

^ The HairWorks equivalent of a lack of occlusion culling is a pretty big one, right up there with Crysis 2's massive over-tessellation of hidden water surfaces, which virtually halved frame rates with nothing to show for it...

Honestly, I think the most appropriate response from tech sites to these half-broken releases (including self-publicity pre-release benchmarks) deliberately launched in a broken state due to "resistance to delaying product launch" is to give them no publicity at all (including not using pre-release benchmarks) until they do fix it, so they don't get rewarded over more responsible devs who do delay the launch date to iron out the bugs properly. At best, you're rewarding sh*tty development practises. At worst, we'll end up with the same benchmark producing two completely different scores with no apparent version numbers to compare, completely defeating the entire purpose of a benchmark...
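As an aside, the distance culling GN describes as missing is conceptually simple. Here's a minimal sketch (all names and the threshold are hypothetical, not the game's actual code) of gating an expensive per-entity effect by camera distance instead of simulating everything:

```python
import math

# Hypothetical threshold: beyond this distance, hair detail
# isn't visible on screen anyway, so skip the simulation.
CULL_DISTANCE = 150.0

def entities_to_simulate(entities, camera_pos):
    """Only run the expensive effect for entities near the camera."""
    return [e for e in entities
            if math.dist(e["pos"], camera_pos) <= CULL_DISTANCE]

camera = (0.0, 0.0, 0.0)
herd = [
    {"name": "near_beast", "pos": (10.0, 0.0, 5.0)},
    {"name": "far_beast", "pos": (2000.0, 0.0, 3000.0)},  # "miles away"
]
print([e["name"] for e in entities_to_simulate(herd, camera)])  # ['near_beast']
```

That's the whole idea: a cheap distance (or frustum/occlusion) test before paying for the effect. Skipping it means the benchmark simulates hair on creatures the camera can't even see.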
 

Yeah but we didn't realize there was an issue until we started testing! I did this testing 4 days ago and presented it in a video on the 3rd, a day before the Gamers Nexus findings. The day before Steve @ GN released that video I'd already decided we weren't going to invest time in GPU testing because the results didn't seem right and I was sure GameWorks was the culprit. I also knew AMD wouldn't release their official driver till the game came out so there wasn't much point anyway.

I have the new APUs and we've begun testing, so you'll see our in-depth review next week.
 
Great review; the game is obviously playable on any modern CPU. Nice to see the i3-8350 stock performance ahead of the Ryzens, since my 7600K is running at 4.8GHz.
 
I'm really looking forward to that. So happy to hear they're already with reviewers, and that it's not another 'we'd rather not get stuff into reviewers' hands' launch from AMD (which I guess was understandable with Bristol Ridge). I hope these APUs are going to rock, because I have my eye on a 2200G as an upgrade for my Phenom II X6 1090T + Radeon HD 5750.

Hopefully you're doing some RAM speed testing in that review (though it's okay by me if it will end up in a later article).
 
The 6 core Ryzen did really well.
Nice review Steve!

Looking forward to the Raven Ridge review. Can't wait to swap out my A12-9800!
I hope to see how CPU utilization looks during heavy video playback, as well as live-streaming performance.
 
Not sure you read the review correctly, as even the R5 1500X was head and shoulders above the 4/4 Intels.
Don't focus solely on the average FPS: the 4-threaded Intels take a huge hit on the 1% lows even with GameWorks disabled. This is where you will notice stuttering.
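For anyone unfamiliar with the metric, "1% lows" summarise the slowest frames rather than the average. A minimal sketch of how it's commonly computed from a frame-time log (hypothetical data; exact definitions vary between sites, e.g. averaging the worst 1% vs taking the 1st-percentile frametime):

```python
def fps_stats(frame_times_ms):
    """Average FPS and 1% low FPS from per-frame render times in milliseconds."""
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    # 1% low: average FPS over the slowest 1% of frames
    slowest = sorted(frame_times_ms, reverse=True)
    worst = slowest[:max(1, len(slowest) // 100)]
    low_fps = 1000.0 / (sum(worst) / len(worst))
    return avg_fps, low_fps

# 990 smooth ~60 FPS frames plus 10 stutter frames at 50 ms each
frames = [16.7] * 990 + [50.0] * 10
avg, low = fps_stats(frames)
print(f"avg: {avg:.1f} FPS, 1% low: {low:.1f} FPS")  # avg: 58.7 FPS, 1% low: 20.0 FPS
```

That's why two setups with near-identical averages can feel completely different in practice: the one with the worse 1% lows is the one that stutters.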
 
4k benches with a 1080ti please!

As long-haired and short-haired Steve pointed out, this game needs to fix some of the GameWorks crap before any meaningful GPU tests are done.

I am curious about VRAM consumption and performance when using both 8 and 16 GB of system RAM. It will be interesting to see what a hog this game is.
 
I did; I'm just not an AMD fanboy, so I read it "properly". For help with the definition of that word, see below:

http://www.dictionary.com/browse/properly
 

Not sure why you referenced "properly". Anyhow, it just seems like 4 cores really take a hit.

What is really surprising is that the 4/4 R3 looks better than the i3 despite having lower clocks.
 
If you are not going to thank me for correcting and teaching you, then don't quote my posts.
 
The benchmark tool is really disappointing: no 1440p support and terrible performance (I get some stuttering on my 7700K + 1080 Ti, both overclocked). I preordered the game, but it's better to wait for the final release and "game ready" drivers to judge performance.
 
The FX-8370 did well. I would like to see what it would do overclocked in your bench.

My FX-8370E at 4.50 GHz with a 290X falls in line in the charts with the 780 Ti, 970, 1060 and 580, but I have no idea what CPU was used; I wish they showed the complete system used.
 
The first thing I did when I heard that this game (which I was really hoping would come to PC) would feature GameWorks was to put it on my not-to-buy list. Japanese devs making a PC port while also adding GW? What could go wrong? :D /s
 
Ahead of which Ryzens? Is there a Ryzen lower than the 1200? Because it's behind even the 1200 once you remove the GPU bottleneck. Actually, it's not even close to the R3 1200, since it's 50% (yes, 50%!) behind when it comes to 1% minimums, never mind the bigger Ryzens. So wtf are you talking about?
 
Stock i7-2600K still on par with the lower-end Ryzens in this game. A 7-year-old CPU. Wonder how it would do overclocked...
 