Far Cry 6 Benchmarked

I think Nvidia would be better off spending the silicon space on CUDA cores rather than tensor cores, whose original purpose was machine learning for science.
AMD broke away from mixing the two and went its separate way (RDNA for gaming, CDNA for compute).
 
Appalling levels of performance all round considering the standard of visuals on display... very, very last-gen looking.

But that's what you get when a company like Ubisoft keeps flogging last-gen graphics engines like Dunia to death instead of moving over to their only semi-decent engine, Snowdrop, used in The Division games.

But of course, to do so would risk their pipeline of churning out the same old cut-and-paste games each year.

Gotta keep those institutional shareholders fed...
 
Dunno what other specs you're using to test this with, but with my MSI Z490 Gaming Edge mobo, an i9-10850K, and my MSI 1080 Ti Gaming X, I get an average of 85 fps with everything set to ultra, HD textures on, and even FidelityFX CAS turned on.

I posted this because I really hate how people seem to think the 1080 Ti is a bad card based on these reviews. When it comes from a quality maker like MSI, this card easily has 20-30% more power than the other versions out there. The motherboard is also 100% key to how a system runs, so benchmarks really need to list that information too, because if you have a $50 mobo, it doesn't matter what everything else is; games will still run like crap.
 
Appalling levels of performance all round considering the standard of visuals on display... very, very last-gen looking.

But that's what you get when a company like Ubisoft keeps flogging last-gen graphics engines like Dunia to death instead of moving over to their only semi-decent engine, Snowdrop, used in The Division games.

But of course, to do so would risk their pipeline of churning out the same old cut-and-paste games each year.

Gotta keep those institutional shareholders fed...
Avatar is going to use Snowdrop, and I can't wait to see/play it!

But on topic, I don't know what game people are playing, but the visuals in this game are amazing.
 
Dunno what other specs you're using to test this with, but with my MSI Z490 Gaming Edge mobo, an i9-10850K, and my MSI 1080 Ti Gaming X, I get an average of 85 fps with everything set to ultra, HD textures on, and even FidelityFX CAS turned on.

I posted this because I really hate how people seem to think the 1080 Ti is a bad card based on these reviews. When it comes from a quality maker like MSI, this card easily has 20-30% more power than the other versions out there. The motherboard is also 100% key to how a system runs, so benchmarks really need to list that information too, because if you have a $50 mobo, it doesn't matter what everything else is; games will still run like crap.
Do a search - when TechSpot switched to this benchmark rig, they gave the full specs... there were many who argued they should have stayed with Intel, as the majority of gamers used (and still use) Intel for gaming... There is no arguing that the 5950X is superior to any Intel rig... but... not for gaming...
 
Look at that - folks still get some solid 1080p performance out of an almost 6.5-year-old 980 Ti; it should be pulling a good 60 fps on a mix of medium/high settings. If I still had my 980 Ti, I'd be content with the performance she still brings...

Thankfully I was able to move on and now have a much more capable 1440p card, a 3060 Ti.

Performance looks pretty good all around, even on those older and less powerful GPUs. Too bad the last few Far Cry games were pretty crappy. I won't pick up this game; it's not worth my money and time.

 
I think Nvidia would be better off spending the silicon space on CUDA cores rather than tensor cores, whose original purpose was machine learning for science.
AMD broke away from mixing the two and went its separate way (RDNA for gaming, CDNA for compute).
They're needed for DLSS. They usually share ALU circuitry too.
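To make that concrete: tensor cores accelerate the low-precision matrix math that DLSS's upscaling network runs on, while ordinary shader work stays on the CUDA cores. Here's a minimal PyTorch sketch of the difference (an illustration, not how DLSS itself is implemented; it assumes a CUDA GPU with tensor cores, i.e. Volta/Turing or newer, and the matrix sizes and iteration counts are arbitrary):

```python
# Minimal sketch: the FP16 matmul below is eligible for tensor cores,
# while the FP32 one (with TF32 disabled) runs on the regular FP32 ALUs.
import time
import torch

torch.backends.cuda.matmul.allow_tf32 = False  # keep FP32 off tensor cores

def time_matmul(dtype, n=4096, iters=20):
    a = torch.randn(n, n, device="cuda", dtype=dtype)
    b = torch.randn(n, n, device="cuda", dtype=dtype)
    torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(iters):
        a @ b
    torch.cuda.synchronize()
    return (time.perf_counter() - start) / iters

fp32 = time_matmul(torch.float32)  # CUDA-core path
fp16 = time_matmul(torch.float16)  # tensor-core path on supported GPUs
print(f"FP32 {fp32*1e3:.1f} ms | FP16 {fp16*1e3:.1f} ms | speedup {fp32/fp16:.1f}x")
```

On tensor-core hardware the FP16 path typically comes out severalfold faster; that headroom is what DLSS spends.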
 
Why are you testing ultra after everything you said?

Steve says a lot of things that are pretty contradictory. For example, in the past, every time he wanted to "test" ray tracing he almost always went with Shadow of the Tomb Raider, yet here he says DXR shadows are basically pointless, and that's all Tomb Raider uses.

Meanwhile, he could have used a game like Control that ACTUALLY shows the difference something like DXR can make. But no, just like here, he's pretty adamant about not showing how much better Nvidia is at DXR (even if he admits it defeatedly when explaining why he doesn't care).

I heard over and over from him and others (MLID) how much better the RDNA 2 cards were gonna be, and yet here we are, with the only cards anyone actually cares about (based on user numbers) being the ones that have all the "worthless" technology.
 
Steve says a lot of things that are pretty contradictory. For example, in the past, every time he wanted to "test" ray tracing he almost always went with Shadow of the Tomb Raider, yet here he says DXR shadows are basically pointless, and that's all Tomb Raider uses.

Meanwhile, he could have used a game like Control that ACTUALLY shows the difference something like DXR can make. But no, just like here, he's pretty adamant about not showing how much better Nvidia is at DXR (even if he admits it defeatedly when explaining why he doesn't care).

I heard over and over from him and others (MLID) how much better the RDNA 2 cards were gonna be, and yet here we are, with the only cards anyone actually cares about (based on user numbers) being the ones that have all the "worthless" technology.
Pretty sure he only meant DXR was useless in THIS game...
 
Thank you for including VRAM usage and Medium preset results.
Even if you only include VRAM results for GPUs that really seem to use a lot of it, and when you think it may be hindering performance, I'd be more than okay with that!
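For anyone who wants to watch their own VRAM usage while the game runs, nvidia-smi already exposes it. A rough sketch of a poller (Nvidia-only, single GPU assumed, and it requires nvidia-smi on your PATH):

```python
# Rough sketch: poll VRAM usage once a second via nvidia-smi.
# Nvidia-only; assumes one GPU and nvidia-smi on the PATH.
import subprocess
import time

while True:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = (int(x) for x in out.strip().split(", "))
    print(f"VRAM: {used} / {total} MiB")
    time.sleep(1.0)
```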
 
Can we get an Ultra/High comparison... and some fps numbers for the High settings? After the previous article, it would be quite interesting.
 
"In our opinion, Far Cry 6 looks better than Assassin’s Creed Valhalla"
Sure, but does it look better than Assassin's Creed Odyssey? :laughing:

In any case, I'm VERY pleasantly surprised to see where my RX 5700 XT lands on this list. I didn't expect that it would ever beat the RTX 2080, let alone in such a complex open-world AAA title like Far Cry. :D
 
Why are you testing ultra after everything you said?
For this test, Steve's looking to measure ABSOLUTE GPU HORSEPOWER, and the only way to do that is to load the GPU up with every last thing that you can. This is the way ALL GPU benchmarks have always been done.
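And for what it's worth, the headline numbers in charts like these are just arithmetic over the captured frame times. A quick sketch of one common way to compute them (the frame-time list here is made-up data; real tools capture it with PresentMon or similar):

```python
# Quick sketch: average fps and "1% lows" from captured frame times
# (in milliseconds). This uses one common definition of 1% lows: the
# average frame rate across the slowest 1% of frames.
def summarize(frame_times_ms):
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    worst = sorted(frame_times_ms, reverse=True)
    slowest = worst[: max(1, len(worst) // 100)]
    low_1pct = 1000 * len(slowest) / sum(slowest)
    return avg_fps, low_1pct

# Made-up capture: mostly ~60 fps with occasional 40 ms stutters.
frame_times = [16.7] * 980 + [40.0] * 20
avg, low = summarize(frame_times)
print(f"avg: {avg:.1f} fps, 1% low: {low:.1f} fps")
```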
I have to say that I'm proud of the Nvidia GTX 1080 Ti; four years later, it can still nearly muster 60 FPS @ 1440p on the Ultra settings in a brand-new game!
There can be no doubt that the longevity of the GTX 1080 Ti is amazing. It rivals that of the R9 Fury. Very few cards remain viable for as long as those two have.
I posted this because I really hate how people seem to think the 1080 Ti is a bad card based on these reviews...
You won't hear that from me. I absolutely despise Nvidia as a company, but I would never say that their products are bad. The GTX 1080 Ti has had an amazingly long lifespan. The two cards I've seen with the most longevity are the GTX 1080 Ti and the R9 Fury; both can still game to this day.
Would be interesting to see CPU benchmark test for this game as well.
I'm sure they have that in the pipeline.
So will the 3080 play this properly with the HD Texture pack? Does anybody know?
Of course it will. Steve was using the HD Texture pack for the testing.
 
Would be interesting to see CPU performance/scaling. I really hope they optimized the game to run on multi-core CPUs better than Far Cry 5, since apparently my Ryzen 5 4600H was bottlenecking my RTX 2060:
CPU usage: 50%
GPU usage: 70%
Framerate: All over the place.
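A quick way to sanity-check a bottleneck like that is to log per-core CPU load next to GPU utilization while you play, since an overall "50% CPU" can hide one maxed-out game thread. A rough sketch (assumes the psutil package is installed and nvidia-smi is on the PATH):

```python
# Rough sketch: log per-core CPU load next to GPU utilization to spot
# a CPU bottleneck (one pegged core + GPU well under 100% is the tell).
# Assumes psutil is installed and nvidia-smi is on the PATH.
import subprocess

import psutil

for _ in range(30):  # sample roughly once a second for 30 s
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    gpu = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    ).strip()
    print(f"CPU avg {sum(per_core)/len(per_core):5.1f}% | "
          f"busiest core {max(per_core):5.1f}% | GPU {gpu}%")
```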
 
"In our opinion, Far Cry 6 looks better than Assassin’s Creed Valhalla"
Sure, but does it look better than Assassin's Creed Odyssey? :laughing:

In any case, I'm VERY pleasantly surprised to see where my RX 5700 XT lands on this list. I didn't expect that it would ever beat the RTX 2080, let alone in such a complex open-world AAA title like Far Cry. :D
It's an AMD title, unfortunately.
 