Today we're taking a look at Far Cry 6 to see how it runs on a variety of PC hardware, so lots and lots of benchmarks including 30+ AMD and Nvidia GPUs at 3 resolutions and 2 quality presets.
> Appalling levels of performance all round considering the standard of visuals on display... very, very last-gen looking.
> But that's what you get when a company like Ubisoft keeps flogging last-gen graphics engines like Dunia to death instead of moving over to their only semi-decent engine, Snowdrop, used in The Division games.
> But of course, doing so would risk their pipeline of churning out the same old cut-and-paste games each year.
> Gotta keep those institutional shareholders fed...

Avatar is going to use Snowdrop, and I can't wait to see/play it!
> Dunno what the other specs are that you're using to test this with, but with my Z490 MSI Gaming Edge mobo, an i9-10850K, and my MSI 1080 Ti Gaming X, I get an average of 85 fps with everything set to Ultra, HD textures on, and even FidelityFX CAS turned on.

Do a search - when TechSpot switched to this benchmark rig, they gave the full specs... There were many who argued they should have stayed with Intel, as the majority of gamers used (and still use) Intel for gaming... There is no arguing that the 5950X is superior to any Intel rig... but not for gaming.
I posted this because I really hate how people seem to think the 1080 Ti is a bad card based off these reviews. This card, when made by a quality manufacturer like MSI, easily has 20-30% more power than the other versions out there. The motherboard is also 100% key to how a system will run, so benchmarks really need to state that information as well - if you have a $50 mobo, it doesn't matter what everything else is; games will still run like crap.
> I think Nvidia would be better off using the silicon space on CUDA cores, not these tensor cores whose original purpose was machine learning for science. AMD broke away from mixing the two and went its separate way.

It's needed for DLSS. They usually share ALU circuitry too.
> Why are you testing Ultra after everything you said?

Because it represents the worst case for performance. If your GPU can breeze through Ultra at your desired resolution, then you won't need to spend much time tweaking to find that sweet spot between performance and visuals.
> Steve says a lot of things that are pretty contradictory. For instance, in the past, every time he wanted to "test" ray tracing he almost always went with Shadow of the Tomb Raider, yet here he says DXR shadows are basically pointless - and that's all Tomb Raider uses.
> Meanwhile, he could have used a game like Control that ACTUALLY shows the difference something like DXR can make, but no - just like here, he's pretty adamant about not showing how much better Nvidia is at DXR (even if he admits it defeatedly when explaining why he doesn't care).
> I heard over and over from him and others (MLID) how much better the RDNA2 cards were going to be, and yet here we are, with the only cards anyone actually cares about (based on user numbers) being the ones that have all the "worthless" technology.

Pretty sure he only meant DXR was useless in THIS game...
> Why are you testing Ultra after everything you said?

For this test, Steve's looking to measure ABSOLUTE GPU HORSEPOWER, and the only way to do that is to load the GPU up with every last thing you can. This is the way ALL GPU benchmarks have always been done.
> I have to say that I'm proud of the Nvidia GTX 1080 Ti; four years later, and it can still nearly muster 60 FPS @ 1440p on Ultra settings in a completely brand-new game!

There can be no doubt that the longevity of the GTX 1080 Ti is amazing. It rivals that of the R9 Fury. Very few cards remain viable for as long as those two have.
> I posted this because I really hate how people seem to think the 1080 Ti is a bad card based off these reviews.

You won't hear that from me. I absolutely despise Nvidia as a company, but I would never say their products are bad. The GTX 1080 Ti has had an amazingly long life span. The two cards I've seen with the most longevity have been the GTX 1080 Ti and the R9 Fury; they can both still game to this day.
> Would be interesting to see a CPU benchmark test for this game as well.

I'm sure they have that in the pipeline.
> So will the 3080 play this properly with the HD Texture pack? Does anybody know?

Of course it will. Steve was using the HD Texture pack for the testing.

> So will the 3080 play this properly with the HD Texture pack? Does anybody know?

Yeah I know, but I'm not going to read the article for you.
> "In our opinion, Far Cry 6 looks better than Assassin's Creed Valhalla"

It's an AMD title, unfortunately.
Sure, but does it look better than Assassin's Creed Odyssey?
In any case, I'm VERY pleasantly surprised to see where my RX 5700 XT lands on this list. I never expected it would beat the RTX 2080, let alone in such a complex open-world AAA title like Far Cry.