Far Cry 6 Benchmarked

The only time I noticed a difference with RT reflections on, I crouched down to look at my ride's hub-cap/wheel cover and saw my smiling face looking back at me. Turned off RT and no reflection.
 

Ben1978

Posts: 134   +101
Why are you testing ultra after everything you said?
Because the audience expects it. I'm gonna try to use high settings by default from now on. Both DF and HU have said ultra settings are a complete waste of time, so I'll try to pretend they don't exist. I should also step down from 4K to 1440p, but that's gonna be harder cos 4K is noticeably shinier. Having said that, in Deathloop I've been playing very happily with 4K adaptive performance, where resolution can drop by as much as 50% without me realizing. Now that's a setting every game should have.
 

Ben1978

Posts: 134   +101
Steve says a lot of things that are pretty contradictory. Like how, in the past, every time he wanted to "test" ray tracing he almost always went with Shadow of the Tomb Raider, yet here he says DXR shadows are basically pointless, and that's all Tomb Raider uses.

Meanwhile he could have used a game like Control that ACTUALLY shows the difference something like DXR can make, but no, just like here he's pretty adamant not to show how much better Nvidia is at DXR (even if he admits it, defeatedly, when explaining why he doesn't care).

I heard over and over from him and others (MLID) how much better the RDNA2 cards were gonna be, and yet here we are, with the only cards anyone actually cares about (based on user numbers) being the ones that have all the "worthless" technology.
His view on ray tracing seems to be that at first he thought it was a waste of resources, but in newer games like Control and Watch Dogs: Legion it's starting to show decent results, especially combined with DLSS 2.0. But I'd hazard that he still thinks we're another GPU generation or two away from it being worth the performance hit.
 

hahahanoobs

Posts: 3,840   +1,908
His view on ray tracing seems to be that at first he thought it was a waste of resources, but in newer games like Control and Watch Dogs: Legion it's starting to show decent results, especially combined with DLSS 2.0. But I'd hazard that he still thinks we're another GPU generation or two away from it being worth the performance hit.
It's very confusing for someone to think that.

It seems the recent "is ultra quality worth it" videos are exposing gamers who put image quality over smooth frame rates, so the objection to using RT has more to do with just not being happy seeing how many frames are lost. Look at MS Flight Sim. How many fps are people getting WITHOUT RT? Sub-60fps, right? Yet it gets everyone's recommendation. But if a game is sub-60fps with RT, it's a waste.

Not having RT results comes down to bias and fairness: if AMD can't compete, then no one gets the ball. With that being said, TechSpot/HWU tests RT enough to satisfy me anyway.
 

Lew Zealand

Posts: 1,973   +2,305
TechSpot Elite
Like everything else, RT simply comes down to cost vs. benefit.

In some games the benefit is high (Control, CP2077) so it's worth the cost in FPS for many people with a GPU that can handle it (higher end Nvidia, thank you DLSS) and maybe not with others (low end Nvidia, all AMD).

In other games, the difference is so minimal (FC6 here, SotTR, lots of others) that any FPS lost isn't worth it, even with DLSS where available.

I think the issue for many reviewers right now is that there are very few games where RT makes a real, appreciable difference, enough to make the compromise in FPS unquestionably worth it. There are some, for some people, but really not enough. And also not enough at game release time, when it really matters, instead of showing up in a patch months afterwards (SotTR, for example).

Another example: I prefer to play Minecraft with a fully path-traced shader pack (think: the whole world is ray-traced) and a hi-res, heavy, realistic texture pack, which brings my FPS from 500 down to 30-40, because the improvement in visuals is worth it IMO. But not everyone would think that's a good choice. Works for me though!
 

Rocky4040

Posts: 22   +28
Avatar is going to use Snowdrop, and I can't wait to see/play it!

But on topic, I don't know what game people are playing, but the visuals in this game are amazing.
I have to agree with you. I don't have the game yet, but from the pictures posted here it looks pretty good, and not "last gen," whatever that means. I think a lot of these people read crap on whatever sites, and if the guy is ticked off at the company he'll write a bad review, and then people like the one here just respew that site's garbage across the interwebs. If any of what I said made any sense, that is. It's late here lol
 

Tantor

Posts: 212   +401
"Even Vega does exceptionally well here, with Vega 56 matching the GTX 1080 Ti... a bit of a "what the..." moment when I saw those results."

Yep, Vega is pretty good, given decent optimization. My Radeon VII has been the best purchase I made in a long time.

By the way, how come you didn't include Radeon VII in your lineup?
 

Dimitrios

Posts: 943   +754
What's crazy is that my RX 580 4GB with 1-2 year old drivers actually runs this game well! When I get home later I will upload my benchmarks, if this site allows it. I didn't play too long last night, but I was pushing 50-60 FPS.

Software Version
2020.0525.1419.25779

Driver Version
20.10.35.02-200821a-360470C-ATI
 

Peter Farkas

Posts: 582   +441
Why are you testing ultra after everything you said?
You clearly haven’t listened to everything they said in that video about Ultra Quality…
They specifically said why they will keep on testing with Ultra settings. Go back and watch the full video before asking silly questions!
 

Loli Pop Carbon

Posts: 23   +11
I have to say that I'm proud of the Nvidia GTX 1080 Ti; 4 years later and it can still nearly muster 60 FPS @ 1440p on Ultra settings in a brand-new game!

Nvidia needs to release driver optimizations for Pascal, including the 1080 Ti. The 1080 Ti should sit between the 2070 Super and the 2080; this is waaaay below its maximum capability. I'm gonna be so mad if they abandon Pascal, especially in this tough time for upgrading.
 

Lounds

Posts: 951   +857
It kind of makes sense now why AMD marketed the 6600 XT as a 1080p max settings card. Clearly it doesn't have enough bandwidth to cope with higher resolutions. The fact that the 5700 XT could be had in 2019 for a similar price and is so much better in modern titles proves that AMD has overpriced the 6600 XT due to the current situation. I do wonder if it would have been a sub-£300/$300 card in normal market conditions. Disappointing ray tracing results for the 6600 XT as well.

I'm glad I went back to Nvidia after many years of AMD/Radeon ownership and got the 3060Ti, my last Nvidia card was the GTX 460 1GB.
 

Ludak021

Posts: 571   +429

"I’d say overall performance looks quite neutral, at least at 1080p on the high-end segment."

<me, looking at the cpu bottleneck at the top segment, then at 5600XT being a lot faster than 1080Ti, then quit reading further>
 

Diwiak

Posts: 9   +0
Hi,

How do you test all the cards with HD textures? The FC6 installer says the HD texture pack is available only for graphics cards with 11+ GB of VRAM.

Thanks for the answer
 

Roboyt0

Posts: 11   +15
Do a search - when TechSpot switched to this benchmark rig, they gave the full specs... there were many who argued they should have stayed with Intel, as the majority of gamers used (and still use) Intel for gaming... There is no arguing that the 5950X is superior to any Intel rig... but... not for gaming...

https://www.techspot.com/review/2131-amd-ryzen-5950x/

11 game average at 1080P and the 5950X is faster than everything Intel had to offer at the time, Nov. 2020, of that review; albeit marginally.

https://www.techspot.com/review/2260-amd-ryzen-5800x-vs-core-i7-11700k/

5800X vs 11700K. 5800X faster on average of 32 games...once again marginally at 4% overall. This was more recent in May 2021. With the 5800X only being slower, by a maximum of 2%, in a total of 4 games. The other 28 are either a dead tie, or lean, sometimes heavily, in AMD's favor.

Looks like if you take your own advice you would see that AMD is extremely competitive, if not edging out a win for gaming at this point... They are also doing so while consuming less power, sometimes considerably so, and also being dominant in multi-threaded workloads.
 

Squid Surprise

Posts: 4,425   +3,744
https://www.techspot.com/review/2131-amd-ryzen-5950x/

11 game average at 1080P and the 5950X is faster than everything Intel had to offer at the time, Nov. 2020, of that review; albeit marginally.

https://www.techspot.com/review/2260-amd-ryzen-5800x-vs-core-i7-11700k/

5800X vs 11700K. 5800X faster on average of 32 games...once again marginally at 4% overall. This was more recent in May 2021. With the 5800X only being slower, by a maximum of 2%, in a total of 4 games. The other 28 are either a dead tie, or lean, sometimes heavily, in AMD's favor.

Looks like if you take your own advice you would see that AMD is extremely competitive, if not edging out a win for gaming at this point... They are also doing so while consuming less power, sometimes considerably so, and also being dominant in multi-threaded workloads.
Yes... that was when the 5950X was reviewed... but when they were setting up their benchmark PC, the Intel 11900 was available (it released a couple of months after that review) and took the gaming lead back...

No one is arguing that the 5950 isn't a VASTLY better CPU... just not really meant for a gaming rig...
 

Sausagemeat

Posts: 1,037   +863
What's unfortunate about that?
It means the ray tracing was gimped. Radeons are so far behind at ray tracing that AMD has clearly asked developers to pare it back in the games they sponsor. On AMD-sponsored titles we get gimmicky, low-res RT features. On Nvidia-sponsored titles we get fully ray-traced worlds that have a significant positive impact on the visuals of the game.

AMD is literally holding back the progress of 3D gaming.
 

Ben1978

Posts: 134   +101
The 3080 has 10GB of VRAM; Far Cry 6 specifically states it needs 11GB.
Ah yes. I wasn't aware of this when I made my post and was going on Steve's assertion, made I dunno six months or a year ago, that 10GB would probably be fine for years to come. A bit annoying, as I got the original 3080. But then I should count myself lucky to have any 3080.
 

Yenega

Posts: 302   +202
"In our opinion, Far Cry 6 looks better than Assassin’s Creed Valhalla"
Sure, but does it look better than Assassin's Creed Odyssey? :laughing:

In any case, I'm VERY pleasantly surprised to see where my RX 5700 XT lands on this list. I didn't expect that it would ever beat the RTX 2080, let alone in such a complex open-world AAA title as Far Cry. :D

The 5700 XT does not beat the RTX 2080, not in this test and not in other tests of this game.

The 5700 XT performs pretty much like a 6600 XT.