The only time I noticed a difference with RT reflections on, I crouched down to look at my ride's hub cap/wheel cover and saw my smiling face looking back at me. Turned off RT and no reflection.
Because the audience expects it. I'm gonna try and use High settings by default from now on. Both DF and HU have said Ultra is a complete waste of time, so I'll try and pretend it doesn't exist. I should also step down from 4K to 1440p, but that's gonna be harder cos 4K is noticeably shinier. Having said that, in Deathloop I've been playing very happily with 4K adaptive performance, where resolution could be dropping by as much as 50% without me realizing. Now that's a setting every game should have.
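For anyone curious how that kind of adaptive mode works under the hood, here's a minimal, hypothetical sketch of a frame-time-driven resolution scaler. The names and thresholds are made up for illustration; this is not Deathloop's actual code, just the general idea:

```cpp
#include <algorithm>

// Hypothetical sketch of a frame-time-driven dynamic resolution scaler,
// the rough idea behind an "adaptive performance" mode with a 50% floor.
// Names and thresholds are invented; not any shipping game's real code.
struct DynamicResScaler {
    float scale    = 1.0f;   // 1.0 = native resolution (e.g. 4K)
    float minScale = 0.5f;   // never drop below 50% of native
    float targetMs = 16.6f;  // frame budget for 60 FPS

    void update(float lastFrameMs) {
        if (lastFrameMs > targetMs)
            scale -= 0.02f;  // over budget: render fewer pixels
        else if (lastFrameMs < targetMs * 0.9f)
            scale += 0.01f;  // comfortably under budget: claw quality back
        scale = std::clamp(scale, minScale, 1.0f);
    }

    // Internal render width; the image is upscaled back to native output.
    int renderWidth(int nativeWidth) const {
        return static_cast<int>(nativeWidth * scale);
    }
};
```

Because the adjustment happens a little each frame and the output is always upscaled back to native, the drops are gradual enough that you usually don't notice them, which matches the experience described above.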
His view on ray tracing seems to be that at first he thought it was a waste of resources, but in newer games like Control and Watch Dogs: Legion it's starting to show decent results, especially combined with DLSS 2.0. But I'd hazard to say he still thinks we're another GPU generation or two away from it being worth the performance hit.

Steve says a lot of things that are pretty contradictory, like how in the past, every time he wanted to "test" ray tracing he almost always went with Shadow of the Tomb Raider, yet here he says DXR shadows are basically pointless, and that's all Tomb Raider uses.
Meanwhile he could have used a game like Control that ACTUALLY shows the difference something like DXR can make, but no, just like here he's pretty adamant not to show how much better Nvidia is at DXR (even if he admits it defeatedly when explaining why he doesn't care).
I heard over and over how much better the RDNA 2 cards were gonna be from him and others (MLID), and yet here we are, with the only cards anyone actually cares about (based on user numbers) being the ones that have all the "worthless" technology.
It's very confusing for someone to think that.
I have to agree with you. I don't have the game yet, but from the pictures posted here it looks pretty good and not "last gen," whatever that means. I think a lot of these people read crap on whatever sites, and if the guy is ticked off at the company he'll write a bad review, and then people like the one here just re-spew that site's garbage across the interwebs... if any of what I said made any sense, that is. It's late here lol

Avatar is going to use Snowdrop, and I can't wait to see/play it!
But on topic, I don't know what game people are playing, but the visuals in this game are amazing.
You clearly haven’t listened to everything they said in that video about Ultra Quality…
I have to say that I'm proud of the Nvidia GTX 1080 Ti; 4 years later and it can still nearly muster 60 FPS @ 1440p on Ultra settings in a brand-new game!
Do a search - when TechSpot switched to this benchmark rig, they gave the full specs... there were many who argued they should have stayed with Intel, as the majority of gamers used (and still use) Intel for gaming... There is no arguing that the 5950X is superior to any Intel rig... but... not for gaming...
Yes... that was when the 5950X was reviewed... but when they were setting up their benchmark PC, the Intel 11900K was available (it released a couple of months after that review) and took the gaming lead back: https://www.techspot.com/review/2131-amd-ryzen-5950x/
In the 11-game average at 1080p, the 5950X is faster than everything Intel had to offer at the time of that review (Nov. 2020), albeit marginally.
5800X vs 11700K: the 5800X is faster on average across 32 games... once again marginally, at 4% overall. That comparison is more recent, from May 2021, with the 5800X being slower, by a maximum of 2%, in only 4 games. The other 28 are either a dead tie or lean, sometimes heavily, in AMD's favor.
Looks like if you took your own advice, you'd see that AMD is extremely competitive, if not edging out a win, for gaming at this point... They're also doing so while consuming less power, sometimes considerably so, and being dominant in multi-threaded workloads.
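For what it's worth, those "n-game average" leads are usually an aggregate of per-game ratios. A minimal sketch of how a figure like "4% overall" could be computed as a geometric mean (the ratios below are made up for illustration, not TechSpot's actual per-game data):

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Sketch of how an "n-game average" lead is often computed: take the
// per-game FPS ratio (e.g. 5800X / 11700K) for each title and aggregate
// with a geometric mean. These ratios are invented for illustration.
int main() {
    std::vector<double> ratios = {1.07, 1.00, 0.98, 1.12, 1.04, 1.03};
    double logSum = 0.0;
    for (double r : ratios)
        logSum += std::log(r);
    double geoMean = std::exp(logSum / ratios.size());
    std::printf("average lead: %.1f%%\n", (geoMean - 1.0) * 100.0);
}
```

A geometric mean keeps one outlier title from dragging the whole average, which is why a 4% overall lead can coexist with individual games that tie or lean heavily either way.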
It means the ray tracing was gimped. Radeons are so far behind at ray tracing that they have clearly asked developers to pare it back in the games they sponsor. On AMD-sponsored titles we get gimmicky, low-res RT features. On Nvidia-sponsored titles we get fully ray-traced worlds that have a significant positive impact on the visuals of the game. What's unfortunate about that?
Ah yes. I wasn't aware of this when I made my post and was going on Steve's assertion, made, I dunno, six months or a year ago, that 10 GB would probably be fine for years to come. Bit annoying, as I got the original 3080 - which has 10 GB of VRAM, while Far Cry 6 specifically states it needs 11 GB. But then I should count myself lucky to have any 3080 at all.
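If you're wondering how a game can even tell, a title can query the GPU's local memory budget before enabling something like an HD texture pack. Here's a hypothetical sketch using the standard Windows DXGI API; the function name and the check itself are assumptions for illustration, not Far Cry 6's actual logic:

```cpp
#include <dxgi1_4.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

// Hedged sketch: how a PC game *could* decide whether to enable an HD
// texture pack by querying the GPU's local memory budget via DXGI.
// HasVramFor is a made-up helper; the DXGI calls themselves are real.
bool HasVramFor(UINT64 requiredBytes) {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return false;

    IDXGIAdapter1* adapter = nullptr;
    bool ok = false;
    if (SUCCEEDED(factory->EnumAdapters1(0, &adapter))) {
        IDXGIAdapter3* adapter3 = nullptr;
        if (SUCCEEDED(adapter->QueryInterface(IID_PPV_ARGS(&adapter3)))) {
            DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
            if (SUCCEEDED(adapter3->QueryVideoMemoryInfo(
                    0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info)))
                ok = info.Budget >= requiredBytes;  // e.g. 11 GB for HD textures
            adapter3->Release();
        }
        adapter->Release();
    }
    factory->Release();
    return ok;
}
```

Something like HasVramFor(11ull << 30) at startup would come up short on a 10 GB 3080, which is presumably why the stated requirement stings for original 3080 owners.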
"In our opinion, Far Cry 6 looks better than Assassin’s Creed Valhalla"
Sure, but does it look better than Assassin's Creed Odyssey?
In any case, I'm VERY pleasantly surprised to see where my RX 5700 XT lands on this list. I didn't expect that it would ever beat the RTX 2080, let alone in a complex open-world AAA title like Far Cry.