Even the RTX 2080 Ti struggles with Watch Dogs: Legion at 60fps in 1080p

midian182

WTF?! In addition to launching on the current crop of consoles, Watch Dogs: Legion will be one of the first titles on next-gen machines. As such, maxing out its graphics on the PC might take an absolute beast of a rig. Right now, the game is so demanding at its highest settings that even the mighty RTX 2080 Ti can't maintain 60 fps at just 1080p.

Ubisoft showed off some more gameplay footage of Watch Dogs: Legion at its Forward event yesterday, the same day that Digital Foundry uploaded a video of an early preview build on PC. The game is locked at 30 fps in the build, likely to match the console versions, with all the graphics settings at Ultra, including ray-traced reflections. Even though the test machine was running an RTX 2080 Ti, it was unable to manage a steady 60 fps at 1080p when the framerate was unlocked.

There are a few important things to note here: this is a pre-release build of the game, so there’ll be plenty of optimizations before launch that will improve frame rates; and setting everything to Ultra in a graphically demanding game, especially the ray tracing option, is always going to push a PC to its limits.

Digital Foundry never had the chance to test DLSS 2.0, which renders a game at a lower-than-native resolution and then upscales it back to native res using AI and deep learning. Its use could mean the final version of Legion will be able to manage 60 fps at 4K (upscaled from 1080p) even with ray tracing enabled.
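As a rough illustration of where that headroom comes from (back-of-the-envelope numbers of our own, not anything from Digital Foundry or Nvidia): shading cost scales roughly with the internal pixel count, so a 1080p internal render touches about a quarter of the pixels of native 4K.

    # Illustrative arithmetic only -- the pixel budget that DLSS-style
    # upscaling exploits, not the upscaler itself.
    def pixels(width, height):
        return width * height

    native_4k = pixels(3840, 2160)   # 8,294,400 pixels shaded per frame
    internal  = pixels(1920, 1080)   # 2,073,600 pixels shaded per frame

    print(native_4k / internal)      # 4.0 -- roughly 4x fewer pixels to shade

The AI reconstruction then has to fill in the detail that was never rendered, which is why the quality of the upscaler matters as much as the raw saving.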

Hopefully, this won’t be bad news for those with PCs that cost under $1,500, and the final game will be better optimized.


 
So would an NVMe SSD, PCIe 4.0/5.0, DDR5 RAM, or a second-gen RTX (30xx) GPU make it run better, or a new platform like X670, LGA 1700 with Intel 11th/12th gen, or AMD with newer PCIe revisions? Or are games just hard to optimize? Future computers and consoles will run it at over 15 fps. 60 fps is not low, but aiming at 120-400 fps is mostly not realistic yet. If the GPU doesn't have enough power, wouldn't it be up to the RAM and motherboard chipset to optimize games? We don't need to enable it anyway; if you run a GTX 10xx or 900 series, or maybe lower, you don't need to enable it all the time, and an RTX 20xx Ti is too expensive.
 
I think this says more about the useless, outdated graphics engines Ubisoft uses than about the 2080 Ti...
Dunia and AnvilNext are somewhat dated now, as they're built on older engines, but Snowdrop and Disrupt are both new and mostly built afresh for more recent hardware and APIs. Not sure what Legion is using, but it's probably Disrupt (which Watch Dogs 2 used). Besides, the game isn't due for release until the end of October, so there's still time to amend the situation.
 
Dunia and AnvilNext are somewhat dated now, as they're built on older engines, but Snowdrop and Disrupt are both new and mostly built afresh for more recent hardware and APIs. Not sure what Legion is using, but it's probably Disrupt (which Watch Dogs 2 used). Besides, the game isn't due for release until the end of October, so there's still time to amend the situation.

Ubisoft's graphics engines are so poorly multithreaded on the CPU side, relying on one or two highly clocked cores, that even the most powerful of GPUs are quickly bottlenecked.
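A toy model of that bottleneck (our own simplification with made-up numbers, not anything measured from a Ubisoft engine): the frame takes as long as the slower of the two sides, so a fast GPU just ends up waiting.

    # Simplified model: frame time is gated by whichever side finishes last.
    def frame_time_ms(cpu_submit_ms, gpu_render_ms):
        return max(cpu_submit_ms, gpu_render_ms)

    # Hypothetical: single-threaded draw-call submission costs 20 ms while
    # the GPU could render the frame in 10 ms. The GPU idles half the time
    # and the game is capped at 50 fps no matter which card you buy.
    print(1000 / frame_time_ms(20, 10))   # 50.0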
 
Watch Dogs 2 was extremely demanding as well, but I thought it looked very nice. It was so demanding, though, that you really HAD to run with the temporal filtering (image reconstruction) on. With that enabled it still looked pretty great, I must say, as long as you were going for at least 1440p final output. I was really impressed with the result.

DLSS 2.0 is just a more sophisticated version of the temporal filtering they were playing around with a few years ago for upscaling.

You need to get used to this. In order for next-gen consoles to deliver this improvement in graphics, they are almost certainly going to rely heavily on upscaling via various methods to steal pixels and frames here and there.

I'm sure a certain elite will be unhappy and demand their native pixels, and fair enough. However, I think it'll be just another tool in the box for people buying into midrange hardware, one that can boost their frame rates and overall visual fidelity.

If it is done well, it'll hand 4K to the masses on midrange hardware. Or at least image quality pretty damn close to it.

Work smarter not harder. With the advances now available for upscaling you can avoid having to simply brute force your way up the visual ladder to a certain extent.
 
I wonder how much performance would improve by just disabling ray tracing and leaving the rest on Ultra.

My guess is that even on an RTX card, it adds massive overhead.
 
Crappy games, skins and microtransactions... Ubisoft's crap at its best. I was one of their biggest fans until they destroyed Rainbow Six. No more for me, thanks. I had every Far Cry [2-5] and played a ton of Splinter Cell. New games are just unoptimized woke garbage. Not for me, I guess...
 
This game is only a slight graphics update over Watch Dogs 2, even going by the trailer shown at Ubisoft Forward and the Digital Foundry video; it's just painful.

Character movement is odd, animations (including facial) are stiff and unnatural, models are just average, cars look last-gen, and everything else looks very much like the previous game, so where exactly is all this graphics horsepower going?

PS: Enemy AI in the open world is still last-gen, e.g. cops get out of their car and stand in the middle of the road shooting until you kill them. Take cover behind your car, for fk's sake. In the bridge scene the guards are also happy to just stand there getting shot at. Some scenes did include taking cover, but those were part of a story sequence.

PPS: New drinking game: watch the Digital Foundry video and take a shot every time he says "ray tracing." Let me know if you survive the alcohol poisoning.
 
So would an NVMe SSD, PCIe 4.0/5.0, DDR5 RAM, or a second-gen RTX (30xx) GPU make it run better, or a new platform like X670, LGA 1700 with Intel 11th/12th gen, or AMD with newer PCIe revisions? Or are games just hard to optimize? Future computers and consoles will run it at over 15 fps. 60 fps is not low, but aiming at 120-400 fps is mostly not realistic yet. If the GPU doesn't have enough power, wouldn't it be up to the RAM and motherboard chipset to optimize games? We don't need to enable it anyway; if you run a GTX 10xx or 900 series, or maybe lower, you don't need to enable it all the time, and an RTX 20xx Ti is too expensive.
The PCIe version has nothing to do with how performant a GPU is...
I can bet you whatever you want that an RTX 3080 Ti, which will come with PCIe 4.0 support, will get better framerates on a Z490 board with PCIe 3.0 only and a 10900K than on X570 with PCIe 4.0 and whatever Zen 2 CPU you want.
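Some rough numbers behind that bet (all assumed, not measured): the data a game pushes across the bus each frame is tiny next to the frame budget, so doubling PCIe bandwidth barely registers.

    # Back-of-the-envelope: time spent on per-frame bus traffic.
    PCIE3_MB_PER_MS = 16.4   # ~16 GB/s, PCIe 3.0 x16
    PCIE4_MB_PER_MS = 32.8   # ~32 GB/s, PCIe 4.0 x16
    traffic_mb = 10.0        # assumed per-frame streaming traffic

    print(traffic_mb / PCIE3_MB_PER_MS)  # ~0.61 ms
    print(traffic_mb / PCIE4_MB_PER_MS)  # ~0.30 ms
    # Both are noise next to a 16.7 ms (60 fps) frame budget; the CPU and
    # GPU set the framerate, not the bus revision.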
 
The amazing new PS5 with its incredible "game-changing" SSD speeds will surely run this title at 60 fps. /s
WTF did you all smoke? How exactly is the SSD supposed to improve the speed at which the GPU can process data?
The SSD in the PS5 will allow for seamless transitions between very complex scenes, BUT that doesn't mean it helps the GPU in any way with rendering frames.
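A minimal sketch of that separation (an assumed model in Python, not actual PS5 code): streaming runs alongside the render loop, so a faster SSD makes scenery resident sooner but each frame still costs the same to draw.

    import threading, queue, time

    assets = queue.Queue()

    def stream_assets(read_mb_per_s):
        # SSD speed only changes how quickly new chunks become resident.
        for chunk in range(3):
            time.sleep(64 / read_mb_per_s)        # read a 64 MB chunk
            assets.put(f"chunk-{chunk}")

    threading.Thread(target=stream_assets, args=(5500,), daemon=True).start()

    for frame in range(5):
        time.sleep(0.016)                         # render cost: fixed ~16 ms
        print(f"frame {frame}: {assets.qsize()} chunks resident")
    # Doubling the SSD speed halves the wait for chunks, not the 16 ms render.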
 
And how fast will it run with RT disabled? I'll bet 60 fps at 1440p with Ultra settings. RTX 3xxx is supposed to be something like a 400% improvement in RT performance, and leaks indicate only a 10% hit when it's enabled. An RTX 3060 will do better than a 2080 Ti with RT enabled.
 
Wait, when Crysis launched and an SLI 8800 GTX setup couldn't handle it maxed out at 1920x1200, everyone went "awesome, built for the future," but now it's a problem?
 
Wait, when Crysis launched and an SLI 8800 GTX setup couldn't handle it maxed out at 1920x1200, everyone went "awesome, built for the future," but now it's a problem?
Crysis was a game changer at the time. Nothing in this trailer is. Ubisoft is just re-issuing the same dusty old franchises, with added payments as a "bonus", with "woke" themes (Kassandra in AC), with poor optimization and money-grubbing schemes.
Did Crysis have a new theme? Check.
Did Crysis have microtransactions? Nope.
Did Crysis have "pay for skins"? Nope.
Did Crysis have "half a game for 80 USD"? Nope.
Did Crysis have "5 editions with useless junk"? Nope.
There's no comparison between the two.
 
WTF did you all smoke? How exactly is the SSD supposed to improve the speed at which the GPU can process data?
The SSD in the PS5 will allow for seamless transitions between very complex scenes, BUT that doesn't mean it helps the GPU in any way with rendering frames.
"/s" ?
 