Assassin's Creed Valhalla can't hit 60fps@4K even with the RTX 3090

They should change their name to Ubidisappointed. What a total joke. Well, I guess I won't buy the game until it's 75% off in the 2022 Steam sales, so maybe by then Hopper or RDNA3 can run it well, LOL.
 
I don’t know how optimized this particular game is. Just saying that a 3090 struggling at 4K ultra is not necessarily a bad thing.
It is a bad thing considering Nvidia is shouting 8K gaming from the rooftops with the 3090...

And there is no doubt that the game is, at best, mediocrely optimized at launch (i.e. right now). Ubi's "rich" history of badly optimized games (at least at launch) supports this 100%.
 

The article makes it seem that 4K ultra 60fps is something we should expect as a given. It’s not, regardless of what Nvidia’s ridiculous marketing says.

AC may not be optimized, but lack of optimization isn't the only possible reason for a game to struggle at 4K. I would hope most new games aren't perfectly smooth at ultra settings at 4K; otherwise it shouldn't be called ultra.
 
Then again, you still have these people who actually believe that 8K is unavoidable.

Even if it were technically possible (right now, it definitely isn't), you could still only wish your eyes were that good.
/rant
The defense people have for this is "they said that about 4K too!"

Yeah, they did say that about 4K. The thing with that is they were also *wrong* about it. 4K gave a perceptible difference from 1080p and even 1440p at the distances people were watching. This is coming from me, a guy that has a 4K TV as a computer screen and loves it. But at 6 ft away, 4K is a huge jump from 1440p. 4K to 8K would not be any sort of jump at all unless you get closer than 4 feet to a 50" screen, and at that point you are going to have a shitty time and a sore neck.

8K is a gimmick outside of 200+ inch screens. For recording, I want to see them go to 8K, 12K and beyond; the more the better, and the crisper the 4K downscale. 8K is a distraction from true HDR, 10 and 12-bit panels, ultra-low-latency panels, 500-1000 nits as the baseline, 709 color accuracy, 120Hz as the baseline... the list of actual, real improvements goes on. Hell, even 3D was less of a gimmick than 8K.
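To put rough numbers on the viewing-distance argument above, here is a quick back-of-the-envelope sketch in Python. The screen size, the distances, and the ~60 pixels-per-degree figure commonly quoted for 20/20 vision are rule-of-thumb assumptions, not measurements:

```python
import math

ACUITY_PPD = 60  # often-quoted pixels-per-degree limit for 20/20 vision

def pixels_per_degree(h_pixels, diagonal_in, distance_in, aspect=16 / 9):
    """Horizontal pixel count divided by the horizontal field of view in degrees."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # width of a 16:9 panel
    fov_deg = 2 * math.degrees(math.atan((width_in / 2) / distance_in))
    return h_pixels / fov_deg

# 50" screen viewed from 6 ft and 4 ft, at 1440p, 4K and 8K
for distance_ft in (6, 4):
    for name, h_px in (("1440p", 2560), ("4K", 3840), ("8K", 7680)):
        ppd = pixels_per_degree(h_px, diagonal_in=50, distance_in=distance_ft * 12)
        note = "beyond 20/20 acuity" if ppd > ACUITY_PPD else "resolvable"
        print(f'50" at {distance_ft} ft, {name}: {ppd:5.1f} ppd ({note})')
```

The exact acuity cutoff is debatable, but the ratios are the point: stepping from 4K to 8K pushes far past what the rule of thumb says the eye can resolve at normal couch distances.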
 
Nvidia can **** off with the 8K bandwagon. There are zero people who will benefit from 8K, due to pixel density and distance from screens. This is why 4K on phones is stupid and not really done outside of gimmicks.
 
Just play it @ 1080P and enjoy the smoothness.

Problem is, there's nothing new in this over-beaten series anymore, other than new locales and some minor additions. The basic mechanics remain the same.

Ubisoft should have released every "new game" after Assassin's Creed Origins as DLCs.

Anyway, I wish the series focused only on the past. Make a fresh story set entirely in the ancestor's life, rather than switching back and forth to the modern world and the Animus nonsense. It breaks the immersion for me.
 
If ray tracing is disabled and one is using low or medium settings on a 1920x1080 monitor, is it possible to get at least 30 fps? If so, the game is still playable, so I can see why the company isn't viewing this as a total disaster that needs to be fixed before launch, even if its showing is poor compared to other games.
 
Why don't they just use an efficient game engine like Unreal Engine 4 or CryEngine and focus on making games? Making 3D engines isn't their thing; their AnvilNext 2.0 engine is a resource hog and poorly optimized. Anyone with any system monitoring software can see how hard their games hit the CPU and RAM and how many cores they tie up... and for what? Their games don't even look that good! Making a 3D engine is a huge drain on resources that could be better spent just focusing on game design.
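If you want to eyeball that yourself without a full monitoring suite, here is a minimal sketch using Python's third-party psutil package. The process name is an assumption; substitute whatever Task Manager shows for your install:

```python
import time
import psutil  # third-party: pip install psutil

TARGET = "ACValhalla.exe"  # assumed process name; check Task Manager for yours

def find_game():
    """Return the first running process whose name matches TARGET, else None."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == TARGET:
            return proc
    return None

game = find_game()
if game is None:
    print(f"{TARGET} is not running")
else:
    cores = psutil.cpu_count(logical=True)
    game.cpu_percent(None)  # prime the counter; the first call always returns 0.0
    for _ in range(60):     # sample once a second for a minute
        time.sleep(1)
        cpu = game.cpu_percent(None)             # % of one core, so it can exceed 100
        ram_gb = game.memory_info().rss / 2**30  # resident set size in GiB
        print(f"CPU {cpu:6.1f}% (of {cores} logical cores) | RAM {ram_gb:5.2f} GiB")
```

It won't show anything Task Manager or HWiNFO doesn't, but it makes the per-process numbers easy to log over a play session.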
 
This news is outdated; it's been shown that the sample video this data is taken from was underperforming. TweakTown found this data incorrect: the 3090 is the ONLY card capable of running 60 fps @ 4K, and even if barely so, it does do so.
 
Ahhh the master race.

This problem has probably been going on since Tetris in the '90s. We are in 2020 and it will never end; you can have an RTX 9999 and it still won't be enough. Just get a PS5.
 
How is a PS5 going to solve anything? It's simply an upper-mid tier PC that can do nothing but game. It's only going to get 60 fps by having unconfigurable, lower settings than the PC version.
 
This is the worst performance/optimization I have ever seen in a game; even at medium settings with a 1070 Ti @ 1080p and an i5 9400, I get massive frame drops constantly. I get no such issues in any other game.
 
The mobo looks like it will be a beast. Don't forget to use 4 sticks of DDR4 3600MHz RAM with it :) (or some nice 3200MHz CL14 sticks).
I finally got the Asus Dark Hero and a G.Skill 64 GB (4 x 16 GB) DDR4 kit along with it. Sadly, the only thing I cannot get is a new GPU, so I have to survive with my trusty GTX 1080 Ti for now.
 