Assassin's Creed Valhalla could be limited to 30fps on Xbox Series X

midian182

Why it matters: Are you looking forward to seeing Assassin’s Creed Valhalla running at 60fps in glorious true 4K on the Xbox Series X? You might want to reel in your expectations. Developer Ubisoft has confirmed that in its current state, the game will “run at least 30fps.”

One of the Xbox Series X's flagship features is supposedly the ability to run games at 4K resolution and 60fps. We've also heard of titles, such as Dirt 5, that will be able to run at 120fps, presumably at a lower resolution.

With Assassin’s Creed Valhalla, it seems Ubisoft is still working on hitting that magical 60 fps target. Speaking to Eurogamer Portugal, the company said: “Currently, we can guarantee that Assassin’s Creed Valhalla will run at least 30 FPS. Assassin’s Creed Valhalla will benefit from faster loading times, allowing players to immerse themselves in history and the world without friction. Finally Assassin’s Creed Valhalla will benefit from improved graphics made possible by the Xbox Series X, and we can’t wait to see the beautiful world we’re creating in stunning 4K resolution.”

The key word here is "currently." Ubisoft could optimize the game further so that the guaranteed framerate increases by the time the Xbox Series X arrives, but true 4K@60fps is something even the highest-end PCs struggle to achieve.

It could be that Valhalla offers a performance mode similar to those on the current-gen consoles, one which sacrifices visuals and/or resolution for higher frame rates. That would likely disappoint those expecting “the world’s most powerful consoles” to offer high FPS counts at true 4K.
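To illustrate what such a performance mode amounts to under the hood, here is a hypothetical sketch in Python; the preset names and values are invented for illustration, not taken from any actual console or game:

```python
# Hypothetical console graphics presets; names and values are illustrative only.
PRESETS = {
    "quality":     {"resolution": (3840, 2160), "target_fps": 30,
                    "shadows": "high", "draw_distance": 1.0},
    "performance": {"resolution": (2560, 1440), "target_fps": 60,
                    "shadows": "medium", "draw_distance": 0.8},
}

def apply_preset(name: str) -> dict:
    """Return the settings bundle for the chosen mode."""
    return PRESETS[name]

print(apply_preset("performance")["target_fps"])  # 60
```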

It’s not been a great few days for Assassin’s Creed Valhalla. The gameplay trailer didn’t actually show any gameplay, and we heard that the Legend of Beowulf mission would be exclusive to Season Pass holders.


 
Ahaha, how hilarious do those previous statements look now about 120 FPS and 8K and stuff? Like "oh we meant Super Mario Bros 3, that'll run super smooth!"
 
Didn't Assassin's Creed Unity have the same problem on PS4 and Xbox One, i.e. it couldn't hold 30fps and, if I remember correctly, it didn't even run at 1080p?

So using a game from this series as an indicator of the upcoming consoles' capabilities may not be the best choice.

This is what Eurogamer had to say:

Assassin's Creed Unity on console was farcical in places, even after multiple patches, often lurching down to 20fps or even lower. Curiously, it was the PlayStation 4 version that disappointed most, easily beaten during gameplay by a faster Xbox One release, though the GPU-intensive cut-scenes could see the Sony platform pull ahead of its Microsoft equivalent. Ultimately though, both consoles failed to impress: sustained sub-30fps gameplay just doesn't cut it.
 
Well if you go with Radeon then you don't get 4K60! None of their current consumer-grade gaming cards can really deliver 4K60, so why would anyone expect the console to do so?
 
It's not hard to run 4K60; I do it on a 1070 Ti. The thing is, you can't do it with everything cranked up to 11. If you're more distracted by graphics than gameplay, then it isn't a good game. We aren't talking RuneScape-level graphics. I played Jedi: Fallen Order at 4K60 and I could only tell the difference if I was up close to something and actually looking for a difference in quality.

That said, if they can't get their engine to run 4K60 on the new Xbox, it's the developer's fault, not Microsoft's hardware department's.

A little context on why I game at 4K on my hardware: I game on a 65" TV, and it only supports low-latency game mode if the input signal is 4K. It only supports 60Hz input, but I've learned of a jailbreak that allows 120Hz input, as it is a 120Hz panel. I've been afraid I might break it in the process, so I'm waiting until I get a new one to mod it.
 
This is just them covering their bases.
MS is just saying that they won't force developers to hit 60fps. Maybe it would be easier for (some) indie developers not to. Idk.

Ubi is just saying that maybe sometimes in the game the fps might drop a bit, but maybe it won't.
It's still six months to release, and there's a lot of tweaking to do. They just don't wanna make promises they'll be called out on.
 
Well if you go with Radeon then you don't get 4K60! None of their current consumer-grade gaming cards can really deliver 4K60, so why would anyone expect the console to do so?
Even if they went with Nvidia, they would have had the same performance or even less, especially if you factor in the premium cost of using Nvidia to make the GPU. Then the cost would grow even more, because only a third party could offer the CPU (AMD or Intel).

Does it make sense to use an RTX 2080 Super in a console meant for the mainstream market that sells for about $500 (± $100)?

So no, it's not that AMD can't make a 4K 60FPS GPU (RDNA 2 should be more than capable of it if we look at RDNA 1 benchmarks); it's just that the consoles can't afford to use one, from both a technical standpoint (power draw, temps, chip size, etc.) and a manufacturing cost standpoint.
 
Given the development of variable rate shading, dynamic resolution scaling, contrast adaptive sharpening, etc., there's a good chance that the output will be at 4K but the internal rendering will be a lot lower. The likes of DLSS show that there is considerable mileage to be gained from this.
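For what it's worth, the feedback loop behind dynamic resolution scaling is conceptually simple: measure frame time, nudge the internal render resolution up or down to stay within budget, and upscale to 4K for output. A minimal sketch in Python; the target, bounds, and damping factor are illustrative assumptions, not any real engine's values:

```python
# Minimal sketch of a frame-time-driven dynamic resolution loop.
# All constants are illustrative assumptions, not a real engine's values.

TARGET_FRAME_MS = 16.7            # 60fps budget
SCALE_MIN, SCALE_MAX = 0.6, 1.0   # internal render scale relative to 4K

def update_render_scale(scale: float, last_frame_ms: float) -> float:
    """Nudge the internal resolution scale toward the frame-time budget."""
    headroom = TARGET_FRAME_MS / last_frame_ms
    # GPU cost is roughly proportional to pixel count, i.e. scale squared,
    # so the "ideal" scale change is sqrt(headroom).
    ideal = scale * headroom ** 0.5
    # Only move part of the way there each frame to avoid oscillation.
    new_scale = scale + 0.25 * (ideal - scale)
    return max(SCALE_MIN, min(SCALE_MAX, new_scale))

# Example: a 22 ms frame at full scale pulls the next frame's resolution down.
scale = update_render_scale(1.0, 22.0)
print(f"next internal resolution: {int(3840 * scale)}x{int(2160 * scale)}")
```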
 
Even if they went with Nvidia, they would have had the same performance or even less, especially if you factor in the premium cost of using Nvidia to make the GPU. Then the cost would grow even more, because only a third party could offer the CPU (AMD or Intel).

Does it make sense to use an RTX 2080 Super in a console meant for the mainstream market that sells for about $500 (± $100)?

So no, it's not that AMD can't make a 4K 60FPS GPU (RDNA 2 should be more than capable of it if we look at RDNA 1 benchmarks); it's just that the consoles can't afford to use one, from both a technical standpoint (power draw, temps, chip size, etc.) and a manufacturing cost standpoint.
Looking at the number of AMD cores in the next-gen parts, it was obvious it wasn't going to run 4K60.

However, no AMD chip exists whose current cores can really run 4K. On the Nvidia side, that does exist.

As you are probably unaware, core count doesn't scale upwards easily. You can't just add more cores and get more performance, because it will exceed thermal and voltage thresholds.
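To put some rough numbers on that: dynamic power scales roughly with C·V²·f, and higher clocks usually demand higher voltage, so power climbs far faster than performance. A back-of-envelope illustration in Python; the wattage and scaling figures are made up for illustration, not measurements of any real GPU:

```python
# Back-of-envelope: dynamic power scales roughly as P ~ C * V^2 * f.
# Raising clocks usually also requires raising voltage, so power grows
# much faster than performance. Figures below are illustrative only.

base_power_w = 150.0   # hypothetical GPU at base clock and voltage
clock_gain   = 1.20    # +20% frequency
voltage_gain = 1.10    # ~+10% voltage often needed for that clock bump

new_power_w = base_power_w * clock_gain * voltage_gain ** 2
print(f"~{new_power_w:.0f} W for ~20% more performance")  # ~218 W, +45% power
```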

Personally, I do wish the consoles had Nvidia hardware. First of all, it's blatant to anyone that Nvidia's cards are far superior to AMD's parts: they are faster and more power efficient while still on 12nm, as opposed to the AMD parts already at 7nm. We would see more RTX and DLSS support in the PC games I play, and it would be more likely they could pull off a 4K60 solution with current tech. But I don't care that much; I highly doubt I'll buy one.
 
I hope that we get the option to run at 1080p60 with HDR, though. I can't easily tell whether the source is 1080p or 4K on my 65" Samsung Q90 TV from my ~3.5m sofa distance unless I switch back and forth and look really, really carefully, or walk to less than 2m from the TV and still look carefully. There is, however, a huge difference between gaming at 30 FPS and 60 FPS.
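Simple geometry backs this up: 20/20 vision resolves roughly one arcminute, and from ~3.5m a 65" panel's pixels subtend well under that. A quick check in Python, assuming the ~1 arcminute acuity rule of thumb and a ~1.43m-wide 65" 16:9 panel:

```python
import math

SCREEN_WIDTH_M = 1.43   # horizontal width of a 65" 16:9 panel
DISTANCE_M = 3.5        # sofa distance

def pixel_arcmin(horizontal_pixels: int) -> float:
    """Angle one pixel subtends at the eye, in arcminutes."""
    pixel_m = SCREEN_WIDTH_M / horizontal_pixels
    return math.degrees(math.atan(pixel_m / DISTANCE_M)) * 60

# 20/20 vision resolves roughly 1 arcminute.
print(f"4K pixel:    {pixel_arcmin(3840):.2f} arcmin")  # ~0.37, well below acuity
print(f"1080p pixel: {pixel_arcmin(1920):.2f} arcmin")  # ~0.73, also below 1.0
```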
 
ACO plays in the 40s and 50s at 4K Ultra on my overclocked 2080 Ti. With that said, with 12 TFLOPS of processing power, the XBSX should be able to do a good approximation of 4K60. It would be a waste of 12 TFLOPS for the game to run at 30fps. While people have been disappointed with what's been shown so far, what should really be pointed out is that simply moving from 30fps to 60fps in most games would actually be a generational leap.
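For reference, the 12 TFLOPS figure falls straight out of the published Series X specs: FP32 throughput is two operations per FMA, times shader count, times clock speed. A quick sanity check in Python:

```python
# Where the Xbox Series X's "12 TFLOPS" figure comes from:
# FP32 throughput = 2 ops per FMA * shader count * clock speed.
cus = 52             # active compute units
shaders = cus * 64   # 64 stream processors per RDNA CU
clock_ghz = 1.825    # locked GPU clock

tflops = 2 * shaders * clock_ghz / 1000
print(f"{tflops:.2f} TFLOPS FP32")  # ~12.15
```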
 
Did we really expect true 4K60 from the power envelope of a console?
Yes, but not from third-party open-world games. Microsoft's and Sony's own games should be able to hit it, and less demanding games (think Rocket League) should be able to, no problem (they're already the ones doing 4K60 on the Xbox One X).
 
It's not hard to run 4K60; I do it on a 1070 Ti. The thing is, you can't do it with everything cranked up to 11. If you're more distracted by graphics than gameplay, then it isn't a good game. We aren't talking RuneScape-level graphics. I played Jedi: Fallen Order at 4K60 and I could only tell the difference if I was up close to something and actually looking for a difference in quality.

That said, if they can't get their engine to run 4K60 on the new Xbox, it's the developer's fault, not Microsoft's hardware department's.

A little context on why I game at 4K on my hardware: I game on a 65" TV, and it only supports low-latency game mode if the input signal is 4K. It only supports 60Hz input, but I've learned of a jailbreak that allows 120Hz input, as it is a 120Hz panel. I've been afraid I might break it in the process, so I'm waiting until I get a new one to mod it.

I too game on a 65" 4K TV (an LG C9 OLED), but I prefer to keep the graphical settings turned up and lower the resolution to 1440p. It's a much better experience, and because my screen currently supports 120Hz at 1440p (and not yet at 4K), it's a much smoother and more responsive gaming experience.

Resolution is the last thing I notice compared to textures that look like crap or all the fancy lighting and particle effects being turned off. Frame rate is also something I notice instantly, and I find it much better when over 60 and closer to 120.

If these consoles had to focus on any one experience, 1440p at 60-120Hz with high to medium-high settings would be the most useful.

The order of importance:

1. Frame rate of at least 60 or 120

2. Graphical settings closer to high/ultra than med/low

3. Resolution

They'll end up at 4K/30/medium if left to do things the way they always do, and that's the worst of any of the choices they can make.

4K is only really needed when pixel peeping; in motion, anything above 1440p is "good enough".
 
ACO plays in the 40s and 50s at 4K Ultra on my overclocked 2080 Ti. With that said, with 12 TFLOPS of processing power, the XBSX should be able to do a good approximation of 4K60. It would be a waste of 12 TFLOPS for the game to run at 30fps. While people have been disappointed with what's been shown so far, what should really be pointed out is that simply moving from 30fps to 60fps in most games would actually be a generational leap.
That's the leap they should be focused on above all else; it should be mandated!
 
I hope that we get the option to run at 1080p60 with HDR, though. I can't easily tell whether the source is 1080p or 4K on my 65" Samsung Q90 TV from my ~3.5m sofa distance unless I switch back and forth and look really, really carefully, or walk to less than 2m from the TV and still look carefully. There is, however, a huge difference between gaming at 30 FPS and 60 FPS.
Yes, I sit only about 3 ft from a 65", and though I can tell the difference between 1080p and 4K, even that close I can't tell the difference between 1440p and 4K.

They should take the sweet-spot approach and focus on 1440p/60.
 
That said, if they can't get their engine to run 4K60 on the new Xbox, it's the developer's fault, not Microsoft's hardware department's.

Microsoft is hands-off with third-party devs; if the devs want to target all the pretties at 30fps, that's on the devs.

That said, this probably has more to do with the AnvilNext 2.0 engine than anything else. It runs like *** on PCs, and the Xbox Series X is basically a PC with better, lower-level gaming APIs.

Short of rebuilding the engine from scratch, I don't think Assassin's Creed will ever run well.
 
Yes, I sit only about 3 ft from a 65", and though I can tell the difference between 1080p and 4K, even that close I can't tell the difference between 1440p and 4K.

They should take the sweet-spot approach and focus on 1440p/60.
The problem with running at 1440p on my TV is that it won't do anything higher than 30Hz; I can drop it down to 1080p and it does 60Hz again. It's a Samsung TV, and while the picture is FANTASTIC, the software and functionality really suck. I tracked the panel down and found that it's used in a variety of other brands. I can buy a control board for it that will accept 4K120, but I'd lose HDR and full-array local dimming.

That will definitely be a project I do AFTER I already have a replacement for it.

Also, I too only sit about 3 feet away from my TV while gaming, although sometimes I get a board to put my keyboard and mouse on that I set across my armrests.

But when not gaming, I use it to have four 1080p windows open, or six 1280x1080 windows. I really hate switching tabs or having to bounce back and forth between workspaces. Having everything laid out in front of me does wonders for my workflow.

I will say I wish I had gotten a 55" instead of a 65"; easier on the neck, lol.
 
Well if you go with Radeon then you don't get 4K60! None of their current consumer-grade gaming cards can really deliver 4K60, so why would anyone expect the console to do so?

I do 4K @ 60 on a Radeon VII in many games at Ultra settings, so I don't know what you are smoking. Atm I'm playing FC5 with no problem.
 
Ahaha, how hilarious do those previous statements look now about 120 FPS and 8K and stuff? Like "oh we meant Super Mario Bros 3, that'll run super smooth!"
Those marketing claims are only eaten up by novices.

I do 4K @ 60 on a Radeon VII in many games at Ultra settings, so I don't know what you are smoking. Atm I'm playing FC5 with no problem.
Which games are those besides FC5?

Well if you go with Radeon then you don't get 4K60! None of their current consumer-grade gaming cards can really deliver 4K60, so why would anyone expect the console to do so?
The problem with this is that the console GPU isn't based on any GPU you can buy on the market yet. RDNA 2 GPUs will be out later this year, and that is what the console is using.
 
Those marketing claims are only eaten up by novices.

Which games are those besides FC5?

The problem with this is that the console GPU isn't based on any GPU you can buy on the market yet. RDNA 2 GPUs will be out later this year, and that is what the console is using.

Doom 2016 (I know, not the newest of games), Forza Horizon 4, that Diablo-like Warhammer 40K game, War Thunder, BFV. I don't own anything from 2020, but there are plenty of games that will do 60fps at 4K on my card. I will add, though, that even a 2080 Ti doesn't do 4K @ 60 in all games.
 
People always seem to overlook the level of optimisation that developers can do on consoles but can't do on PCs, especially over time. The (vanilla) PS4 runs the equivalent of a Radeon HD 7870, yet modern PS4 releases would absolutely choke on an equivalent PC, especially if you could replicate its 8-core Jaguar CPU (lower clock-for-clock performance than even Bulldozer, and it runs at 1.6GHz).

First-gen third-party games nearly always run worse because the developers are still learning the architecture. It wouldn't surprise me if Valhalla runs at 30fps, but that is unlikely to represent what either next-gen console can do with appropriate optimisation.
 
It only supports 60Hz input, but I've learned of a jailbreak that allows 120Hz input, as it is a 120Hz panel.
If it only supports 60Hz, then it must have an HDMI 2.0 port that physically CAN'T accept 4K/120Hz input. No jailbreak can change that.
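The bandwidth math bears this out: 4K/120 at 8-bit RGB needs more data than an HDMI 2.0 link can carry, even before blanking overhead. A quick check in Python, assuming standard 8b/10b TMDS encoding on the 18 Gbps link:

```python
# Why HDMI 2.0 physically can't carry 4K/120: raw pixel data alone
# exceeds its usable data rate. (Ignores blanking overhead, which
# pushes the real requirement even higher.)
width, height, refresh_hz, bits_per_pixel = 3840, 2160, 120, 24  # 8-bit RGB

required_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
hdmi20_data_gbps = 18 * 8 / 10   # 18 Gbps raw, minus 8b/10b encoding = 14.4

print(f"needed: {required_gbps:.1f} Gbps, HDMI 2.0 usable: {hdmi20_data_gbps:.1f} Gbps")
# needed: ~23.9 Gbps — only HDMI 2.1 (48 Gbps), chroma subsampling, or DSC fits
```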
 