One of the first commercial Unreal Engine 5 games, Immortals of Aveum, requires an RTX 2080 Super at minimum

Daniel Sims

Forward-looking: Tech demos and trailers for upcoming games have given players tantalizing previews of Unreal Engine 5's capabilities. So far, however, the only two released products using the engine are a Matrix-themed demo and recent Fortnite updates. EA's Immortals of Aveum looks to be one of the earliest signs of what kind of PC you'll need to experience UE5 features like Lumen and Nanite.

Immortals of Aveum, an upcoming magic-themed FPS from EA Originals, will require a GeForce RTX 2080 Super at minimum for 1080p 60fps gameplay at low or medium settings. As one of the first Unreal Engine 5 games to see release when it launches this July, its system specs make a statement about what AAA games may require going forward.

Some recent and upcoming PC versions of games that left the PlayStation 4 and Xbox One behind, including Star Wars Jedi: Survivor, The Last of Us Part 1, and Returnal, list the weaker 2070 Super under recommended specs. Unreal Engine 5 may prove more demanding than those already visually impressive games.

What's odd, however, is the noticeable gap between the 2080 Super and the GPU Aveum lists as its AMD equivalent – the RX 5700 XT. That pairing implies Nvidia cards like the 2070 Super, 3060 Ti, or GTX 1080 Ti might suffice, but Aveum developer Ascendant Studios specifically told The Verge that players will want at least a 2080 Super, a 3070, or a 5700 XT.

Video memory – recently a controversial topic due to how little of it some new Nvidia GPUs carry – doesn't explain the discrepancy, as the 2080 Super and 5700 XT each have only 8GB. Ascendant said that, for some reason, the 12GB 3060 wouldn't cut it.

The company named ray tracing as an important element of Aveum and a factor in its system requirements, but the 5700 XT has no RT cores. The game will likely include software-based ray tracing – which doesn't use RT cores – through its implementation of Lumen.

Lumen, one of the main UE5 features Epic Games has hyped, incorporates ray tracing and other techniques to simulate how light bounces around environments. Fortnite, the only released UE5 PC game so far, lets players run Lumen in software or hardware mode, the latter being more realistic at a higher performance cost.

The recommended spec for Aveum – 1440p gameplay at 60fps on medium-to-high settings – lists an RTX 3080 Ti or RX 6800 XT. What's interesting is that, unlike some recent high-end games, Aveum doesn't recommend 32GB of system RAM – you'll be okay with 16GB. However, the game does follow an ongoing trend of gargantuan storage requirements: 110GB in this case.

The minimum CPU requirements list an i7-9700 or Ryzen 7 3700X, but Ascendant said an i5-10600, 11500, 12400, or 13400 should also work. The recommended tier suggests an i7-12700 or Ryzen 7 5700X, but a 5700G or 7700 is also fine.

Unreal Engine 5's other headline element, Nanite, is also in Aveum. The feature dynamically adjusts geometric detail in objects and environments based on their distance from view, only rendering what players will be able to see.
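
Nanite's actual pipeline is considerably more sophisticated – it streams and refines a hierarchy of small triangle clusters against a screen-space error budget – but a toy sketch conveys the basic idea of view-dependent detail. Everything below (names, thresholds, triangle counts) is hypothetical for illustration, not Epic's or Ascendant's code:

```python
import math

# Toy distance-based level-of-detail (LOD) selection. This is NOT
# Nanite's real algorithm; it only illustrates the core idea of
# spending triangles where the viewer can actually see them.
# All names and numbers are made up for illustration.

LOD_MESHES = [
    # (max screen coverage fraction, triangles to render)
    (0.01,       500),   # tiny on screen -> coarse mesh
    (0.10,    20_000),
    (0.50,   250_000),
    (1.00, 2_000_000),   # fills the screen -> full detail
]

def projected_coverage(object_radius: float, distance: float, fov_deg: float = 90.0) -> float:
    """Rough fraction of the screen an object of a given radius covers."""
    if distance <= object_radius:
        return 1.0
    angular_size = 2.0 * math.atan(object_radius / distance)
    return min(1.0, angular_size / math.radians(fov_deg))

def pick_triangle_budget(object_radius: float, distance: float) -> int:
    coverage = projected_coverage(object_radius, distance)
    for max_coverage, triangles in LOD_MESHES:
        if coverage <= max_coverage:
            return triangles
    return LOD_MESHES[-1][1]

# A 1m-radius statue at increasing distances:
for d in (2.0, 20.0, 200.0):
    print(f"{d:>6.0f}m away -> {pick_triangle_budget(1.0, d):>9,} triangles")
```

The key difference is that Nanite makes this decision per small cluster of triangles rather than per whole object, so a single mesh can be coarse in the distance and dense up close at the same time.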

World Partition – UE5's technique for streaming massive environments without loading screens – is also included. It makes levels 20-30km across possible, though the developer says Aveum isn't an open-world game.

Despite how extreme Aveum's system requirements seem, they appear very close to Fortnite's "Epic" quality requirements, which likely refers to playing the game with Lumen and Nanite enabled. Although that makes for a sample size of only two games, it could suggest a system requirement range to expect from future UE5 titles.

Immortals of Aveum releases for PC, PlayStation 5, and Xbox Series consoles on July 20. Ascendant is still optimizing the game, so the system requirements could change before then.

Other UE5 games likely coming out in 2023 include Tekken 8 and the remake of Silent Hill 2, but it's unclear whether they'll use Lumen, Nanite, or World Partition.


 
Well, at least it's not some old turd like The Last of Us.
It's one thing to be demanding for a crappy-looking game; it's another to actually justify the demands.
I skipped this latest generation of GPUs. I'm glad I did...
 
Also, game devs will announce that their game will support Nvidia DLSS 4.
It's just that DLSS 4 will run only on Nvidia 5xxx cards.
DLSS is becoming a meme at this point. It's great tech and I'm glad we have it but nVidia seems to be using it as their answer to everything. Next we'll hear about how DLSS has made peace with North Korea, cured cancer and made eating tuna next to somebody in public socially acceptable.
 
So the minimum requirements here include a GPU that is more powerful than either of the current-gen consoles? Devs are going to have to dial UE5 back a bit if they want to actually sell any games.
 
And this, children, is how their company flopped: by not looking at the list of the most popular GPUs. The end.
 
DLSS is becoming a meme at this point. It's great tech and I'm glad we have it but nVidia seems to be using it as their answer to everything. Next we'll hear about how DLSS has made peace with North Korea, cured cancer and made eating tuna next to somebody in public socially acceptable.
Yes, and most importantly, Nvidia will insist that DLSS is "better" than FSR, though anybody who dares to show that DLSS is only slightly better will be punished and blackmailed.
And any reviewer who dares to show a graph of FSR delivering higher FPS on GTX 1xxx cards while Nvidia DLSS shows a 0 FPS increase will be banned by Nvidia, because showing that their own DLSS doesn't run on previous-gen Nvidia cards destroys the whole web of Nvidia lies.
"Interesting" how a lot of reviewers are hit by Nvidia with Amnesia about this fact.

Is this game sponsored or supported by Nvidia in any way?
Because the same Amnesia manifests around most of the games that "support" Nvidia DLSS 2 or 3 (soon 4 and 5), when in fact the game devs are sponsored by Nvidia to implement it.
I want to know if a game is sponsored, publicly or behind the scenes, by any corporation like Nvidia, AMD, or Intel. Cyberpunk 2077 is the noisiest example of this: Nvidia is so in bed with CP2077's devs, without officially admitting it, that CDPR can be considered an Nvidia subsidiary.

I'll prepare an interesting post about the "triple standards" of many hardware reviewers who praise certain strengths of the components they get for free to review, while blatantly disregarding the huge issues or limitations those same components have, just to suit their narrative – or rather, the producer's narrative.
The tragedy is that most of those "reviewers" are hit by this Amnesia and don't apply their own skewed "standards" when reviewing hardware from the competition.
Thus, unfortunately, many hardware reviewers have gradually become second-class PR tools for the big tech corporations instead of forming an HONEST, personal, and intelligently argued opinion. And some of them have become so delusional that they claim to like it instead of fighting against it.
 
Is this game sponsored or supported by Nvidia in any way?
Not a sponsored title, though Nvidia will probably offer some kind of support via drivers. The game will use FSR 2.0 for upscaling duties.

So the minimum requirements here include a GPU that is more powerful than either of the current-gen consoles?
The raw performance figures of the 2080 Super and 5700 XT are actually on par with the GPUs in the Xbox Series X and PS5, in terms of FP32 FMA throughput, texel and pixel fill rates, and bandwidth -- higher in some areas and lower in others, but they're not a million miles apart. To the point that I wonder whether the devs actually used these graphics cards in internal testing, or just looked at the theoretical figures, compared them to the consoles', and went "yeah, these are going to be the minimum."
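
To put rough numbers on that, peak FP32 rate is just shader count × 2 FLOPs per FMA × clock. Here's a quick Python sketch using the publicly listed peak/boost specs – and bear in mind paper TFLOPS ignore fill rates, bandwidth, and architectural differences:

```python
# Back-of-the-envelope peak FP32 throughput: shaders x 2 FLOPs/FMA x clock.
# Specs are the commonly published peak/boost figures; real-game
# performance depends on much more than this single number.

gpus = {
    "RTX 2080 Super": (3072, 1.815e9),  # CUDA cores, boost clock (Hz)
    "RX 5700 XT":     (2560, 1.905e9),  # stream processors, boost clock
    "PS5":            (2304, 2.233e9),  # 36 CUs x 64 lanes, peak clock
    "Xbox Series X":  (3328, 1.825e9),  # 52 CUs x 64 lanes, fixed clock
}

for name, (shaders, clock_hz) in gpus.items():
    tflops = shaders * 2 * clock_hz / 1e12
    print(f"{name:<16} ~{tflops:5.2f} TFLOPS FP32")
```

That lands all four within roughly 9.7-12.2 TFLOPS, which is why the two PC cards are a plausible stand-in for the consoles on paper.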
 
but, but, but....that's what DLSS is for....
DLSS doesn't reduce VRAM requirements THAT much. A lot of the demand here is purely the insane quality textures they're using.
I feel absolutely jebaited into my 3070 purchase. I will definitely be going AMD next time.
 
DLSS doesn't reduce VRAM requirements THAT much. A lot of the demand here is purely the insane quality textures they're using.
I feel absolutely jebaited into my 3070 purchase. I will definitely be going AMD next time.
DLSS reduces VRAM use in that you can lower texture quality, and the AI in it does a decent job of adding detail that isn't originally in the textures. So you can go from high to medium textures, turn on DLSS, and get an experience somewhere in the middle. DLSS is better than FSR, but you might not need to turn on DLSS at all if you had an extra 4GB of VRAM on the 4070. If the 4070 were $450, I'd consider it an instant buy. I will note that I currently have a 6700 XT that I'm using until I do a complete system rebuild this winter, and I'm very happy with it. The 6700 XT performed better than I expected based solely on benchmarks, so I hope that experience carries over when I decide to spend some real cash at the end of this year.
 
Hopefully those Bethesda folks aren't thinking of doing another graphical overhaul of Skyrim... Skyrim Ultimate Edition?
 
but, but, but....that's what DLSS is for....
You can take a look at The Last of Us as an example. It is very poorly optimized, and DLSS did not do much to solve its VRAM requirements. Thus, with or without DLSS, all 8GB cards are limited to medium quality settings even at 1080p.
 
Still rocking an 8700K, because for the most part, CPU performance hasn't been the bottleneck.
 
Yessss!!! Finally we can move on to next gen games that actually use modern hardware. I feel "next gen" games were taking forever to arrive (Cyberpunk 2077 being the exception).
 