Unreal Engine 5's new Lumen and Nanite systems bring near-photorealistic environments...

Cal Jeffrey

Something to look forward to: Microsoft has given us a few early glimpses of games running on the Xbox Series X over the last couple of weeks, but we have not seen much for the PlayStation 5 yet. However, today Epic demonstrated an in-house tech-demo game to showcase Unreal Engine 5 running on the PS5, and it looks pretty stunning.

On Wednesday, Epic Games unveiled Unreal Engine 5, which it says will unleash the power of next-generation machines. To drive the point home, the developers provided a "first look" of the engine running on a PlayStation 5 (below).

Epic's tech demo is called "Lumen in the Land of Nanite," named after two new technologies coming to Unreal Engine — Nanite and Lumen. The Tomb Raideresque real-time game demo highlights how the latest core technologies can bring cinema-level CG to console games with near photorealistic graphical fidelity.

Nanite is a "virtualized micropolygon geometry" system that Epic claims renders triangles at roughly the size of a pixel. The technology allows developers to use assets in Unreal Engine containing millions or even billions of polygons without performance hits. Epic points to a statue within the demo that comprises 33 million triangles.

"No baking of normal maps, no authored LODs," the engineers explain.

They then go on to show a room containing 500 of the same statue, each at the same level of detail. That makes for around 16.5 billion triangles in the scene, not including the room geometry — all this, with no discernible lag or frame-rate hits running on a PS5.
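For what it's worth, the arithmetic behind that figure checks out. A trivial sketch, using only the numbers quoted above:

```python
# Checking the quoted scene figures: 500 statues at 33 million triangles each.
triangles_per_statue = 33_000_000
statue_count = 500

total_triangles = triangles_per_statue * statue_count
print(f"{total_triangles:,}")  # 16,500,000,000 -> the ~16.5 billion quoted
```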

Furthermore, developers can easily import these high-poly assets (up to 8K) from whatever tools they use.

"Nanite virtualized geometry means that film-quality source art comprising hundreds of millions or billions of polygons can be imported directly into Unreal Engine—anything from ZBrush sculpts to photogrammetry scans to CAD data—and it just works," said Epic in its press release.

Once the models are imported, Unreal handles all of the streaming and scaling in real-time, so devs don't need to worry about polygon counts, draw counts, or polygon memory budgets.
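The "scaling" half of that claim can be illustrated with a toy model. This is not Nanite's actual algorithm, just a sketch of the one-triangle-per-pixel principle: the rendered detail is capped by the pixels an asset covers on screen, no matter how heavy the source mesh is.

```python
def target_triangles(pixels_covered: int, source_triangles: int) -> int:
    """Toy 'one triangle per pixel' budget: never render more triangles than
    the pixels the asset covers, and never more detail than the asset has."""
    return min(source_triangles, pixels_covered)

# A 33M-triangle statue covering a 200 x 400 pixel patch of the screen:
print(target_triangles(200 * 400, 33_000_000))  # 80000 -- the rest stays on disk
```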

The other core technology, Lumen, is a dynamic global illumination (GI) system that reacts in real-time to changes in geometry or lighting source — again with no pre-baking or light maps needed.

"The system renders diffuse interreflection with infinite bounces and indirect specular reflections in huge, detailed environments, at scales ranging from kilometers to millimeters. Artists and designers can create more dynamic scenes using Lumen, for example, changing the sun angle for time of day, turning on a flashlight, or blowing a hole in the ceiling, and indirect lighting will adapt accordingly."

The levels of detail do not have to be contained in small areas either.

"So with Nanite, you have limitless geometry, and with Lumen, you have fully dynamic lighting and global illumination. All running on a PlayStation 5," said Epic's Technical Director of Graphics Brian Karis. "And this doesn't need to be constrained to small rooms. It can stretch all the way to the horizon."

Nanite and Lumen are not the only things Unreal Engine 5 brings to the table. Epic made improvements to existing Unreal technologies, including Chaos physics and destruction, Niagara VFX, convolution reverb, and ambisonics rendering. For example, Niagara particles can now communicate with each other and have a better understanding of their environment. So, developers can create swarms of creatures like bats or bugs that realistically react to their surroundings as well as each other.

Additionally, with the more complex environments that can be created, Epic modified the character animation system to react accurately to the changes. The system dynamically changes the body position to make for more fluid animations such as finding footholds while scaling a cliff or placing a hand on an appropriate edge when squeezing through a door or gap.

As we reported last week, Unreal Engine 4.25 added support for Xbox Series X and PlayStation 5. Epic says there are already "dozens" of studios using UE4 in upcoming next-gen games, but they will have to wait a bit longer to start using the new engine.

Unreal Engine 5 will be ready for early preview in the first part of 2021. Epic expects the full release to be available by late 2021. When ready, UE5 will support all current and next-gen consoles as well as PC, Mac, iOS, and Android. The company also said that developers currently creating next-gen games using UE4 would be able to migrate seamlessly to Unreal Engine 5 upon its release without having to restart production.

Epic also stated that starting today, it is waiving all royalties on the first $1 million in gross revenue developers earn from any project using Unreal Engine. This offer is retroactive to January 1, 2020, so even projects released in the last few months are eligible.
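In practice the waiver works out like this; the helper below is hypothetical, using Epic's standard 5% royalty rate on gross revenue:

```python
ROYALTY_RATE = 0.05           # Epic's standard 5% of gross revenue
WAIVED_THRESHOLD = 1_000_000  # the first $1M is now royalty-free

def unreal_royalty(gross_revenue: float) -> float:
    """Royalty owed under the new terms (illustrative helper, not Epic's code)."""
    return max(0.0, gross_revenue - WAIVED_THRESHOLD) * ROYALTY_RATE

print(unreal_royalty(800_000))    # 0.0 -- under the threshold, nothing owed
print(unreal_royalty(3_000_000))  # 100000.0 -- 5% of the $2M above the threshold
```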


 
Hey, remember that fantastic looking android demo running on the PS3?


Remember the tech demos for the PS4?


Remember how all the games on those consoles had that level of detail and polish and stable frame rates? Remember the literal HUNDREDS of games that were displayed at E3 that had the exact same level of detail running at a pure, smooth, consistent 1080p30FPS, how their framerates never dropped, never ran below 1080p, and never had their texture, lighting, or physics effects tuned down to fit on limited console hardware?

So yeah I totally expect that all PS5 games will look as good as this tech demo does.
 
Apparently it took 25 people working full time for 6 months to make this tech demo...

Unrealistic for games (without an insane budget), but I don't mind the eye candy.
 
Misreading Niagara VFX as starting with a V and missing an extra A made for a hilarious substitute.
 
Apparently it took 25 people working full time for 6 months to make this tech demo...

Unrealistic for games (without an insane budget), but I don't mind the eye candy.
Just look at the list of people involved in making Assassin's Creed Odyssey, as an example:


Taking just the programmers for the 3D rendering part of the engine comes to 20 people, in 3 different locations around the world. The number of full engine and gameplay developers dwarfs that figure. So while 25 might seem a lot for a tech demo, it's not really a huge number given that Epic have a lot riding on this.
 
Hey, remember that fantastic looking android demo running on the PS3?


Remember the tech demos for the PS4?


Remember how all the games on those consoles had that level of detail and polish and stable frame rates? Remember the literal HUNDREDS of games that were displayed at E3 that had the exact same level of detail running at a pure, smooth, consistent 1080p30FPS, how their framerates never dropped, never ran below 1080p, and never had their texture, lighting, or physics effects tuned down to fit on limited console hardware?

So yeah I totally expect that all PS5 games will look as good as this tech demo does.

Yeah but it's Unreal 5 engine so it could easily look like that on next gen PC hardware in a few years. It will be a fair while before we see Unreal 5 engine games and by then Hopper and RDNA3 will be out.
 
Just look at the list of people involved in making Assassin's Creed Odyssey, as an example:


Taking just the programmers for the 3D rendering part of the engine comes to 20 people, in 3 different locations around the world. The number of full engine and gameplay developers dwarfs that figure. So while 25 might seem a lot for a tech demo, it's not really a huge number given that Epic have a lot riding on this.
That would fall under "insane budget" lol

But will it be streamlined enough to create these environments for a game? Assuming 6 months full time for 25 people to create that few environments still sounds like a slow process. Will it be worth the time/cost balance at the beginning?
I know I personally would rather more money be spent towards gameplay :p

Also, how nicely does all of this stuff compress down? They said 1 billion triangles for that first little scene's geometry alone (assuming most of the assets in the scene are unique). Worlds are hundreds of times bigger than that, and then add 8K textures? Build size is going to go through the roof...
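A rough illustration of that concern; every figure here is an assumption for the sake of arithmetic, not anything Epic has published:

```python
# Back-of-envelope raw geometry size for the quoted 1-billion-triangle scene.
TRIANGLES = 1_000_000_000
BYTES_PER_VERTEX = 12     # 3 x 32-bit float position, no other attributes
VERTS_PER_TRI = 0.5       # a closed mesh shares vertices: roughly T/2 vertices
INDEX_BYTES_PER_TRI = 12  # 3 x 32-bit indices per triangle

raw_bytes = TRIANGLES * (VERTS_PER_TRI * BYTES_PER_VERTEX + INDEX_BYTES_PER_TRI)
print(f"~{raw_bytes / 1e9:.0f} GB before any compression")  # ~18 GB
```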

Anyways, I do realize it's a tech demo. But as a game dev I don't see how this tech demo is actually realistic this time lol
 
And this demo was running on a PS5, at 1440p, at 30 fps and without ray tracing.

To add to that.

Epic CEO Tim Sweeney says the PS5 is so impressive it’s ‘going to help drive future PCs’

- Epic CEO Tim Sweeney says next-generation gaming consoles, in particular Sony’s PlayStation 5, will bring about changes in game development that go far beyond a jump in graphics quality.

- Sweeney says the PS5 is a “remarkably balanced device.”

- Immense amount of GPU power, but also multi-order bandwidth increase in storage management. That’s going to be absolutely critical.

- Sony’s storage system is absolutely world-class. Not only the best-in-class on console, but also the best on any platform, better than high-end PCs. This is going to enable the types of immersion we could only have dreamed of in the past. The world of loading screens is over. The days of pop-in and geometry popping up as you're going through these game environments are gone.

- Storage architecture on the PS5 is far ahead of anything you can buy on anything on PC for any amount of money right now.

- Enable developers to access the data their games are composed of with unprecedented speed. The result is larger game worlds loading much faster than ever before, which could result in drastic changes to how developers approach everything from balancing visual quality and performance to level design.


- Sweeney says the two companies have been working closely together during the development of UE5 and the PS5.
 
Waiving royalties for first million dollars....

This is a sweet deal for small developers. It will reduce their already high risks somewhat. Great move.

Steam, Google Play, the App Store, etc. should do the same for the first $100k of sales, if not a million. It would give impetus to small developers while having minimal impact on the revenue of these behemoths.
 
Meanwhile AC Valhalla looks to run 30fps 4k on the next gen Xbox, can't the US team win in any competition but the "human malware" one :/
 
Enjoy the demo and now get back to your slow consoles and PC that won't be able to handle this grade of fidelity for at least another 5 years :D
And remember mobile gaming is the future :D
 
Everyone was raving over this as evidence of PS5 GODLEVEL power!

..then it became known it was running at 1440p.

Lol.
 
According to Epic, this demo is impossible to run on an HDD or even a SATA SSD.
Also, running this demo on a PC requires an RTX 2070 Super.
 
But will it be streamlined enough to create these environments for a game? Assuming 6 months full time for 25 people to create that few environments still sounds like a slow process. Will it be worth the time/cost balance at the beginning?
I know I personally would rather more money be spent towards gameplay :p
If you go back to the Assassin's Creed credit list, you'll see that artists and gameplay programmers feature more prominently than 3D programmers, so more money/resources are certainly dedicated towards these areas. However, I suspect that for the Unreal demo, the bulk of the team involved were programmers, rather than asset designers (although I could well be wrong there).

Making high polygon count models and assets by itself isn't so time consuming: it's all of the additional work required to change them for the game and platform used. As good as modern CPUs and GPUs are, they just can't process billions of vertices and pixels in a few milliseconds, so today's games use tessellation, normal maps, and other lighting tricks to give the impression that the assets you're looking at are highly detailed.

This takes time to get right, as the artists need to work closely with the 3D programmers to ensure that the correct tessellation level is used, and they have to generate the normal maps from their models (a quick job but then usually requires some additional manual manipulation to make sure they work right).

The idea behind the changes in UE5 is to save time by skipping all of this and just using the high resolution assets in the first place; the engine then manages the data in the form of streams (vertex, index, textures, etc) to/from the GPU.

Also, how nicely does all of this stuff compress down? They said 1 billion triangles for that first little scene's geometry alone (assuming most of the assets in the scene are unique). Worlds are hundreds of times bigger than that, and then add 8K textures? Build size is going to go through the roof...
Vertex data compresses very nicely; textures only up to a certain point. However, if you're not having to use additional maps (light, normal, height, etc.) then you're saving a bit of space that way. That said, expect to see 100 GB+ as the usual size for a AAA title in the next year or so.
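To give one concrete reason vertex data compresses well: full 32-bit floats are usually overkill, so positions are often quantized to 16-bit integers inside the mesh's bounding box. This is a generic trick, not necessarily UE5's actual scheme; a sketch:

```python
import struct

def quantize(positions, lo, hi):
    """Map floats in [lo, hi] to uint16 -- half the size of float32."""
    scale = 65535 / (hi - lo)
    return [round((p - lo) * scale) for p in positions]

def dequantize(values, lo, hi):
    scale = (hi - lo) / 65535
    return [v * scale + lo for v in values]

pos = [0.0, 1.25, 7.5, 10.0]
q = quantize(pos, 0.0, 10.0)
packed = struct.pack(f"<{len(q)}H", *q)  # 2 bytes per value instead of 4

print(len(packed))               # 8 bytes for four positions
print(dequantize(q, 0.0, 10.0))  # within ~0.0001 of the originals
```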
 
Remember how all the games on those consoles had that level of detail and polish and stable frame rates? Remember the literal HUNDREDS of games that were displayed at E3 that had the exact same level of detail running at a pure, smooth, consistent 1080p30FPS, how their framerates never dropped, never ran below 1080p, and never had their texture, lighting, or physics effects tuned down to fit on limited console hardware?

So yeah I totally expect that all PS5 games will look as good as this tech demo does.
Exactly.
 
If you go back to the Assassin's Creed credit list, you'll see that artists and gameplay programmers feature more prominently than 3D programmers, so more money/resources is certainly dedicated towards these areas. However, I suspect that for the Unreal demo, the bulk of the team involved were programmers, rather than asset designers (although I could well be wrong there).
I believe you are way overestimating how much programming this tech demo would need (my area of game dev). Unreal has plenty of basic features to use. Locomotion, camera, triggers, and cinematics should all be quickly supported. Maybe the parkour elements took longer, but (with UE animation features) not by much. And maybe they got a programmer to do shaders, but UE does have powerful tools for artists to make their own.
Basically, a lot of what you see would be done by artists (creating particles, shaders using the visual editor, textures), designers (level design, particle placement, cinematics, misc), and modelers (rigging, model creation, animations). I'm just not sure what other features programmers would be needed to build from scratch (or to build off of) that would take so long (at least, nothing that would take several programmers 6 months).

Making high polygon count models and assets by itself isn't so time consuming: it's all of the additional work required to change them for the game and platform used. As good as modern CPUs and GPUs are, they just can't process billions of vertices and pixels in a few milliseconds, so today's games use tessellation, normal maps, and other lighting tricks to give the impression that the assets you're looking at are highly detailed.

This takes time to get right, as the artists need to work closely with the 3D programmers to ensure that the correct tessellation level is used, and they have to generate the normal maps from their models (a quick job but then usually requires some additional manual manipulation to make sure they work right).

The idea behind the changes in UE5 is to save time by skipping all of this and just using the high resolution assets in the first place; the engine then manages the data in the form of streams (vertex, index, textures, etc) to/from the GPU.
Yeah, you could be right. I am assuming that the "old way" is streamlined at this point, and that scanning assets won't be very streamlined yet (it's not a very prominent technique for creating games, from what I can tell). Also, I don't know how much work manually making a super high-poly model would take (I hope the unwraps are convenient lol).
Though, that still leaves the question of what areas took so long in the demo. Maybe just tweaks to the scenes for realism for that "wow" factor...

Vertex data compresses very nicely; textures only up to a certain point. However, if you're not having to use additional maps (light, normal, height, etc) then you're saving a bit of space that way. That said, expect to see 100 GB+ as the usual size for AAA title in the next year or so.
Still, billions of triangles (including normals) can only be compressed so much. And the bump up to 8K textures looks like it'll more than make up for the space saved by removing the other maps lol.

After COD:MW, I do expect more devs to push the 100GB size. Don't know how happy gamers will be (I am lucky to have no download cap), but I'd hate to see how big a decent cinematic game could get along the lines of the above demo. A build could easily push 200-500GB without trying. And then imagine how big the project size would be (another hurdle for a game dev to manage).....
 
I believe you are way over estimating how much programming this tech demo would need (my area of game dev).
I was referring to the programming required for the new geometry and lighting systems - neither of these exist in UE 4.25, so they would have to be created for the demo. Not that 25 people are required for that (probably fewer than 5), and the list of other staff you mentioned would make up the rest - as for the 6-month aspect, well, that does seem unusually long for a simple demo. It's possible it's longer than one would normally expect because of the use of the PS5 - it's anyone's guess as to how long ago the dev kits were released. Sometimes the amount of time available to use them is pretty short: when I worked at Futuremark, the programmers would have to use software renderers for months because no GPU with the required feature set was available.

Still, billions of triangles (including normals) can only be compressed so much. And the bump up to 8K textures looks like they'll more than makes up the space for the removal of the other maps lol.
Normals are generated during rendering, and the whole point of the Nanite system is that normal maps aren't required. Do note that billions of vertices aren't being processed per frame: the assets used have that kind of vertex count, but they remain as-is on the storage drive, and only the required vertices are then streamed into memory.

As for the textures, the likes of Doom (2016) uses textures as large as 16k x 8k, which are all tiled into 128 x 128 pieces (texture tiling is fully supported in D3D11/12). Of course, the game isn't using a large number of such textures, but if UE5 for consoles is all about streaming assets, then this will almost certainly be true of textures too. So although the game installation size will be monstrous, the memory load won't be as bad. That said, now that the new consoles have plenty of video memory to play with, we may well see some minimal effort put into streamlining the RAM footprint.
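The tiling arithmetic is easy to sketch using the sizes mentioned above (illustrative, uncompressed figures):

```python
# One 16k x 8k texture split into 128 x 128 tiles, D3D-tiled-resource style.
TEX_W, TEX_H = 16384, 8192
TILE = 128

tile_count = (TEX_W // TILE) * (TEX_H // TILE)
tile_bytes = TILE * TILE * 4  # uncompressed RGBA8: 4 bytes per texel

print(tile_count)          # 8192 tiles in the whole texture
print(tile_bytes // 1024)  # 64 KB each -- only visible tiles need be resident
```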

After COD:MW, I do expect more devs to push the 100GB size. Don't know how happy gamers will be (I am lucky to have no download cap), but I'd hate to see how big a decent cinematic game could get along the lines of the above demo. A build could easily push 200-500GB without trying. And then imagine how big the project size would be (another hurdle for a game dev to manage).....
Like yourself I have no data cap and it's a pretty good rate too, given that I live in the middle of nowhere, but I am concerned that games sizes are in danger of ballooning out of control.
 
