Unity tech demo shows off a lifelike digital human rendered in real-time

jsilva

Posts: 325   +2
In brief: Real-time rendered 3D graphics have come a long way since the '80s, as Unity's "Enemies" tech demo shows. Besides the photorealistic human featured throughout the demonstration, the impressive environments and animations are also worth noting, giving us a glimpse of how games will look before long.

Building on The Heretic, Unity created its Enemies tech demo to show GDC attendees the team's advancements in rendering digital humans in real time. Highlighting elements such as the photorealistic eyes, hair, and skin, the video shows a middle-aged woman playing chess in a transforming room.

The short was built in Unity using the engine's new High Definition Render Pipeline (HDRP), Adaptive Probe Volumes, and Screen Space Global Illumination (SSGI). The demo also features ray tracing, DLSS, and other unlisted technologies.

These technologies and features are reflected in the character model, as you can see in the pronounced wrinkles, the effect of blood flow on the skin, and the remarkably realistic eyes. The team also gave special attention to the model's hair.

The developers used a new three-part Unity Hair solution to create the character's long hair. The first part is the hair system, responsible for authoring, skinning, strand-based simulation, and modeling. The hair shading component then adjusts the visuals based on lighting conditions. The last part, hair rendering, can draw extremely thin strands while also reducing aliasing.
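Unity hasn't published implementation details in this write-up, but strand-based hair simulation is commonly built on the same broad idea: each strand is a chain of particles advanced with Verlet integration, with distance constraints keeping segment lengths fixed. A minimal illustrative sketch of that idea (all names and values here are hypothetical, not Unity's actual API):

```python
# One hair strand as a chain of particles: Verlet integration plus
# distance-constraint relaxation (illustrative sketch only).

GRAVITY = -9.8          # m/s^2, acting on y
SEGMENT_LENGTH = 0.1    # rest length between neighboring particles
DT = 1.0 / 60.0         # timestep (60 fps)

def make_strand(n, root=(0.0, 2.0)):
    """Create n particles hanging in a vertical line from the root."""
    return [[root[0], root[1] - i * SEGMENT_LENGTH] for i in range(n)]

def simulate_step(curr, prev):
    """Advance the strand one timestep; returns (new, curr)."""
    new = []
    for (x, y), (px, py) in zip(curr, prev):
        # Verlet: next = current + (current - previous) + acceleration * dt^2
        new.append([2 * x - px, 2 * y - py + GRAVITY * DT * DT])
    new[0] = list(curr[0])  # root particle is pinned to the scalp
    # Relax distance constraints so segments keep their rest length
    for _ in range(10):
        for i in range(len(new) - 1):
            ax, ay = new[i]
            bx, by = new[i + 1]
            dx, dy = bx - ax, by - ay
            dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
            diff = (dist - SEGMENT_LENGTH) / dist
            if i == 0:
                # don't move the pinned root; correct the child fully
                new[i + 1][0] -= dx * diff
                new[i + 1][1] -= dy * diff
            else:
                new[i][0] += 0.5 * dx * diff
                new[i][1] += 0.5 * dy * diff
                new[i + 1][0] -= 0.5 * dx * diff
                new[i + 1][1] -= 0.5 * dy * diff
    return new, curr

strand = make_strand(8)
prev = [p[:] for p in strand]
for _ in range(120):  # two seconds at 60 fps
    strand, prev = simulate_step(strand, prev)
```

A production solver runs thousands of such strands on the GPU and adds collision, curl, and wind forces, but the integrate-then-constrain loop is the core of it.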

Some of the mentioned features are already available in Unity 2021.2, while others will arrive in version 2022.1 or 2022.2. The new technologies showcased in Enemies are coming to Unity's Digital Human 2.0 package, which should be out in a month or two. Unity will also release the strand-based hair system on GitHub.

Those looking to learn more about the demo can watch "The making of Unity's latest flagship demo" for free during GTC. That session is scheduled for Thursday, March 24, at 7:00 am PT. Alternatively, you can catch the paid Nvidia-sponsored GDC session this Friday, March 25, at 3:00 pm PT.


 

eforce

Posts: 1,026   +1,477
Unity engine devs: "Look at this cinematic rendering of the human form!"

Unity game devs: "Look at our top-down isometric game with low poly character models!"

This should start to change soon since Unity has recently released some half-decent FPS/third-person samples and a dedicated server package.
 

Aaron Fox

Posts: 153   +90
Actors are going to have to adapt. At first, they'll be brought in to do the physical maneuvers. It's much easier to have real people do the bulk of the 'animation' with their movements. Their appearances as they move will be replaced by the avatars'.

Voice acting will supplement these movements. Voice will be the last frontier of replacing human actors with programs.

Eventually, human actors won't be necessary.
 

passwordistaco

Posts: 345   +795
Actors are going to have to adapt. At first, they'll be brought in to do the physical maneuvers. It's much easier to have real people do the bulk of the 'animation' with their movements. Their appearances as they move will be replaced by the avatars'.

Voice acting will supplement these movements. Voice will be the last frontier of replacing human actors with programs.

Eventually, human actors won't be necessary.
I think I'll be OK with that.
 

RudyBob

Posts: 704   +670
Actors are going to have to adapt. At first, they'll be brought in to do the physical maneuvers. It's much easier to have real people do the bulk of the 'animation' with their movements. Their appearances as they move will be replaced by the avatars'.

Voice acting will supplement these movements. Voice will be the last frontier of replacing human actors with programs.

Eventually, human actors won't be necessary.
Then they can get jobs and not worry about what the rest of us do
 

passwordistaco

Posts: 345   +795
This should start to change soon since Unity has recently released some half decent FPS/Third Person samples and dedicated server package.
Don't get me wrong - I enjoy games with the isometric view, but the closer the camera gets to the action, the worse they look. We have variable levels of detail for terrain and building models. I want to see that applied to character models, so the closer the camera gets, the better it looks.
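The level-of-detail idea described above boils down to picking a model variant by camera distance. A toy sketch of that selection logic (thresholds and model names are made up for illustration):

```python
# Distance-based level-of-detail selection, applied to a character model.
# Thresholds and variant names are hypothetical.

LODS = [
    (5.0, "hero_head_strand_hair"),    # close-up: strand hair, wrinkle maps
    (20.0, "mid_poly_card_hair"),      # mid-range: hair cards, baked normals
    (float("inf"), "low_poly_proxy"),  # distant: simple proxy mesh
]

def pick_lod(camera_distance):
    """Return the model variant to render at a given camera distance."""
    for max_dist, model in LODS:
        if camera_distance <= max_dist:
            return model
    return LODS[-1][1]
```

Engines do exactly this for terrain today; the commenter's wish is for character assets to come with the same graded set of variants.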
 

trparky

Posts: 1,121   +1,268
Did anyone else's brain just completely reject this whole thing? Because mine did. The eyes were what did it for me, they just looked dead. Totally creeped me out.
 

passwordistaco

Posts: 345   +795
Did anyone else's brain just completely reject this whole thing? Because mine did. The eyes were what did it for me, they just looked dead. Totally creeped me out.
It's called "the uncanny valley" - the more realistic the rendering, the more our brains recognize it as false. With this demo from Unity and what Unreal Engine 5 can do, I think we're approaching the other side of the valley.
 

waclark

Posts: 568   +356
Actors are going to have to adapt. At first, they'll be brought in to do the physical maneuvers. It's much easier to have real people do the bulk of the 'animation' with their movements. Their appearances as they move will be replaced by the avatars'.

Voice acting will supplement these movements. Voice will be the last frontier of replacing human actors with programs.

Eventually, human actors won't be necessary.
Except it's hard to do a stage play with computer rendered actors.
 

kiwigraeme

Posts: 1,209   +877
I focused on the hair - it's still not at the level of that scene in Shrek (the fourth-wall scene - that hair animation took a lot of processing).
It may show individual strands, but it's definitely not simulating every strand - games will still have limited hair styles, and in extreme movement or windy scenes it won't be perfect - but it won't matter. I suppose it's like those hypnotists at the beginning of shows who ask people to squeeze their hands or some such thing to spot the people to hypnotize. Games are as immersive as you want them to be - for some people, that ASCII character moving across the screen really is a fearsome monster.
 

eforce

Posts: 1,026   +1,477
Don't get me wrong - I enjoy games with the isometric view, but the closer the camera gets to the action, the worse they look. We have variable levels of detail for terrain and building models. I want to see that applied to character models, so the closer the camera gets, the better it looks.

That's nothing to do with Unity; models are the responsibility of the game developer.
 

mbk34

Posts: 358   +260
It's clever stuff and I'm impressed, but I prefer my games to be interesting and fun rather than just lifelike. The other problem with lifelike characters in games is that they'll still look completely unreal unless there's better AI to drive them.
 

ragreeen2646

Posts: 28   +12
We are coming to a time when anything recorded, voice or video, will no longer be reliable. We will be able to create all kinds of fake videos of supposedly real people, which will be used by all kinds of nefarious people or governments. To me this is scary.
 

Gezzer

Posts: 287   +145
Did anyone else's brain just completely reject this whole thing? Because mine did. The eyes were what did it for me, they just looked dead. Totally creeped me out.

IMHO it's mostly due to the fact that human motion isn't really all that fluid. It's more twitchy in nature, even the eyes. I think that's because we have to compensate for the blind spot, as well as how muscle fibers move as a unit. So for me that's the dead giveaway. I did notice some twitchiness in the face muscles that almost sold me, but when she looked directly at me it all evaporated because, as you said, the eyes looked lifeless.
 

J R K

Posts: 11   +6
Except it's hard to do a stage play with computer rendered actors.

I agree. The level of nuance that comes from a real, living personality giving gravitas to an acted persona is still a lightyear away from what AI and purely computer-rendered actors can do. Thus, voice and mocap actors will still have jobs even with this fidelity.

Ostensibly.