Havok shares first physics engine tech demo in 10 years

Shawn Knight

Why it matters: Havok, the Irish software maker whose physics engine can be found in countless blockbuster games, has published its first new technology demo on YouTube in a decade. The dynamic destruction demo focuses on Havok's physics engine; the company also offers tools that help developers with in-game navigation and cloth simulation.

The scene opens in an underground cave where debris starts raining down around a humanoid figure as a siren blares in the background. Tunnels with spikes and skull traps populate the middle of the demo, while explosions at the end provide an escape route for our hero. In total, the teaser runs a minute and a half long.

The clip was launched in conjunction with the release of Havok 2024.2 and is no doubt meant to show developers what is possible with the company's physics tech. Havok's latest release includes support for larger worlds, continued enhancements to CMake support, and performance improvements across all products, among other upgrades. The full changelog is available on Havok's website.

Founded in 1998, Havok boasts a rich history of collaboration with industry giants such as Bethesda, Ubisoft, Electronic Arts, Nintendo, and many others. Intel purchased the outfit in 2007 for $110 million, but it was Microsoft that ended up with the company following its 2015 acquisition from Intel.

Also see: And Action! An Examination of Physics in Video Games

Despite changing hands multiple times, Havok has managed to remain relevant through it all. Hundreds of games have licensed its technology over the years. In addition to classics like Half-Life 2, Fallout: New Vegas, Heavy Rain, and Halo 3, Havok's physics engine is also featured in newer games such as The Legend of Zelda: Tears of the Kingdom, Helldivers 2, and Mortal Kombat 1.

To be fair, the demo is not the first video Havok has posted in the past decade. A 30-minute presentation on the evolution of physics was shared on the company's YouTube channel three months ago, and Havok also published a lesson on integrating third-party tech into Unreal Engine 5 in April 2024.

 
I like this. We can also see that UE5's Lumen lighting instantly falls apart when the environment finally isn't a static scene for once. Lots of missing shadow casting on that debris.
Interested to see if we'll finally get some high-octane action games at some point using good old Havok instead of inefficient and glitchy PhysX.
 
Great for destructible environments, but maybe another engine, though less accurate, will look more convincing with rolling dust clouds, etc.
 
Bad Company 2 is still the high watermark for destructible environments IMO.

Red Faction Guerrilla was fantastic, but the environments definitely had a "built from magnetic tiles" look and feel to them.

BC2 really felt like a war-torn wasteland after a 30-minute match.
 
It would be interesting to see games with more realistic physics and almost completely destructible scenery, but UE5 just gets in the way.

The problem is that dynamic multi-object interactions get *messy* to compute, and quickly become an even larger task than rendering. Ageia had the right idea making a physics API that ran on a co-processor; that's really the only way you are going to get to fully dynamic physics in games.
 
Anyone remember Silent Storm? The turn-based tactics WW2 game.

Imagine that with this new tech.
 
I like this. We can also see that UE5's Lumen lighting instantly falls apart when the environment finally isn't a static scene for once. Lots of missing shadow casting on that debris.
Interested to see if we'll finally get some high-octane action games at some point using good old Havok instead of inefficient and glitchy PhysX.
I caught the shadows too... but it does look good. I think this demo also showcases how important impactful synchronized sound is to the gaming experience. Some pieces look a little "bouncy"... but overall it still looks good.
 
The problem is that dynamic multi-object interactions get *messy* to compute, and quickly become an even larger task than rendering. Ageia had the right idea making a physics API that ran on a co-processor; that's really the only way you are going to get to fully dynamic physics in games.

You do not render the destruction; the game engine does.

To understand the difference, imagine having 2 players in the same cave. Neither of them renders the destruction; the engine simulates it and each client rasterizes those objects.

In-game physics are EASY for a modern computer to do. Much easier than calculating ray-traced reflections in puddles.
 
In-game physics are EASY for a modern computer to do. Much easier than calculating ray-traced reflections in puddles.
"Current" in-game physics is easy, yes; just apply a constant downwards force on objects to simulate gravity, and selectively apply ragdoll effects to objects. That's trivial.

Multi-object physics simulations, even when we're talking about two or three objects, are already an order of magnitude more computationally expensive than rendering. Easy example: two characters thrown into each other by an explosion; right now, if this happens, the effects either get ignored or (more likely) ragdoll breaks until both objects hit the ground. Actually trying to simulate that (even in an ideal situation where you consider both bodies one rectangular object) is just too damn expensive to bother doing.
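Even the absolute simplest version of that two-body case already needs impulse bookkeeping. A rough, hypothetical C++ sketch, assuming two point masses colliding along one axis with a restitution coefficient; a real engine solves this in 3D with rotation, friction, and contact manifolds on top:

// Toy 1D collision response between two bodies a and b (a to the left of b).
struct Body1D { float x; float v; float mass; };

void resolve_collision(Body1D& a, Body1D& b, float restitution) {
    float rel_v = b.v - a.v;                     // closing speed along the axis
    if (rel_v >= 0.0f) return;                   // already separating, nothing to do
    float impulse = -(1.0f + restitution) * rel_v
                    / (1.0f / a.mass + 1.0f / b.mass);
    a.v -= impulse / a.mass;                     // equal and opposite impulses
    b.v += impulse / b.mass;
}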

So instead, we've been basically using the same simplified ragdoll- and animation-based systems we've had for a good two decades now, since "yet another new AA mode that's .5% faster than the last one" is easier to sell to gamers than "better physics".
 
The problem is that dynamic multi-object interactions get *messy* to compute, and quickly become an even larger task than rendering. Ageia had the right idea making a physics API that ran on a co-processor; that's really the only way you are going to get to fully dynamic physics in games.
Considering all the cores available in today's CPUs, don't we effectively already have "co-processors"? Or is 64 bits of bandwidth not enough to accomplish this? If not, we'd all end up having to buy server boards, which of course would be rebranded as "gaming" and marked up another 20% or so.
 
Considering all the cores available in today's CPUs, don't we effectively already have "co-processors"? Or is 64 bits of bandwidth not enough to accomplish this? If not, we'd all end up having to buy server boards, which of course would be rebranded as "gaming" and marked up another 20% or so.
Eh, no. Those types of multi-object interactions, even simplified, need "hundreds" of cores' worth of throughput.

But here's what it gets you: take a frag grenade in an FPS. The current implementation is basically a damage sphere, with damage reduced the farther from the center you are. Simple implementation, and basically free to compute. But with a physics implementation, you can track each fragment of said grenade individually, check whether it hits anything, and then, based on its velocity and where/what it hits, calculate how much damage it does. For every shard. Unless someone dives on it to save the rest of the team, which is something you *can't* do now.
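To make that cost difference concrete, here is a hypothetical C++ sketch of both approaches (sphere_damage, fragment_damage, and the hit_trace callback are names invented for illustration; a real engine would route the per-fragment query through its own scene raycast):

#include <functional>
#include <vector>

// Approach 1: the current "damage sphere" -- damage falls off linearly with
// distance from the blast center. One cheap check per nearby target.
float sphere_damage(float dist, float radius, float max_damage) {
    return dist >= radius ? 0.0f : max_damage * (1.0f - dist / radius);
}

// Approach 2: per-fragment simulation. 'hit_trace' stands in for whatever scene
// query the engine exposes; every one of the hundreds of shards needs its own
// trace plus velocity-based damage, so the cost multiplies quickly.
struct Fragment { float dir[3]; float speed; };

float fragment_damage(const std::vector<Fragment>& frags,
                      const std::function<bool(const Fragment&)>& hit_trace,
                      float damage_per_speed) {
    float total = 0.0f;
    for (const Fragment& f : frags)
        if (hit_trace(f))                        // did this shard hit something?
            total += f.speed * damage_per_speed;
    return total;
}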

One implementation is free, the other is almost comically expensive to calculate. And that's *before* you consider how each object hit by a fragment of said grenade reacts (e.g., bodies go flying, which themselves hit things, and so on).

Multi-object dynamic physics are near impossible to compute, even if you dumb them down. Which is why literally no one bothers.
 