New PowerVR Wizard mobile GPU will ray trace in real time

Scorpus

Staff member

At the Game Developers Conference this week in San Francisco, Imagination Technologies announced a groundbreaking new family of mobile-oriented consumer GPUs. Known as 'Wizard', the GPU line expands on the company's 'Rogue' designs by adding a dedicated ray tracing unit (RTU), giving game developers the flexibility to produce mobile games with real-time ray-traced lighting effects.


The first of the Wizard GPUs is the GR6500, a variation of Imagination's Series6XT GX6450 Rogue hardware. This means we're getting a Unified Shading Cluster Array (USCA) with 128 ALUs, which at the GPU's reference clock speed of 600 MHz will provide around 150 GFLOPS of FP32 compute.
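That headline figure follows directly from the ALU count and clock, assuming the conventional peak-throughput accounting of one fused multiply-add (two FP32 operations) per ALU per cycle:

```python
# Rough FP32 throughput estimate for the GR6500's shader array.
# Assumes each ALU retires one FMA (2 FLOPs) per cycle, which is how
# peak-GFLOPS figures are conventionally quoted.
alus = 128
clock_hz = 600e6      # 600 MHz reference clock
ops_per_cycle = 2     # fused multiply-add = 2 FP32 ops

gflops = alus * clock_hz * ops_per_cycle / 1e9
print(gflops)  # 153.6 -- in line with the ~150 GFLOPS quoted
```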

The dedicated RTU is a non-programmable, stand-alone processing unit that performs all ray tracing tasks without using the USCA's ALUs. Rasterization, an important graphical task handled by the USCA, won't take a direct performance hit from the addition of the RTU, although there may be an overall cost from the extra memory bandwidth the RTU requires.

With the GR6500 clocked at 600 MHz, the RTU will be capable of 300 million rays per second, 24 billion node tests per second, and 100 million dynamic triangles per second.
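Those headline numbers translate into a per-pixel ray budget. A back-of-the-envelope sketch (the 720p/30fps target here is an illustrative assumption, not from the announcement):

```python
# Hypothetical per-pixel ray budget for the GR6500's RTU.
# The 1280x720 @ 30 fps target is an illustrative assumption.
rays_per_second = 300e6
width, height, fps = 1280, 720, 30

rays_per_pixel = rays_per_second / (width * height * fps)
print(round(rays_per_pixel, 1))  # ~10.9 rays per pixel per frame
```

Roughly ten rays per pixel is enough for a handful of shadow or reflection rays per frame, which fits the hybrid-rendering use case rather than full path tracing.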


With a dedicated ray tracer in the GR6500, advanced lighting effects can be hardware accelerated, producing more realistic graphics than we're otherwise used to on mobile hardware. Specifically, a game that ray traces will have high-quality shadows cast accurately by every light source, much more accurate transparency, and photorealistic reflections.
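The shadow effect described above reduces to casting a ray from each shaded point toward each light and checking for occluders along the way. A minimal sketch of that idea, with spheres standing in for scene geometry (all names and the scene representation are illustrative, not Imagination's API):

```python
import math

def ray_sphere_t(origin, direction, center, radius):
    """Nearest positive hit distance of a unit-direction ray against a
    sphere, or None if the ray misses (standard quadratic intersection)."""
    oc = [o - c for o, c in zip(origin, center)]
    b = sum(d * o for d, o in zip(direction, oc))   # half-b of the quadratic
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-6 else None

def in_shadow(point, light, occluders):
    """Shadow ray: `point` is shadowed if any occluder sphere lies
    between it and the light source."""
    to_light = [l - p for l, p in zip(light, point)]
    dist = math.sqrt(sum(x * x for x in to_light))
    direction = [x / dist for x in to_light]
    return any(
        (t := ray_sphere_t(point, direction, c, r)) is not None and t < dist
        for c, r in occluders
    )
```

The RTU's job is to answer exactly these ray-versus-scene queries in fixed-function hardware, so the shader ALUs never have to run this kind of loop themselves.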

The inclusion of these sorts of effects is up to the game developer, who needs a game engine that supports Imagination's new hardware. Luckily, the company has announced that Unity's new 5.x game engine will support its ray tracing technology, although this is just one step in a larger push to get OEMs and software engineers supporting ray tracing.

One thing Imagination didn't reveal was the additional power cost of the RTU. This is a GPU destined for battery-powered devices, where efficiency is key, so let's hope the cost of utilizing the RTU isn't huge.


 
For mobile gaming that screen looks great!

The problem I see here is... wouldn't that drain the battery a whole lot more, since it will need millions more calculations to get that lighting?
 
DX12 better bring ray tracing or else it would be pretty sad that mobile technology is ahead of desktop!
 
I remember that video adapters based on Intel Larrabee were the first take at ray tracing, which Intel cancelled in 2010 due to low overall performance compared to then-current graphics cards.

I wonder how PowerVR managed to achieve good ray tracing performance when their overall performance is way lower than that of today's top-range video cards. They must be using a very different approach.
 
PowerVR originally specialized in rendering only what needed to be rendered, skipping elements of the scene that the player couldn't see. This is how they got some pretty good results at lower power (this was back in the Nvidia Riva TNT, 3dfx Voodoo, ATI Rage 128 days). Nvidia / ATI relied instead on brute-force rendering of the entire scene.

The problem is they never did build a graphics card with the raw horsepower to leverage that optimization tech and compete with the other cards of its day.
 
I remember that video adapters based on Intel Larrabee were the first take at ray tracing,
Ray tracing is available to most modern GPU architectures- it's just that GPUs aren't optimized for it, so it becomes a brute force implementation. IIRC Nvidia's OptiX real time ray tracing engine has been available since August 2009.
which they cancelled in 2010 due to low overall performance compared to then-current graphics cards.
The die was also rumoured to be on the order of 650-700mm², and the basement-level yields that size produced didn't help matters.
I wonder how PowerVR managed to achieve good performance
1. It isn't pure ray tracing - it just incorporates ray-traced elements into a conventionally rasterized frame, a hybrid approach.
2. PowerVR use tile-based deferred rendering (TBDR), which does away with calculations on non-visible geometry.
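The first stage of a tile-based renderer is binning: triangles are sorted into screen-space tiles so each tile can later resolve visibility and shade only the surviving fragments on-chip. A toy sketch of that binning step (tile size and data layout are illustrative):

```python
# Toy sketch of tile binning, the first stage of a tile-based renderer.
# Each triangle is a list of (x, y) screen-space vertices; its bounding
# box determines which tiles it lands in. Tile size is illustrative.
TILE = 32

def bin_triangles(triangles, width, height):
    """Map each tile (tx, ty) to the triangles whose bounding box
    overlaps it."""
    bins = {}
    for tri in triangles:
        xs = [v[0] for v in tri]
        ys = [v[1] for v in tri]
        x0, x1 = int(min(xs)) // TILE, int(max(xs)) // TILE
        y0, y1 = int(min(ys)) // TILE, int(max(ys)) // TILE
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                if 0 <= tx < width // TILE and 0 <= ty < height // TILE:
                    bins.setdefault((tx, ty), []).append(tri)
    return bins
```

Because each tile's depth buffer fits on-chip, hidden geometry can be rejected before any shading happens, which is where the power savings the comment describes come from.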

There is a more in-depth analysis at Anandtech.
It will be interesting to see how it scales, and what momentum it gains versus the SVOGI approach or proprietary global illumination solutions such as those used by Crytek or Unity 5.
 