The iPhone 14 Pro almost supported ray tracing

Daniel Sims

Staff
Why it matters: Ray tracing is still mostly exclusive to current-gen game consoles and recent mid to high-end PC graphics cards, but it's starting to appear on mobile hardware as well. A recent report reveals that Apple's latest iPhone could have marked a grand debut for portable ray tracing, but the company abandoned the plans at the last minute.

If you found the iPhone 14 Pro's graphics processing to be a rather tepid improvement from its predecessor, a report this week from The Information may reveal why. Sources claim the premium phone's GPU nearly supported hardware-accelerated ray tracing before a hasty backtrack hobbled it.

Unfortunately, Apple's engineers found a flaw late in the iPhone 14 Pro's development that caused its GPU to draw too much power. To shore up the device's battery life and thermals, Apple had to hurriedly pivot the A16 Bionic chip's graphics processor to that of the previous generation A15, found in the iPhone 13.

The setback was allegedly a major one in Apple's hardware design history, leading to a shakeup in the company's graphics processing team. The report mentions broader talent loss among Apple's hardware design teams.

Had the iPhone 14 Pro launched with ray tracing, it could have gone toe-to-toe with Android phones featuring Qualcomm's Snapdragon 8 Gen 2 platform, which supports ray tracing and Unreal Engine 5. Flagship devices like the Xiaomi 13 series, Moto X40, and Vivo X90 Pro+ powered by Qualcomm's latest SoC are already on the market, while others from Asus, OnePlus, and Sony are on the way.

Imagination Technologies showcased an early attempt at mobile ray tracing last year with its PowerVR IMG-CXT. In June this year, Arm announced the Immortalis G715, a mobile GPU supporting ray tracing and variable rate shading.

Apple's mishap seems to have put the iPhone 14 Pro behind the competition in GPU performance. Graphics processing benchmarks put the A16 Bionic behind the Snapdragon 8 Gen 2 and MediaTek's Dimensity 9200 – the first chip to ship with the Immortalis G715.

Once Apple manages to get ray tracing into its silicon, it will likely come to both its mobile and desktop chips. The Apple Silicon version of Resident Evil Village currently lacks the feature, unlike the Windows, PS5, and Xbox Series versions, but the game simply grays it out in the settings menu instead of removing it entirely. This seemingly insignificant detail could be read as a hint that upcoming Apple Silicon was meant to support ray tracing.


 
I've never understood the hatred towards ray tracing. All games use completely faked lighting: it's either baked in, causing no change in time of day, or it's dynamic, causing weird issues with objects having no lighting or appearing to float, glow, or cast hard shadows. Lighting in games has been faked well for many years now, but it honestly doesn't look great.

I finally got a GPU recently that's allowed me to experience ray tracing. It's much better; it clearly isn't being used to its full potential yet, but it's a clear step up from the normal fake lighting and reflections.

It's also much easier for developers to implement, making game development much quicker, so why anyone would be against it is baffling to me.
 
I've never understood the hatred towards ray tracing. All games use completely faked lighting: it's either baked in, causing no change in time of day, or it's dynamic, causing weird issues with objects having no lighting or appearing to float, glow, or cast hard shadows. Lighting in games has been faked well for many years now, but it honestly doesn't look great.

I finally got a GPU recently that's allowed me to experience ray tracing. It's much better; it clearly isn't being used to its full potential yet, but it's a clear step up from the normal fake lighting and reflections.

It's also much easier for developers to implement, making game development much quicker, so why anyone would be against it is baffling to me.
I don't see where the hatred is (or has been); it's more of a "why should I care?", given how top-tier ray tracing has been until recently.

And I mean, this is also mobile we're talking about. The hardware can barely handle realistic games at this point, and ray tracing is expensive (especially without dedicated hardware) and meant for realism.
 
I don't see where the hatred is (or has been); it's more of a "why should I care?", given how top-tier ray tracing has been until recently.

And I mean, this is also mobile we're talking about. The hardware can barely handle realistic games at this point, and ray tracing is expensive (especially without dedicated hardware) and meant for realism.
One more point to add: even if they can do it, we need to see what the tradeoff is with respect to power consumption. Nobody wants a phone with two hours of game time. I never play with my phone plugged in, and I doubt most people do. Mobile gaming is something we do when we are bored while travelling.
 
I've never understood the hatred towards ray tracing. All games use completely faked lighting: it's either baked in, causing no change in time of day, or it's dynamic, causing weird issues with objects having no lighting or appearing to float, glow, or cast hard shadows. Lighting in games has been faked well for many years now, but it honestly doesn't look great.
OK, here's the thing: I'm completely ambivalent about ray tracing. I don't game, and better lighting in games is what's being pitched to me as the reason why "I need it".

With the expectation that Nvidia's GTX 16xx series will soon be discontinued, I would have to pay at least a hundred dollars more to equal its performance with a 30xx or 40xx series card. When you couple that with the "addict's tax" Jensen Huang has tagged onto Nvidia's cards, you have my sole objection to it (along with that of everybody else here who thinks graphics cards should be cheaper): price. Although I confess it does get quite irksome to be told, "well, if you can't afford it, get out of the hobby". For me, there's nothing more rewarding than coming to Techspot and being talked down to by some windbag gaming addict. (Present company excluded, of course; I've never gotten anything of the sort from you.)

Here's the thing, though: my credit cards aren't anywhere near their limits. So, if I felt like buying something for $1,600, I could; it just wouldn't be a VGA.

More to the topic, this "ray tracing in phones" is just another overhyped Apple scam. Would you notice all those defects in game lighting on a phone with a 6" screen? I think not. But hey, I could be wrong. Sooner or later there'll be someone here claiming 20/5 vision (or better) who can count every pixel on an iPhone from five feet away.

Apple is just regurgitating its hype, claiming you need a phone with a resolution "equal to that of the human eye" as an excuse to introduce 'the next new necessity' for the iSheep. Well, that sold a crap ton of phones to the undereducated, so ray tracing (IMO) is just a "let's see what we can get away with this time", from a pure marketing standpoint.

From a social-effect standpoint, the "better" the phones get, the tighter people's blinders get clamped down as bulwarks against the real world. So many people seem to be living out their lives through a 3" by 6" tunnel. From what I haven't managed to not overhear, people will talk for hours and not have anything worthwhile to say. It's mostly just babbling, seemingly purposed to overcome a fear of being alone.

I can't believe I rambled, (some would call it 'babbled'), on for so long, when what I meant to say was, "Merry Christmas, Burty". :) (y) (Y)
 
But, but, but...I want to see accurate shadows and lighting effects on my 6" screen!
And I'm ready to give the non-removable battery and the passively cooled SoC a pounding in the process.
-
Meanwhile, others couldn't care less about ray tracing even on their desktop PCs.
-
Has anyone seen what Rockstar's GTA V looks like with ray tracing turned up to 11? Stupid. Everything seems made of glass: cars, concrete...you name it.
 
But, but, but...I want to see accurate shadows and lighting effects on my 6" screen!
And I'm ready to give the non-removable battery and the passively cooled SoC a pounding in the process.
-
Meanwhile, others couldn't care less about ray tracing even on their desktop PCs.
-
Has anyone seen what Rockstar's GTA V looks like with ray tracing turned up to 11? Stupid. Everything seems made of glass: cars, concrete...you name it.
OK, first, the only thing you're supposed to turn the knobs to "11" on is a guitar amp, and only then if you've just recently bought it.

It's a bit early for New Years, but let's see if Ms. Lee can calm you down a bit. I can assure you, there's no ray tracing in this video:

 
Damn, just imagine those ads in the App Store rendered with real-time reflections and light-bounce physics!
I struggle to envision the splendor, the grandeur, the veritable visual orgasm that would be. Maybe I'll have to spring for another new card after all.
 
I've never understood the hatred towards ray tracing. All games use completely faked lighting: it's either baked in, causing no change in time of day, or it's dynamic, causing weird issues with objects having no lighting or appearing to float, glow, or cast hard shadows. Lighting in games has been faked well for many years now, but it honestly doesn't look great.

I finally got a GPU recently that's allowed me to experience ray tracing. It's much better; it clearly isn't being used to its full potential yet, but it's a clear step up from the normal fake lighting and reflections.

It's also much easier for developers to implement, making game development much quicker, so why anyone would be against it is baffling to me.
It's more of a pushback from people who noticed that, as soon as ray tracing came out, it was marketed as a feature you absolutely must have. It raises the bar for what kind of GPU you need to turn RT on, which only benefits the leatherman's pockets. Shadows and lighting are fake, as is everything else in a video game; it doesn't matter. Buyer's remorse was strong with people who bought any cards other than the latest RTX ones, as they praised ray tracing while I can simply watch a YT video and see that it isn't possible to achieve playable framerates with their cards.
 
I've never understood the hatred towards ray tracing. All games use completely faked lighting: it's either baked in, causing no change in time of day, or it's dynamic, causing weird issues with objects having no lighting or appearing to float, glow, or cast hard shadows. Lighting in games has been faked well for many years now, but it honestly doesn't look great.

I finally got a GPU recently that's allowed me to experience ray tracing. It's much better; it clearly isn't being used to its full potential yet, but it's a clear step up from the normal fake lighting and reflections.

It's also much easier for developers to implement, making game development much quicker, so why anyone would be against it is baffling to me.

There are several factors behind the "hate".

Among them:

1- Neither the hardware nor the software is there yet to make this the must-have everyone, especially reviewers, is making it out to be.

2- The performance hit is simply ridiculous.

3- It adds very, very little (if anything) to gameplay, and only in some games does it add any visible difference, which falls strictly under eye candy.

RT might be something that one day brings real value to games, but it's easily at least 3 to 4 hardware generations away, and unlike reviewers, who don't pay for overpriced Nvidia GPUs, the rest of us would need those GPUs to be way under US$1K and provide full RT at 4K@120 fps.

Until we get there, this is a useless gimmick, and we hate the fact that it's being rammed down our throats by tech sites, Tubers, and other places.

 
Just to play devil's advocate: when games started getting good at fake lighting, they were absolutely punishing on hardware, Crysis being a prime example. Why wasn't there as much outrage then?

From what I can see, ray tracing is the future for games, as we can't push fake lighting much further without also causing a massive drop in performance and taking far longer to develop games.

I agree with a lot of the points here, by the way; GPUs capable of actually running ray tracing are overly expensive, and the marketing has been the most over-hyped bullsh*t I've seen in a long time.

I don't even think ray tracing will be fully embraced until the next generation of consoles comes out with much better ray-tracing performance than the current one. To make game development faster, developers have to completely abandon baked-in fake lighting and rely fully on ray tracing to do the heavy lifting, which just can't happen this generation.

But I do believe it's the future for games; when it's working, it does look amazing. It's just going to be a 10-year-plus transition period as the tech gets better.
 
I wonder if, in the mobile space, they could skip high and ultra textures and everything else and go straight to RT candies. IMG showcased a video of actual mobile capabilities with RT, although there are no mobile games taking advantage of this yet. One question is whether the RT die space is actually that beefy, since it fits into a mobile GPU but barely into a PC GPU.


"IMG-CXT should deliver up to 1.3 gigarays per second (GRay/s) for ray traced shadows, ambient occlusion, reflections, and global illumination. In comparison, Nvidia's mid-range RTX 2060 desktop GPU delivers around 5 GRay/s."
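To put those throughput numbers in context, some back-of-the-envelope arithmetic shows the per-pixel ray budget they imply (illustrative only; real ray counts vary enormously with scene and effects):

```python
# Rough per-pixel ray budget implied by the quoted gigaray figures.
def rays_per_pixel(gigarays_per_s, width, height, fps):
    return gigarays_per_s * 1e9 / (width * height * fps)

# At 1080p / 60 fps:
print(round(rays_per_pixel(1.3, 1920, 1080, 60), 1))  # IMG-CXT: ~10.4 rays/pixel/frame
print(round(rays_per_pixel(5.0, 1920, 1080, 60), 1))  # RTX 2060: ~40.2 rays/pixel/frame
```

So even the mobile part would, on paper, have around ten rays per pixel per frame to spend at 1080p60 – though peak gigaray figures are marketing numbers and rarely reflect sustained in-game throughput.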

 
I feel Apple has sat on its success with Tim at the helm, slowly becoming the Nokia/BlackBerry it displaced. Besides iOS, which is unique to Apple, it is really falling behind the competition in almost every way. Even its last stronghold, SoC superiority, is getting chipped away year by year. Between the A14 and the A16, the improvement in performance is not looking good. Most of the gain comes from an additional GPU core that bumps overall performance up; otherwise, the CPU side of things is mostly stagnant.
 
I've never understood the hatred towards ray tracing. All games use completely faked lighting: it's either baked in, causing no change in time of day, or it's dynamic, causing weird issues with objects having no lighting or appearing to float, glow, or cast hard shadows. Lighting in games has been faked well for many years now, but it honestly doesn't look great.

I finally got a GPU recently that's allowed me to experience ray tracing. It's much better; it clearly isn't being used to its full potential yet, but it's a clear step up from the normal fake lighting and reflections.

It's also much easier for developers to implement, making game development much quicker, so why anyone would be against it is baffling to me.
I am not sure about hating RT; to me, it is just about practicality. Sure, RT images generally look better and more realistic, but that comes at a steep cost in performance and power consumption. And while most people praise how good RT titles look, I am not sure they would even realize RT is on without looking at the settings or comparing side by side.

In addition, while it could be true that RT makes things easier for developers, it comes at the end users' expense, since you need to spend more money on hardware to get these eye candies. I also doubt it will simplify developers' work, because they will have to ship both RT and non-RT versions, since not many people can afford the high-end SoCs or GPUs needed to run RT-enabled games fluidly.
 
I wonder if, in the mobile space, they could skip high and ultra textures and everything else and go straight to RT candies. IMG showcased a video of actual mobile capabilities with RT, although there are no mobile games taking advantage of this yet. One question is whether the RT die space is actually that beefy, since it fits into a mobile GPU but barely into a PC GPU.


"IMG-CXT should deliver up to 1.3 gigarays per second (GRay/s) for ray traced shadows, ambient occlusion, reflections, and global illumination. In comparison, Nvidia's mid-range RTX 2060 desktop GPU delivers around 5 GRay/s."

Ray tracing can't be used to generate detail that isn't already present in the complexity of the assets -- i.e. a low-poly model with low-resolution textures will still look bad, no matter how accurate the lighting model is.

The amount of die space required for ray-tracing-specific acceleration (e.g. ray-triangle intersection calculations) isn't big -- Imagination claims that a Level 2 RT unit requires 44 times less die area than the equivalent number of shader units needed to perform the same calculations.

But it does depend on what the hardware is doing. For example, AMD's RT units in RDNA 2 just handle ray-triangle intersection calculations, whereas Nvidia's do the same plus carry out BVH traversal operations. In the case of IMG-CXT, there are multiple implementations of the architecture, from the most basic (Level 1), which is entirely software-based, through to a complete hardware-based system similar to Nvidia's. The simplest hardware one is like AMD's (i.e. it just handles intersection calculations).
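For a sense of what those intersection units actually compute, the core ray-triangle test is compact enough to sketch in a few lines. Below is a plain-Python version of the widely used Möller–Trumbore algorithm -- an illustrative sketch of the math, not any vendor's actual hardware implementation:

```python
# Möller–Trumbore ray-triangle intersection: the kind of per-triangle test
# that dedicated RT units accelerate in silicon (illustrative sketch only).

def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def cross(a, b): return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def ray_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Return distance t along the ray to the hit point, or None on a miss."""
    edge1, edge2 = sub(v1, v0), sub(v2, v0)
    h = cross(direction, edge2)
    a = dot(edge1, h)
    if abs(a) < eps:               # ray is parallel to the triangle's plane
        return None
    f = 1.0 / a
    s = sub(origin, v0)
    u = f * dot(s, h)              # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, edge1)
    v = f * dot(direction, q)      # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * dot(edge2, q)          # distance along the ray
    return t if t > eps else None

# A ray fired down +Z from (0, 0, -1) hits a triangle lying in the z = 0 plane:
print(ray_triangle((0, 0, -1), (0, 0, 1), (-1, -1, 0), (1, -1, 0), (0, 1, 0)))
```

In a real renderer this test runs billions of times per second against candidate triangles found by BVH traversal, which is why moving it (and, in fuller designs, the traversal itself) into fixed-function hardware pays off so well per square millimeter of die.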
 