Today we're taking a look at Halo Infinite's graphics performance by testing over 30 Nvidia and AMD GPUs at three resolutions. We'll also take a look at quality preset scaling and VRAM requirements.
The Halo Infinite single player campaign was released just days ago on December 8, and with this sixth main entry in the Halo series, supersoldier Master Chief is back at it. For testing we're using a demanding section of the campaign, which is well suited to GPU stress testing and far more consistent than testing in the multiplayer portion of the game.
Halo Infinite's multiplayer has been in open beta since November 15 and it's a free-to-play affair, which is extremely cool and we wish more games did this. Reviews have also been out for a little while. The game has been well received with 80% positive reviews on Steam from over 100,000 players who have given their two cents' worth.
Personally, I've found the single player campaign enjoyable. It seems to run quite well on modest hardware and I've not run into any bugs or stability issues, which in itself is a big win these days. It's also available on Steam, so you can avoid the Windows Store if you want, which is nice.
As for the game engine, developer 343 Industries is using their own in-house "Slipspace Engine." This was purpose-built for Halo Infinite, although its roots lie in a heavily updated version of the Blam! engine. The updates have afforded the developer more creative and technical freedom, and that shows in the game's environments and mechanics. The custom-designed engine also enables Halo Infinite to evolve as a platform, with new content, mechanics, and stories. It also allows Infinite's non-linear and sprawling campaign to function with the addition of real-time exterior lighting created by an in-game day/night cycle.
Of course, what we want to know is how well optimized Halo Infinite is, and what kind of hardware you will need to play the game at your desired frame rate.
Our benchmark pass has been done in the Outpost Tremonius mission, starting from when the blast door first opens. There's a heap of enemies here and loads of action. With the difficulty set to benchmark mode, a.k.a. easy, we could run past all the enemies and rarely were they able to kill us and spoil the benchmark pass... we're just too quick.
The test data comprises benchmarks for 33 different GPUs at 3 resolutions as well as a few extra tests mentioned earlier. We're using our Ryzen 9 5950X test system with 32GB of dual-rank, dual-channel DDR4-3200 CL14 memory. Display drivers used are AMD's Adrenalin 21.12.1 Optional and Nvidia's GeForce Game Ready Driver 497.09.
Let's now jump into the data...
Starting with the 1080p resolution, we find GeForce GPUs at the top of our graph, with the RTX 3090 beating the 6900 XT by a comfortable 10% margin, while the RTX 3080 was also 10% faster than the 6800 XT, so this one goes in favor of the green team.
These high-end GPUs were all good for around 90 fps or better at 1080p, so not amazing performance given the hardware being used. That said, previous generation hardware still performed very well. The RTX 2080 Ti, for example, was good for 86 fps, placing it just ahead of the RX 6800 and 3060 Ti...
The RTX 2070 Super was also impressive with 72 fps on average, making it 22% faster than the 5700 XT. That's a huge win for Nvidia, especially given the 5700 XT has matched and even beaten the more expensive GPU in many recent games.
The new Radeon RX 6600 XT only matched the previous-gen RTX 2060 Super, which is a disappointing result, and the standard RX 6600 was slower than the standard RTX 2070. This is a weak result for the lower-end RDNA2 GPUs and first-gen RDNA.
Pascal GPUs also had a weak showing, with the GTX 1080 Ti only good for 54 fps on average, and while that's still very playable, it's only RTX 2060 levels of performance. Then we find the RX 5700 with just 53 fps on average, and beyond that we're down in the 40s.
Ideally, you won't want to go slower than a GTX 1080, as Vega 56 and the GTX 1070 were good for just 40 fps on average. The last playable cards include the 5600 XT and the 8GB version of the RX 5500 XT. Once again, we have evidence that 4GB graphics cards are no longer sufficient, at least not without heavily compromising on visuals.
Bumping up the resolution to 1440p sees Nvidia's Ampere GPUs pull further ahead. Now the RTX 3090 sits at 90 fps on average, which is 14% faster than the 6900 XT. The RTX 3080 was also 11% faster than the 6800 XT, which was good for 75 fps on average, only a few frames more than the vanilla RTX 3070. These are all great results for the single player experience.
The Turing-based GPUs do well in Halo Infinite as the RTX 2080 Ti matched the RTX 3070, and that's a best case result for the previous-gen part. Meanwhile, the RX 6800 was only a fraction faster than the 3060 Ti, which was a little faster than the 6700 XT.
The Radeon RX 6600 XT drops down to 51 fps on average, which is about what we got out of the standard RTX 3060. This time the 5700 XT managed to match the RTX 2070 and 2060 Super -- a decent result, but not nearly as impressive as what we've seen in other recently released games.
The 5700 XT managed to edge out the standard RX 6600, as well as the GTX 1080 Ti. Then we have the RX 5700 beating the RTX 2060, followed by the GTX 1660 Super, 1660 Ti and GTX 1080. Beyond that we're dropping into the 30s, making parts like Vega 56, the GTX 1070, and slower GPUs unsuitable for these quality settings.
Finally we have our 4K benchmark and it's pretty straightforward as just a few GPUs manage to push over 50 fps on average. Nvidia dominates the top of our graph again with the GeForce RTX 3090, 3080 Ti and 3080 all beating the 6900 XT, while the 6800 XT was good for 54 fps on average. Then we see the RTX 3070 Ti dropping down to 50 fps, closely followed by the RX 6800 and RTX 2080 Ti.
Anything less than that and we run into the low 40s, so GPUs slower than the Radeon RX 6700 XT or GeForce RTX 3060 Ti aren't up to smooth 4K gameplay.
All the tests above were run using the Ultra quality preset, which is the highest available. But how much extra performance can you squeeze out of these GPUs with lower quality settings?
In the case of the Radeon RX 6900 XT, lowering the preset to High boosted frame rates by 11%, which is a minor improvement. Then from High to Medium we see a further 17% increase and from Medium to Low a massive 44% boost for a 176 fps average.
Similar scaling results were seen with the RTX 3090 which saw a 15% performance uplift from Ultra to High, then 14% from High to Medium and finally 19% from Medium to Low. The low preset didn't offer the same kind of boost that we saw with RDNA2, but this is to be expected with Ampere at 1080p.
With slower GPUs we found relatively similar results. AMD's old Vega 56 and the GeForce GTX 1080 saw a 15-18% increase going from Ultra to High, then a 17-20% increase from High to Medium, and a 26-33% increase from Medium to Low.
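Because each step is measured against the previous preset, the per-step gains compound multiplicatively rather than simply adding up. Here's a quick sketch of the arithmetic using the percentages reported above (the helper function is ours, purely illustrative):

```python
def total_uplift(step_gains_pct):
    """Compound a list of per-step percentage gains into one total gain.

    E.g. +11% then +17% is 1.11 * 1.17 = 1.2987, i.e. ~30% overall,
    not 28%.
    """
    factor = 1.0
    for gain in step_gains_pct:
        factor *= 1 + gain / 100
    return round((factor - 1) * 100)

# RX 6900 XT at 1080p: Ultra->High +11%, High->Medium +17%, Medium->Low +44%
print(total_uplift([11, 17, 44]))  # 87, i.e. Low is ~87% faster than Ultra

# RTX 3090 at 1080p: +15%, +14%, +19%
print(total_uplift([15, 14, 19]))  # 56
```

Working backwards, an ~87% total uplift landing at 176 fps puts the 6900 XT's Ultra result in the mid-90s, which lines up with the 1080p chart.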
Visual Quality Comparison
To put the preset scaling data into perspective, watch the video below for a quick visual comparison. Frankly, there isn't a huge difference between these presets, especially when comparing Medium, High and Ultra. Low stands out more, with poorer lighting and shadow quality. Overall, it's quite surprising just how little difference there is between the various presets. We know Ultra typically offers very little over High, and the performance difference in Halo Infinite wasn't huge, but even the difference between Medium and Ultra is minimal.
Some of the more noticeable differences include shadow details and crisper looking textures, especially distant textures, but again these are subtle. In our opinion, the difference in indoor environments with shorter draw distances is even less obvious, and we could easily see ourselves playing on the Medium preset if needed.
Here's a look at VRAM usage corresponding to the section of the game we tested (third mission called Outpost Tremonius). Please do note that testing VRAM usage is not exact as different sections of the game will require more or less memory, so there's a chance that the demand will be even higher in other parts of the game.
For 1080p, you want a minimum of 6GB when using the Ultra quality preset, and we certainly saw evidence of this with 4GB graphics cards suffering a lot at this resolution.
Then at 1440p you'll ideally want a minimum of 8GB, though GPUs like the RTX 2060 played fine. For those of you gaming at 4K, VRAM is less of a concern in the sense that most GPUs capable of delivering satisfactory results here have more than 8GB.
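As a rough rule of thumb, the VRAM guidance above boils down to a simple lookup. This is only a sketch of our Ultra-preset observations (the dictionary and function names are ours), and as noted, other sections of the game may demand more:

```python
# Minimum VRAM observed for the Ultra preset in the section we tested.
# Treat these as guidance, not hard limits: demand can be higher elsewhere.
VRAM_MINIMUMS_GB = {
    "1080p": 6,
    "1440p": 8,
    "4K": 8,  # most 4K-capable GPUs already carry more than this
}

def meets_vram_minimum(resolution, card_vram_gb):
    """Check a card's VRAM against our Ultra-preset guidance."""
    return card_vram_gb >= VRAM_MINIMUMS_GB[resolution]

print(meets_vram_minimum("1080p", 4))  # False: 4GB cards struggled at 1080p
print(meets_vram_minimum("1440p", 8))  # True
```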
What We Learned & Boosting Performance
First, let's talk about quality presets. Usually we try to test at least two presets across all 30+ GPUs, but this time we took a shortcut: we settled on Ultra, plus some reasonably detailed preset scaling results and a basic image quality comparison. What we found is that most gamers wanting to squeeze more performance out of their GPUs in Halo Infinite should give the Medium preset a try, as it boosts performance by at least 30%.
Both the RTX 3090 and 6900 XT were 30% faster using Medium as opposed to Ultra, while the GTX 1080 and Vega 56 were both 38% faster. For the older GPUs, that's the difference between ~40 fps and something closer to 55 fps at 1080p, which has a huge impact on how smoothly the game plays.
Another noteworthy mention is "Async Compute," which can be found towards the bottom of the "Video" menu. For some reason this option is disabled by default. All modern GPUs support async compute, and we found that enabling it boosts frame rates by 5-10%, depending on the quality settings and GPU.
For those of you happy to use the Medium preset at 1080p and are seeking a 60 fps experience, this should be possible with a GTX 1080, 1660 Ti, 1660 Super or Radeon RX 5700. For the same performance using Ultra you'll require a 5700 XT, RX 6600, RTX 2070 or RTX 3060, so the hardware requirements on PC are quite steep. Unfortunately, for those of you still clinging to older hardware such as the Radeon RX 580 or GeForce GTX 1060, you'll need to use the lowest quality settings for over 40 fps at 1080p.
Also of note, we are not alone in testing Halo Infinite, and other well respected outlets like Computer Base, KitGuru and GameGPU have run similar tests with varied results. There is a follow-up video on Hardware Unboxed that explains where the differences lie and, more importantly, why some of those publications are reporting higher frame rate averages with Radeon GPUs. In a nutshell, two of those sites used a benchmark run that is not as demanding, while the third used the same mission we did, and in that case the results closely match ours. None of these results are right or wrong; we simply tested different parts of the game, and as long as all GPUs were tested under the same conditions, the results are valid.
Finally, Halo Infinite appears to be very CPU intensive. We played the campaign for an hour on a Ryzen 5 3600 and while performance was generally good, the CPU was constantly maxed out, often hitting 100% load when paired with a high-end GPU. RAM usage hovered around 10 GB in total while playing, so the game should be perfectly playable with 16GB, but 32GB is always a nice luxury.