Gears 5 at 8K resolution brings the $2,500 Nvidia Titan RTX to its knees

Humza

Bottom line: Although the majority of gamers play at 1440p or 1080p (and for good reason), some like to go all out and build the ultimate gaming PC, one that cuts through AAA titles like a hot knife through butter. However, it seems even money won't solve the current limitations of 8K gaming, as demonstrated by a Gears 5 benchmark that ran five of the most powerful GPUs with the required on-board VRAM through the game to see whether it is playable at 8K.

Gears 5 was praised for its brilliant single-player campaign, even if it got off to a rocky start on PC if Steam reviews are anything to go by. Nonetheless, the PC release makes it possible to run the game at 8K, an experience TweakTown recently showed to be barely playable even on the most powerful "consumer" graphics card currently available: the Nvidia Titan RTX.

The benchmark didn't just include Nvidia's current flagship but four other GPUs as well, all with at least 11GB of VRAM, since the game refused to load at 8K on cards with 8GB. The only cards to qualify besides the Titan RTX were Nvidia's GTX 1080 Ti, RTX 2080 Ti and Titan Xp, and AMD's Radeon VII.
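
Some rough napkin math helps put that VRAM cutoff in context. The sketch below is purely illustrative (actual usage depends on the engine's render targets, texture streaming pool and the 4K texture pack itself), but it shows how quickly full-resolution buffers grow at 8K:

```python
# Illustrative only: pixel counts and the size of a single 32-bit render
# target at each resolution. A modern renderer keeps several such buffers
# (color, depth, G-buffers), on top of textures and geometry.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

def pixels(res):
    w, h = res
    return w * h

def target_mib(res, bytes_per_pixel=4):
    """Size in MiB of one render target at 4 bytes per pixel."""
    return pixels(res) * bytes_per_pixel / (1024 ** 2)

base = pixels(RESOLUTIONS["1080p"])
for name, res in RESOLUTIONS.items():
    print(f"{name:>6}: {pixels(res):>10,} px "
          f"({pixels(res) / base:>4.1f}x 1080p), "
          f"~{target_mib(res):6.1f} MiB per 32-bit buffer")
```

8K works out to roughly 33 million pixels, four times 4K and sixteen times 1080p, so every full-resolution buffer is four times larger than its 4K equivalent before textures even enter the picture.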

The cards were tested on a gaming rig with an Intel Core i7-8700K running at 5GHz (cooled by Corsair's H115i Pro), 16GB of DDR4-2933 HyperX Predator RAM and 512GB + 1TB OCZ NVMe M.2 SSDs on a Z370 AORUS Gaming 7 motherboard, powered by InWin's 1065W PSU.

With the 4K high-resolution texture pack installed, the cards were first benchmarked at 8K on medium settings, with motion blur and vertical sync disabled.

While the Titan RTX rendered the game somewhat playable at nearly 30fps on medium, the average dropped below 20fps (and even lower on the other cards) with the graphics set to Ultra, again with motion blur and vertical sync disabled.

"Gears 5 at 7680 x 4320 with 4K high-res textures and Ultra settings brings every graphics card known to man to its knees. Nvidia's flagship $2499 monster Titan RTX graphics card can't even muster 20FPS, while the 16GB of HBM2 is spinning its wheels it seems with only 12.8FPS on the Radeon VII," noted the publication.

Although such benchmarks may simply be a stress test for these GPUs, they also hint at when 8K gaming might become mainstream, which for 8K at 60fps currently looks to be at least two or three years away.
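
As a rough back-of-the-envelope check on that estimate (not a forecast), the sketch below asks how many generational uplifts it would take to turn the Titan RTX's sub-20fps Ultra result into 60fps. The per-generation gain is an assumed placeholder, and the calculation ignores upscaling and other tricks that could close the gap sooner:

```python
# Hedged extrapolation: generations needed to reach 60 fps at 8K Ultra,
# assuming a flat per-generation performance uplift (placeholder value).
from math import ceil, log

current_fps = 19.0        # roughly what the Titan RTX managed at 8K Ultra
target_fps = 60.0
assumed_uplift = 0.40     # assumed +40% per generation; adjust to taste

speedup_needed = target_fps / current_fps                      # ~3.2x
generations = ceil(log(speedup_needed) / log(1 + assumed_uplift))

print(f"Speed-up needed: {speedup_needed:.1f}x")
print(f"Generations at +{assumed_uplift:.0%} each: {generations}")
```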

Speaking of which, it'll be interesting to see how upcoming consoles like the PS5/PS5 Pro and Xbox Project Scarlett manage to support 8K gaming at high frame rates, especially considering their (expected) lower price of admission compared to an 8K-capable gaming PC.


 
Why are people still talking about 8K gaming on consoles? I think it is pretty clear that they are purely talking about the ability to output at that resolution and maybe play 8K media content. At most we might get some easy-to-run indie games running at 8K.

What we might get to see are some impressive upscaling results from 4K to 8K.
 
8K gaming? Honestly, wouldn't you need a screen measured in square yards (or meters) to benefit from that kind of detail? I must be missing some basic idea - or is the industry just intent on the personal cineplex as the next necessary thing?
 
2560 x 1440 @ vsynced 60FPS+ and ultra settings is good enough for me given present GPU and monitor hardware.

A $500 monitor and a $500 GPU will do the business there. Not cheap, but not extravagant either.

Much above this (i.e. 4K, 60FPS+, ultra) it's a $2,000 monitor with a $1,000 GPU, and the result is still barely satisfactory.
 
I truly wonder what kind of market exists for 8K displays. The vast majority of consumers have no use for anything over 4K, so why are manufacturers spending so much on R&D for something that doesn't have a practical use? Do they think they can trick enough people into buying 8K panels to make it financially worthwhile?

I'm still excited to see GPUs get more powerful for 4K 144Hz+ gaming and VR, but I'm already a little tired of hearing about 8K.
 
8K gaming? Honestly, wouldn't you need a screen measured in square yards (or meters) to benefit from that kind of detail? I must be missing some basic idea - or is the industry just intent on the personal cineplex as the next necessary thing?

8K is totally pointless for gaming, and this is coming from a resolution junkie. I have a 4K TV, phone and PC rig and I'm not going any higher with resolution. As a matter of fact I regret getting a 4K monitor, should have got 1440p instead......
 
as a matter of fact I regret getting a 4K monitor, should have got 1440p instead......

I'm with you. I bought dual 4k screens for my rig, then decided to move more to PC gaming like a dumb dumb. Even a 9th gen i7 + RTX2070 is a struggle.
 
I'm with you. I bought dual 4k screens for my rig, then decided to move more to PC gaming like a dumb dumb. Even a 9th gen i7 + RTX2070 is a struggle.

I'm running a 2700X + 32GB RAM + 1975MHz Radeon VII and it's a struggle too.....
 
What's the point of this article?

To show that 8k gaming is not feasible in the near future. Also, when next gen consoles refer to "8k capable", they are obviously referring to 8K movie playback, NOT gaming... unless the consoles are coming with a Titan RTX class card, which would make them unaffordable for most consumers.

Mind you, for 8K gaming they wouldn't be targeting the average consumer, as the average consumer can't even afford 4K gaming. They would be targeting the rich, and the profit margins from charging the rich are MUCH higher than they are for the average consumer. I believe I recall reading they only make 20%-30% net profit from the average $250 video card, but they make 60%-70% net profit from the $1000+ cards. Remember, the GPUs are all binned from the same wafer, so when Nvidia or AMD make GPUs they're paying the foundry for each wafer and have to see which GPUs can handle Titan speeds, and which can only handle RTX 2060 speeds. So the low-end 2060 GPUs aren't really giving much profit from that wafer of RTX GPUs; the RTX 2080 to Titan class cards are.
 
To show that 8k gaming is not feasible in the near future. Also, when next gen consoles refer to "8k capable", they are obviously referring to 8K movie playback, NOT gaming... unless the consoles are coming with a Titan RTX class card, which would make them unaffordable for most consumers
The output resolution might well be 8K, but the internal resolution absolutely won't be for the vast majority of "8K" games. It'll be like how the Xbox One and PS4 handle 1080p, or the Xbox One X and PS4 Pro handle 4K - a combination of dynamic resolution scaling and a lower internal render resolution.
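
For anyone unfamiliar with the technique, here's a minimal sketch of the dynamic resolution idea described above: render internally at a variable fraction of the output resolution and adjust that fraction based on how long frames are taking. The frame times and the simple step controller are made up for illustration; real engines are far more sophisticated.

```python
# Toy dynamic resolution scaling: shrink the internal render resolution when
# frames run over budget, and grow it back slowly when there's headroom.
TARGET_FRAME_MS = 33.3        # ~30 fps budget, a common console target
OUTPUT_RES = (7680, 4320)     # "8K" output resolution
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def update_render_scale(scale, last_frame_ms):
    """Nudge the internal resolution scale toward the frame-time budget."""
    if last_frame_ms > TARGET_FRAME_MS:
        scale -= 0.05         # over budget: render fewer pixels next frame
    else:
        scale += 0.02         # under budget: claw resolution back slowly
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for frame_ms in [45.0, 42.0, 38.0, 35.0, 33.0, 31.0, 30.0]:   # made-up timings
    scale = update_render_scale(scale, frame_ms)
    w, h = int(OUTPUT_RES[0] * scale), int(OUTPUT_RES[1] * scale)
    print(f"{frame_ms:>5.1f} ms -> render {w}x{h} ({scale:.2f}x), "
          f"upscale to {OUTPUT_RES[0]}x{OUTPUT_RES[1]}")
```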

Edit:
Remember, the GPUs are all binned from the same wafer, so when Nvidia or AMD make GPUs they're paying the foundry for each wafer and have to see which GPUs can handle Titan speeds, and which can only handle RTX 2060 speeds. So the low-end 2060 GPUs aren't really giving much profit from that wafer of RTX GPUs; the RTX 2080 to Titan class cards are.
The likes of Nvidia use 3 different chip designs for the RTX models:

TU102 - 2080 Ti, Titan RTX
TU104 - 2070 Super, 2080, 2080 Super
TU106 - 2060, 2060 Super, 2070

plus all the relevant mobile and Quadro models too

One wafer can only support one chip design, so a TU106 wafer will be used to produce processors for the listed models above and nothing else. Since the TU106 has a die area of 445 mm² compared to 745 mm² for the TU102, the same size wafers (typically 300 mm in diameter) will yield more chips for the former than for the latter. So a TU106 wafer can actually generate more profit than a TU102 wafer, despite the markup on the final graphics card, especially as the 102 is only used in a total of 4 models compared to 8 for the 106.
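
To put some numbers on that, here's a quick gross-die estimate for a 300 mm wafer using the common dies-per-wafer approximation, with the die areas quoted above. It ignores defect yield, which penalizes large dies even further:

```python
# Approximate gross dies per 300 mm wafer: pi*r^2/A - pi*d/sqrt(2*A).
# Defect yield is ignored, so the real-world gap is wider still.
from math import pi, sqrt

WAFER_DIAMETER_MM = 300

def gross_dies(die_area_mm2, wafer_d=WAFER_DIAMETER_MM):
    r = wafer_d / 2
    return int(pi * r * r / die_area_mm2 - pi * wafer_d / sqrt(2 * die_area_mm2))

for name, area in [("TU106", 445), ("TU102", 745)]:
    print(f"{name} ({area} mm^2): ~{gross_dies(area)} candidate dies per wafer")
```

That works out to roughly 127 TU106 candidates per wafer versus about 70 for TU102, before yield losses are even considered.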
 
I grew up playing Quake 3 on a CRT with damn near perfect motion and responsiveness.

You can't sell me this low frame junk. If I wanted 24fps content I'd watch a movie.

/old man yells at cloud
 
To show progression in performance and technology. This is a tech site.
That tech and performance can be both low and middle tier as well. Grok Android Go, play 'enhanced' Pac-man, build Raspberry Pi, DIY Pringles antenna are all good tech (well, ok, sometimes it is tacky tech) - but all to a good cause. I'm looking forward to an article on a community-built mesh network - could be fun.
 
That tech and performance can be both low and middle tier as well. Grok Android Go, play 'enhanced' Pac-man, build Raspberry Pi, DIY Pringles antenna are all good tech (well, ok, sometimes it is tacky tech) - but all to a good cause. I'm looking forward to an article on a community-built mesh network - could be fun.

This is a specific scenario: 8K. 4K is already playable, and manufacturers are already making 8K displays, so it only makes sense to see what today's hardware can do with it. Now we know, we know how far we have to go, and you won't have to ask in the future if you were ever wondering.

I don't see the problem with that, and that was what my comment was specifically about.
 
What's the point of this article?

To show progression in performance and technology. This is a tech site.

"Techies" commenting here are almost offended it exists which continues to shock me for some reason.
This isn't showing progression though, this is just annoying.

Any techie worth their salt knows nothing can push 8K at this moment - well, no single-card solution. If they showed a rig with the parts to actually push that resolution - what's needed, pricing and performance - then it would be interesting. That's why I asked "what's the point?"
 