10 Games to Work Out Your GPU to the Max

Instruments of Destruction is worth a mention. It's good at bringing GPUs to their knees, especially once the destruction kicks in, as it seems to calculate everything on the GPU.
 
Also try GTA V with grass detail set to anything higher than 'normal'.

That would cripple my 3080, taking it from a rock-solid 4K 60 fps down into the low 30s.

I'm assuming it's a CPU draw call bug they've never bothered to fix, rather than a genuine need for that much GPU grunt. Regardless, it will make any top-end GPU collapse in on itself.
 
You may try Watch Dogs 1. It's not demanding, but for some reason it pushed the GPU temperature up to 90°C.
 
You may try Watch Dogs 1. It's not demanding, but for some reason it pushed the GPU temperature up to 90°C.

Something like that isn't unheard of. I noticed a similar thing with Sniper Elite 3. I could max the game out on my GTX 570s in SLI without issues (at 5760x1080). However, the cards ran at 90-95°C and the fans were just screaming. I had to drop some settings to keep the cards closer to the 85°C mark.

The game wasn't that demanding, but for some reason it just worked my GPUs hard.
 
Just play Killing Floor 2 or StarCraft 2. Even though they're ancient, they'll never max out a GPU on Ultra @ 2160p. Never.
 
Also try GTA V with grass detail set to anything higher than 'normal'.

That would cripple my 3080, taking it from a rock-solid 4K 60 fps down into the low 30s.

I'm assuming it's a CPU draw call bug they've never bothered to fix, rather than a genuine need for that much GPU grunt. Regardless, it will make any top-end GPU collapse in on itself.
If I remember correctly, when you max out the grass settings in GTA V, it renders every blade AND its shadow, which is why it murders frame rates. The grass setting is what I used for years to judge the cards I bought; only my current one, an RTX 3070 Ti, is able to run grass maxed out along with those two advanced sliders whose names I forget.
 
A Plague Tale does not support FSR, as of today. It's been added by modders, but the game devs have not added it yet.
 
A Plague Tale does not support FSR, as of today. It's been added by modders, but the game devs have not added it yet.
Yep, brain failure on my part. That said, I could have sworn I saw a document a while ago that said FSR was going to be there, along with DLSS. Anyway, thanks for the info - will fix the text shortly.
 
Hopefully, when Crysis 4 launches, we'll be able to revisit the age-old question and add it to this list: can it run Crysis?
 
Hopefully, when Crysis 4 launches, we'll be able to revisit the age-old question and add it to this list: can it run Crysis?
Ah, the days of brute-force computing. I surely don't miss them. Crytek has to learn to code for more than two cores before competing with the current big boys: Epic's Unreal Engine 5, CD Projekt Red, Rockstar, and even Ubisoft, which made it onto the list in the article.

Anyone know when DLSS 3 is coming to the public build of Cyberpunk?

Another game that will put your hardware to shame is the upcoming co-op game Warhammer 40,000: Darktide. The beta was definitely a sh!t show for my old 3090 XC3 Ultra Hybrid at 4K, while Vermintide 2 maxed out at 4K averaged 115 fps.
 
Ah, the days of brute-force computing. I surely don't miss them. Crytek has to learn to code for more than two cores before competing with the current big boys: Epic's Unreal Engine 5, CD Projekt Red, Rockstar, and even Ubisoft, which made it onto the list in the article.
Poor CPU optimization (and software bugs) seems to be the biggest performance limiter for games these days. With the exception of the 4X strategy genre, there are few categories of games that actually utilize as much CPU as you have. Some games truly are limited by the GPU, especially with ray tracing, but some just put too much in a single thread.
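A rough way to spot that single-thread limit yourself is to watch per-core load while the game runs: one core pinned near 100% while the rest sit mostly idle. A minimal sketch using Python's psutil package (the 90%/40% thresholds are illustrative guesses, not from the article):

```python
# Sample per-core CPU load while a game runs. One core pegged while the
# average stays low suggests the game is limited by a single thread.
import psutil

def sample_cores(seconds: int = 10) -> None:
    for _ in range(seconds):
        per_core = psutil.cpu_percent(interval=1, percpu=True)
        busiest = max(per_core)
        average = sum(per_core) / len(per_core)
        # Illustrative thresholds, not measured values:
        flag = "  <- possible single-thread limit" if busiest > 90 and average < 40 else ""
        print(f"busiest core {busiest:5.1f}%  average {average:5.1f}%{flag}")

sample_cores()
```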
 
10 Games to Work Out Your GPU to the Max

*Sigh*... just run any game at 4K with all settings maxed out.

And as others have said, poor game optimizations will stress the card without you doing anything.
 
Poor CPU optimization (and software bugs) seems to be the biggest performance limiter for games these days. With the exception of the 4X strategy genre, there are few categories of games that actually utilize as much CPU as you have. Some games truly are limited by the GPU, especially with ray tracing, but some just put too much in a single thread.
I've seen CPU utilization go up in some titles outside of RTS games, specifically in Vermintide 2 when there are a lot of NPCs, especially when staggering them by ramming into them, and in Cyberpunk in the most NPC-dense areas of the city. I'm using a 9900KS at 4K. While playable in both instances, some developers seem to know how to code better than others.
FYI, in Vermintide 2 you can even toggle how many cores the game scales across; I believe it's capped at 14 threads.
 
*Sigh*... just run any game at 4K with all settings maxed out.
Except that for some games, depending on the graphics card used, 4K + max settings will induce asset swapping. You'll get super low fps, but the GPU itself will barely be doing anything.

As covered in the article :)
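One way to catch that situation is to log VRAM usage alongside GPU utilization: memory sitting full while the GPU stays mostly idle points to assets being swapped in over the bus. A minimal sketch using NVIDIA's NVML through the pynvml package (device index 0 and the 95%/50% thresholds are assumptions):

```python
# Poll VRAM usage and GPU utilization together. Full memory plus low GPU
# utilization hints that assets are being swapped in over PCIe.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the GPU is device 0
try:
    for _ in range(30):  # roughly 30 seconds of samples
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        used_pct = 100 * mem.used / mem.total
        # Illustrative thresholds, not measured values:
        note = "  <- possible asset swapping" if used_pct > 95 and util.gpu < 50 else ""
        print(f"VRAM {used_pct:5.1f}% full, GPU {util.gpu:3d}% busy{note}")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```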
 
Except that for some games, depending on the graphics card used, 4K + max settings will induce asset swapping. You'll get super low fps, but the GPU itself will barely be doing anything.

As covered in the article :)
I concur. My 9900KS is definitely bottlenecking my 4090 at 4K. Luckily, I have something to look forward to with Zen 4 3D.
Even with the fierce CPU competition these past few years, we are finally getting to the point where GPU generational performance gains are outpacing CPU performance gains.
The 2022, and probably the 2023, flagship CPUs should be fine for 4K 240 Hz gaming.
FYI, a significant CPU bottleneck is anything greater than a 5% delta.
TechPowerUp has a great article comparing the 5800X vs the 5800X3D in 53 games, and even at 4K there was an average gain of 6.8%.
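For reference, that delta is just the percentage fps gain from swapping the CPU while keeping the same GPU; the numbers below are made-up placeholders, not TechPowerUp's measurements:

```python
# Percentage fps delta between two CPUs driving the same GPU. Per the
# comment above, anything over ~5% counts as a significant CPU bottleneck.
def cpu_delta_pct(fps_faster_cpu: float, fps_slower_cpu: float) -> float:
    return 100 * (fps_faster_cpu - fps_slower_cpu) / fps_slower_cpu

# Made-up placeholder averages, not TechPowerUp's data:
delta = cpu_delta_pct(fps_faster_cpu=106.8, fps_slower_cpu=100.0)
print(f"{delta:.1f}% delta -> {'significant' if delta > 5.0 else 'minor'} bottleneck")
```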
 
I hoped the article would mention the different kinds of "99% load", i.e. when two games both load the GPU at 100% but temperatures and power draw vary by 50%. It could be that some functional blocks are underloaded, that there's an almost unnoticeable CPU bottleneck, or other reasons I don't recall - not just an enumeration of all the fancy-graphics games everyone knows anyway...
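You can see that gap for yourself by logging utilization, power, and temperature side by side while switching games. A minimal sketch with pynvml (device index 0 assumed):

```python
# Log GPU utilization next to power draw and temperature. Two games can both
# show ~100% "load" while drawing very different power, because utilization
# only says some unit was busy, not which units or how hard.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the GPU is device 0
try:
    while True:
        load = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu  # percent
        watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000    # milliwatts -> W
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        print(f"load {load:3d}%  power {watts:6.1f} W  temp {temp:3d} C")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```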
 
TechPowerUp has a great article comparing the 5800X vs the 5800X3D in 53 games, and even at 4K there was an average gain of 6.8%.
The most interesting aspect of their testing was the use of an RTX 4090. With 24GB of memory, it's not going to be doing any asset swapping at all, even at 4K. That means all those gains come entirely from the additional L3 cache. But then, looking at Digital Foundry's testing, CPUs such as the Core i5-12400F outperform the 5800X3D in some games, despite having a far smaller amount of L3 cache. The obvious conclusion is that AMD's fundamental CPU design very much likes cache.
 
I hoped the article would mention the different kinds of "99% load", i.e. when two games both load the GPU at 100% but temperatures and power draw vary by 50%.
The article explains why GPU Load shouldn't be used as a metric to determine how hard the processor is working.
 