There are plenty of amazing games with incredible graphics that will push your GPU to its absolute limit. Here are ten great choices for some graphics card calisthenics, along with tips on which settings to use!
You might try Watch Dogs 1. It isn't demanding, but for some reason it was pushing my GPU temperature up to 90 °C.
Also try GTA V with grass detail set to anything higher than 'normal'.

If I remember correctly, when you max out the grass setting in GTA V it renders every blade AND its shadow, which is why it murders frame rates. The grass setting is what I used for years to judge the cards I bought; only my current one, an RTX 3070 Ti, is able to run grass maxed out along with those two advanced sliders whose names I forget.
That would cripple my 3080, taking it from a rock-solid 60 fps at 4K down into the low 30s.
I'm assuming it's a CPU draw-call bug they've never bothered to fix rather than a genuine need for that much GPU grunt. Regardless, it will make any top-end GPU collapse in on itself.
A Plague Tale does not support FSR as of today. It's been added by modders, but the game's devs have not added it yet.

Yep, brain failure on my part. That said, I could have sworn I saw a document a while ago that said FSR was going to be there, along with DLSS. Anyway, thanks for the info; I'll fix the text shortly.
Hopefully, when Crysis 4 launches, we'll be able to revisit the age-old question and add it to this list: can it run Crysis?

Ah, the days of brute-force computing. I surely don't miss them. Crytek has to learn to code for more than two cores before competing with the current big boys: Epic's Unreal Engine 5, CD Projekt Red, Rockstar, and even Ubisoft, which made it onto the list in the article.
Poor CPU optimization (and software bugs) seems to be the biggest performance limiter for games these days. Outside of the 4X strategy genre, there are few categories of games that actually use as much CPU as you have. Some games truly are limited by the GPU, especially with ray tracing, but others just put too much on a single thread.
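If anyone wants to check this on their own machine, here's a minimal sketch (not from the article; it assumes an NVIDIA card and that the psutil and pynvml Python packages are installed) that watches per-core CPU usage next to GPU utilization while a game is running. One core pegged near 100% while the GPU sits well below full load is the classic single-thread bottleneck signature.

```python
# Rough check for "one maxed-out core while the GPU has headroom".
# Assumes an NVIDIA GPU plus the psutil and pynvml (nvidia-ml-py) packages.
import psutil
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        cores = psutil.cpu_percent(interval=1.0, percpu=True)     # % per logical core
        gpu_util = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu  # % of time GPU was busy
        busiest = max(cores)
        avg = sum(cores) / len(cores)
        tag = "  <-- looks single-thread bound" if busiest > 90 and gpu_util < 80 else ""
        print(f"busiest core {busiest:5.1f}% | avg core {avg:5.1f}% | GPU {gpu_util:3d}%{tag}")
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```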
I've seen CPU utilization go up in some titles outside of RTS games too: specifically Vermintide 2 when there are a lot of NPCs, especially when staggering them by ramming into them, and Cyberpunk 2077 in the city's densest NPC areas. I'm using a 9900KS at 4K. Both were still playable, but some developers seem to know how to code better than others.
*Sigh*... just run any game at 4K with all settings maxed out.

Except that for some games, depending on the graphics card used, 4K plus max settings will induce asset swapping. You'll get super low fps, but the GPU itself will barely be doing anything.
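For the curious, here's a rough way to see that in practice: a minimal sketch, assuming an NVIDIA card and the pynvml (nvidia-ml-py) package, neither of which the comment above mentions. If dedicated VRAM is pinned at the card's limit while reported GPU utilization drops, the low fps is probably coming from assets shuffling over PCIe rather than from shader work.

```python
# Quick way to spot VRAM oversubscription ("asset swapping"): VRAM stuck at the
# card's limit while GPU utilization tanks usually means frames are stalling on
# PCIe transfers, not on rendering. Assumes an NVIDIA card and pynvml.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

for _ in range(60):                                    # sample once a second for a minute
    mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
    util = pynvml.nvmlDeviceGetUtilizationRates(gpu)
    used_gb = mem.used / 1024**3
    total_gb = mem.total / 1024**3
    flag = "  <-- VRAM full, likely swapping" if mem.used > 0.95 * mem.total and util.gpu < 70 else ""
    print(f"VRAM {used_gb:5.1f}/{total_gb:4.1f} GB | GPU {util.gpu:3d}% busy{flag}")
    time.sleep(1)

pynvml.nvmlShutdown()
```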
You have to have a good graphics card for this. When I want to impress people I play Sokoban on ultra settings.
I concur. My 9900KS is definitely bottlenecking my 4090 at 4K. Luckily, I have something to look forward to with Zen 4 3D.
As covered in the article!
TechPowerUp has a great article comparing the 5800X and the 5800X3D across 53 games; even at 4K there was an average 6.8% gain.

The most interesting aspect of their testing was the use of an RTX 4090. With 24 GB of memory, it's not going to be doing any asset swapping at all, even at 4K, which means all those gains are coming entirely from the additional L3 cache. But then, looking at Digital Foundry's testing, CPUs such as the Core i5-12400F outperform the 5800X3D in some games despite having far less L3 cache. The obvious conclusion is that AMD's fundamental CPU design very much likes cache.
I hoped the article would mention the different kinds of "99% load", i.e. when two games both load the GPU to 100% but temperatures and power draw vary by 50%.

The article explains why GPU load shouldn't be used as a metric to determine how hard the processor is actually working.
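To make that concrete, here's a minimal logging sketch (again assuming an NVIDIA card and the pynvml package, which are my additions rather than anything from the article). Two games can both report ~100% utilization while drawing very different amounts of power and running at very different temperatures, because the utilization counter only says work was queued, not how much of the chip was actually lit up.

```python
# Log GPU utilization, power draw, and temperature side by side so that two
# "99% load" games can be compared honestly. Assumes an NVIDIA card and pynvml.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

print("util%  power_W  temp_C")
for _ in range(30):                                            # ~30 seconds of samples
    util = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu
    watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0       # NVML reports milliwatts
    temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
    print(f"{util:4d}   {watts:6.1f}   {temp:5d}")
    time.sleep(1)

pynvml.nvmlShutdown()
```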
Just play Killing Floor 2 or StarCraft 2. Even though they're ancient, they'll never max out on Ultra @ 2160p. Never.

DXVK (or its async fork) may help here.