10 Games to Work Out Your GPU to the Max

neeyik

Posts: 2,424   +2,975
Staff member
And you actually believe adding 2 layers of API translation to the mix will double the performance, or ... ?
Depending on the game and hardware combination, there are definite gains to be had. It's absolutely not a blanket fix for instant performance, though.

zulu53

Posts: 196   +72
Poor CPU optimization (and software bugs) seems to be the biggest performance limiter for games these days. With the exception of the 4X strategy genre, there are few categories of games that actually utilize all the CPU power you have. Some games truly are limited by the GPU, especially with ray tracing, but some just put too much work on a single thread.
Almost like the game vendors have a financial deal on the side with the GPU vendors to purposely design their software this way?

zulu53

The article explains why GPU Load shouldn't be used as a metric to determine how hard the processor is working.
That's why the article said that power draw should be used as the metric for load. Actually, as with any man-made machine, it should follow nature (just like the human machine) and be the ONLY metric for load. As the riders in the Tour de France found, power output is pretty much the only metric worth following if you want to gauge maximum performance ability.

human7

Posts: 152   +131
Almost like the game vendors have a financial deal on the side with the GPU vendors to purposely design their software this way?
I doubt it. If anything, the deal would be with the CPU vendors, to encourage year-over-year upgrades; CPU bottlenecking just discourages the need for a new GPU every year.
Besides, there's too much competition out there, and such deals would attract a lot of unwanted attention from both consumers and regulators.

neeyik

Actually, like for any man-made machine it should follow nature (just like the human machine) and be the ONLY metric for load
Using just power as the sole GPU load metric has its flaws, though. For example, running Furmark at 1080p will max out the power consumption of pretty much any GPU. Run it at 4K and it will actually use less power:

[Attached screenshots: RTX 2080 Super power draw running FurMark at 1080p and at 4K]

So does this suggest that the GPU isn't being worked as hard at 4K as at 1080p? Of course not, as the frame rate will testify. What's happening is that the shader cores are swamped with pixels to process, so the memory controller/DRAM gets hit less often, and that part of the chip draws less power.

However, at 1080p the fps was 136, so the full frame time was 7.35 milliseconds; at 4K, the time becomes 17.2 ms. With the power demand at 4K being 224W, a total of 3.85 joules of energy was required to render and present the frame; at 1080p, it was 1.85 joules.

Thus more energy was consumed to render at the higher resolution, as one would expect, i.e. the GPU is working 'harder'. But the power figure alone wouldn't necessarily tell us that.
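The arithmetic above can be sketched in a few lines of Python. (The 1080p power draw wasn't quoted directly in the post; ~252 W is back-calculated here from the 1.85 J figure, so treat it as an assumption.)

```python
def energy_per_frame(power_watts, fps):
    """Energy in joules spent on one frame: power multiplied by frame time."""
    frame_time_s = 1.0 / fps
    return power_watts * frame_time_s

# Figures from the post: 136 fps at 1080p; 17.2 ms per frame (~58 fps) at 4K, drawing 224 W.
print(energy_per_frame(224, 1000 / 17.2))  # ~3.85 J per frame at 4K
print(energy_per_frame(252, 136))          # ~1.85 J per frame at 1080p (252 W assumed)
```

The point falls out of the units: a lower power reading combined with a longer frame time can still mean more energy spent per frame.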