Destiny 2 Benchmarked: 30 GPUs Tested

They still have work to do on the CPU side for AMD, but in general it looks ok.

Runs buttery smooth on a Ryzen 3 so I'm not sure what they need to improve.

Everybody judges performance in a relative manner, meaning that when you see a longer bar in the review you think you will need or feel that difference. In reality, the performance of Ryzen CPUs is almost always more than good enough, and only the hardcore players who want 144 Hz might need to look at the Intel side for certain games.
 
It's not only hardcore players; you'll also need that difference when you want to stick a new GPU in that system in the future.
 
Hey short-haired Steve, a new Radeon driver just came out with huge gains in Destiny 2.
What are you going to do now?
 
So what I'm seeing here is that a three-year-old GTX 970 is at least as good as the most powerful consoles available in this particular game: 4K at 30 FPS.
 
What part of the game was tested here? This is why I can never trust graphs. Here is a video showing the 1080 Ti at 1440p dropping into the 70s.
 
What part of the game was tested here?

From the article:

"Since you can't save your progress during the single-player tutorial/intro, I only tested the first 60 seconds of the game."

Also from the article:

"The maximum quality preset -- called 'highest' -- completely smashes performance because of MSAA being enabled. Seeing as I didn't really notice a difference between MSAA and SMAA, I opted for the second-highest quality preset ('high') which uses the latter anti-aliasing method."

The YouTuber uses the "highest" preset, but with MSAA switched to SMAA. His system specs are also pretty vague: "GTX 1080 TI and a Kaby Lake 7700K", but at what clock speeds and with what memory? Did he have a clean install of Windows, or was he running all sorts of bloatware in the background? What program was he using to capture FPS figures? All of these things can affect performance.

Note also that the graphs do not show the absolute minimum FPS encountered, but the 1% low. You can't deduce the 1% low just by watching the YouTube video, so comparing dips seen in the video to the 1% low graph makes it even more of an apples-to-oranges comparison - which it already is because of the different graphics settings and possibly different hardware as well.
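To see why the distinction matters: a single frame-time spike can drag the absolute minimum way down while barely moving the 1% low. A minimal sketch with made-up frame times (definitions of "1% low" vary between reviewers; this one assumes the average FPS of the slowest 1% of frames):

```python
# Hypothetical frame-time capture (milliseconds): 990 smooth 60 FPS frames,
# nine 50 FPS frames, and one 50 ms spike. Not real benchmark data.
frame_times_ms = [16.7] * 990 + [20.0] * 9 + [50.0]

# Per-frame FPS, sorted from slowest to fastest.
fps = sorted(1000.0 / t for t in frame_times_ms)

# The absolute minimum is just the single worst frame.
absolute_min = fps[0]

# One common "1% low" definition: average FPS of the slowest 1% of frames.
slowest = fps[:max(1, len(fps) // 100)]
one_percent_low = sum(slowest) / len(slowest)

print(f"absolute minimum: {absolute_min:.0f} FPS")   # 20 FPS
print(f"1% low:           {one_percent_low:.0f} FPS")  # 47 FPS
```

One 50 ms hitch makes the "minimum FPS" read 20, but the 1% low still sits at 47, which is why reviewers prefer it: it reflects sustained stutter rather than a single outlier frame you'd never spot in a video.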