Red Dead Redemption 2 PC Graphics Benchmark: What You'll Need to Play

Looking forward to Tim's guide on this one; something in the settings does not play nicely with GTX cards.
 
Thanks for the review. Still running a 980, so glad I haven't bought this game yet. Oh well. Graphics card prices are still insane for minimal improvement, plus I won't downgrade to Win10 to build a new rig. :/

These results won't sell hardware for me.
Actually there is a program that will allow you to run the supposedly "Windows 10 only" hardware on Windows 7. It is called mufuc and worked fine for us on a recent Ryzen 5 2600 build with an RTX 2060.
 
The GTX 1060 at 1080p is delivering a mind-blowing 34/38 fps. Why would you consider anything slower than that when even a 1060 is useless in this game at High settings? There are 21 cards on the benchmark charts.
But as we know Steve: if there is enough demand for a Low-settings benchmark with low-end cards, just to see the absolute minimum you can play this game on, he will deliver it.

Actually, the RX 560 can do 24 fps at 1080p High, which is somewhat playable:
https://www.game-debate.com/low-vs-...-red-dead-redemption-2/3961-radeon-rx-560-4gb

But it would be interesting to see a benchmark of low-end cards at low/medium settings.
 
Listen, if none of Nvidia's GTX 10-series cards can go above 60 fps, then in my book the game isn't coded or optimized properly. No way in hell will I upgrade for one game. Rockstar has a tendency to put low effort into the mid-to-low range anyway, so I wasn't expecting anything from them. I have a GTX 1060 6GB and I'm surprised the game won't go above 40 fps, which is shameful.
I got an MSI laptop with a GTX 1060 back in college, which was a very popular choice among my friends (almost 80 percent of the gaming laptops in my dorm had a 1060). After I finished college I built a rig with an i7-8700K and an RTX 2070, and another with an i9-9900K and an RTX 2080 Ti for an Ubuntu server. Now I play on the 2070 rig... time to switch the GPUs between the rigs, I think.
 
Actually there is a program that will allow you to run the supposedly "Windows 10 only" hardware on Windows 7. It is called mufuc and worked fine for us on a recent Ryzen 5 2600 build with an RTX 2060.

I did some searching and could not find "mufuc". Would you please share a link? That would be great.
 
I got an MSI laptop with a GTX 1060 back in college, which was a very popular choice among my friends (almost 80 percent of the gaming laptops in my dorm had a 1060). After I finished college I built a rig with an i7-8700K and an RTX 2070, and another with an i9-9900K and an RTX 2080 Ti for an Ubuntu server. Now I play on the 2070 rig... time to switch the GPUs between the rigs, I think.
Huh? Why on earth would you switch GPUs? Just switch the purposes of each machine. And why are a 2080 Ti and a 9900K in an Ubuntu server? Just wanted to talk?
 
Why not also test lower-performance GPUs? It's not like everyone has a GTX 1060 or better.
Read the article, or watch the YouTube video. Steve explains in both why the range of tested cards and graphics settings is so small for this game.
 
Where are the test bench specs? Where are the specific versions of the drivers and Windows (and even the game) used for testing? All of this info should be present in any benchmark; without it, it is impossible to compare my own results now and in the future after driver, OS, and game updates.
 
Where are the test bench specs? Where are the specific versions of the drivers and Windows (and even the game) used for testing? All of this info should be present in any benchmark; without it, it is impossible to compare my own results now and in the future after driver, OS, and game updates.
Go ahead and read the review. If you do, you'll find this:

"Our GPU test rig has been used as usual sporting a Core i9-9900K overclocked to 5 GHz and 16GB of DDR4-3400 memory. The latest AMD and Nvidia drivers have been used, testing at 1080p, 1440p and 4K."
 
I use the preset settings upscaled from 1080p to 4K and it looks amazing. RX 5700 (non-XT), Ryzen 2600 OC'd, 48 fps average. Can't wait to see how it runs on Stadia... then again, I won't be plunking down another $60 for the same game.
 
Hi guys. Thanks for this performance review. I agree that the performance of the Pascal cards seems very odd. How can the 1660 Ti be only a few frames slower than a GTX 1080 at 1080p? I would not even consider those two cards to be in the same performance league. Also, the RTX 2060 is normally about on par with the 1070 Ti, but here it trounces both the 1070 Ti and the GTX 1080. And the Radeon RX 5700 and 5700 XT are delivering awesome performance, as are the older Vega 56 and 64. I think AMD has the upper hand here in terms of optimized drivers.
 
Nvidia loves brute-force optimization! RTX Titan, anyone? I thought low-level APIs were supposed to give us better-performing ports. What's up with the Vulkan API version? Why does the PC community have to fix this game? This is not acceptable.
A "can it run Red Dead Redemption 2" PC meme? Really, that is a disgrace to Crysis, which launched on PC first!
 
I have an i5 6600k overclocked to 4.4Ghz and a GTX 1060 6gb and on high settings (with the advanced stuff turned off or on low) it varies from about 35fps to just under 60fps at times at 1440p. My monitor has G-Sync so it looks ok. The main problem is sometimes the CPU usage goes to 100% and the game freezes for a couple of seconds. If the game came out at the same time as the PS4 version I could understand but after porting it for a year I thought it would run better. Bit pointless making a game that only runs smoothly on top end hardware which the majority of people don't have.
 
I have an i5 6600k overclocked to 4.4Ghz and a GTX 1060 6gb and on high settings (with the advanced stuff turned off or on low) it varies from about 35fps to just under 60fps at times at 1440p. My monitor has G-Sync so it looks ok. The main problem is sometimes the CPU usage goes to 100% and the game freezes for a couple of seconds. If the game came out at the same time as the PS4 version I could understand but after porting it for a year I thought it would run better. Bit pointless making a game that only runs smoothly on top end hardware which the majority of people don't have.
R* i5 bug.

I've read that using Process Lasso to disable a single core when CPU usage hits 98% while RDR2.exe is running will eliminate most of the i5 (non-HT) stutter. I don't know if you need to tell it to re-enable the core after usage drops or after a certain time period, but IIRC they wrote to disable the single core for only one second, and that the duration should be configurable in Process Lasso.
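For anyone curious what that workaround amounts to, here is a rough sketch of the same idea in Python using the third-party psutil library instead of Process Lasso. The process name, the 98% threshold, and the one-second pause come from the comment above; the function names and everything else are my own assumptions, not how Process Lasso actually implements it.

```python
# Hypothetical sketch: when total CPU usage spikes while the game runs,
# briefly remove one core from the game's affinity mask, then restore it.
# Requires the third-party psutil package (pip install psutil).
import time

import psutil


def throttle_once(proc, pause=1.0):
    """Remove one core from the process's affinity mask, then restore it."""
    full = proc.cpu_affinity()      # current allowed cores, e.g. [0, 1, 2, 3]
    if len(full) < 2:
        return                      # can't disable the only remaining core
    proc.cpu_affinity(full[:-1])    # run on one fewer core...
    time.sleep(pause)               # ...for roughly one second
    proc.cpu_affinity(full)         # then re-enable the parked core


def watch(name="RDR2.exe", threshold=98.0):
    """Poll system-wide CPU usage and throttle the game on each spike."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == name:
            while proc.is_running():
                if psutil.cpu_percent(interval=1.0) >= threshold:
                    throttle_once(proc)
            return


if __name__ == "__main__":
    watch()
```

No idea if this actually matches what Process Lasso does under the hood, but it shows why the trick could mask the i5 stutter: forcing the scheduler to migrate threads off one core for a second breaks up the sustained 100% saturation.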
 
Go ahead and read the review. If you do, you'll find this:

"Our GPU test rig has been used as usual sporting a Core i9-9900K overclocked to 5 GHz and 16GB of DDR4-3400 memory. The latest AMD and Nvidia drivers have been used, testing at 1080p, 1440p and 4K."

I'm not sure if those specs were there before, but regardless, the driver version is not documented. Since Nvidia put out drivers so close to launch, we already don't know which version was used.
 
Something tells me clever Lisa Su managed to catch Nvidia off guard with better drivers, just like she fooled Intel years ago by stating they weren't focused on high-end CPU chips. Intel was caught off guard.

It's either that, or Nvidia is gimping its drivers on purpose to push more of their higher-end cards. To be honest, I'm surprised by AMD's GPU benchmarks.
 
Something tells me clever Lisa Su managed to catch Nvidia off guard with better drivers, just like she fooled Intel years ago by stating they weren't focused on high-end CPU chips. Intel was caught off guard.

It's either that, or Nvidia is gimping its drivers on purpose to push more of their higher-end cards. To be honest, I'm surprised by AMD's GPU benchmarks.
AMD clearly performs better in this title, which made me wonder whether it's thanks to the game having been developed primarily for consoles, which use AMD chips.
I think Nvidia has a much, much better track record with drivers than AMD, though.
 
OK, we're different; I think 24 fps in either GTA V or RDR2 is definitely not enough.

You must not have grown up playing GoldenEye on the N64. Tolerance for lower FPS can be inculcated if you play long enough. Playing overseas on a 720p TV at 30 Hz with a GTX 880M laptop for months in 2014 trained my brain to be fine with it, even though I had 144 Hz monitors, a 5930K, and 780 Ti SLI back home. Many average PC gamers are happy with the smoothness they have been conditioned to, even if that is well below 1080p/60 fps. Really, they are the smart ones; it saves them a lot of money and doesn't impact their enjoyment.
 
You must not have grown up playing GoldenEye on the N64. Tolerance for lower FPS can be inculcated if you play long enough. Playing overseas on a 720p TV at 30 Hz with a GTX 880M laptop for months in 2014 trained my brain to be fine with it, even though I had 144 Hz monitors, a 5930K, and 780 Ti SLI back home. Many average PC gamers are happy with the smoothness they have been conditioned to, even if that is well below 1080p/60 fps. Really, they are the smart ones; it saves them a lot of money and doesn't impact their enjoyment.
True, I never had an N64 :)
We had Commodores, a ZX Spectrum, and early 286 PCs. ;-)
Now that you point it out, I remember my gaming experience 20 years ago was more like a stuttering PowerPoint presentation than smooth frames, but I never had a problem with it.
Perhaps the last five years with 144 Hz and G-Sync have made me soft... :) but there is no turning back from here for me.
 