We nearly lost count of how many graphics cards we put through this test, but the final tally was roughly sixty GPUs from both camps to check out Borderlands 3 performance. For testing we're running the game in DirectX 11 mode, which we found to be smoother overall than DX12, a mode that's still in development for this title.
Borderlands 3 seems to hammer the CPU quite a bit and we suspect the DRM is to blame. A lot of quad-core owners are complaining about horrible stuttering, so that's something to be aware of. For testing we used our Core i9-9900K test bed clocked at 5 GHz with 16GB of DDR4-3400 memory. After some investigation we decided against using the built-in benchmark, as we generally try to avoid canned benchmarks when we can. Instead we're using the 'propaganda center' for our test, though performance does appear to be close to the numbers you'll get with the benchmark tool.
On that note, if you're looking to squeeze out the best possible Borderlands 3 performance with a marginal visual hit, check out Tim's excellent optimization guide on YouTube. He was able to boost performance by around 50% using customized settings for virtually no image quality difference when compared to the Ultra preset.
Some additional testing notes. For the GeForce graphics cards we're using the Game Ready 436.30 driver and for Radeons the Adrenalin 2019 Edition 19.9.2 driver. Both are optimized for Borderlands 3, so make sure you're running the latest driver version. Our benchmark was run using the ultra preset at 1080p, 1440p and 4K, then re-tested with a few GPUs at 1080p with the medium preset. We've thrown in a heap of older GPUs as well for good measure.
Starting from the top we have, of course, the RTX 2080 Ti, which pumped out an average of 122 fps, and for 1080p that doesn't seem all that high. The RTX 2080 and 1080 Ti were limited to around 105 fps, which is honestly the kind of performance you'd hope to see at 1440p in a game like this.
Moving on, the RTX 2070 Super was 9% faster than the 5700 XT on average, which is a pretty typical margin and a good result for the Radeon GPU considering the 5700 XT also delivered slightly better 1% lows.
A little further down we see the standard 5700 averaged 80 fps in our test, making it a little faster than the GTX 1080 on average and much better when comparing the 1% low data. Unfortunately, we couldn't test the RTX 2060 Super or the 2080 Super as we have those cards on separate duty for ray tracing tests in content that will go live soon.
When compared to the standard RTX 2060, the Radeon 5700 was 8% faster, so not a huge margin, and that means it should be comparable to the 2060 Super. As you might have expected, Vega 56 matched the GTX 1070 while Vega 64 was a smidge faster, placing it just behind the 1070 Ti.
The GTX 1660 Ti does well here, basically matching the GTX 1070 while beating the old GTX 980 Ti, the flagship from a few generations back. The vanilla GTX 1660 gives the RX 590 a hard time, delivering 12% more frames and that's quite good given it's only about 10% more expensive. The GTX 1660 also averaged over 55 fps in our test whereas the RX 590 fell below 50 fps.
Making our way down to the GTX 1060 and RX 580 at just under 50 fps, even the 3GB model of the 1060 does pretty well at 1080p. The old GTX 970 does okay with 41 fps and we see just 39 fps from the RX 570. The GTX 1650 is the cut-off here though, and even then you'll want to start optimizing the quality settings for better performance.
The 1440p results show a similar pattern, now starting at 90 fps on average with the RTX 2080 Ti. That's comparable performance to what you'll see in a title like Assassin's Creed Odyssey and much worse than The Division 2, Shadow of the Tomb Raider, and Metro Exodus, to cite a few examples.
The standard RTX 2080 was 21% slower rendering 71 fps on average and we see a further 7% performance downgrade with the 2070 Super. This time the 2070 Super was 11% faster than the 5700 XT, but again that doesn't really justify the 25% increase in price. The standard 2070 and non-XT version of the 5700 were comparable in terms of performance, pushing frame rates into the mid 50s. Then right on the cusp of 50 fps we find the GTX 1080 and RTX 2060.
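That price-to-performance argument can be made concrete with a quick cost-per-frame calculation. This is just an illustrative sketch: the $499 and $399 launch MSRPs and the ~66 fps and ~60 fps averages are our own estimates read off the margins quoted above, not measured figures.

```python
# Rough cost-per-frame sketch for the 1440p results discussed above.
# Assumptions: $499 (RTX 2070 Super) and $399 (RX 5700 XT) launch MSRPs;
# 66 fps is ~7% below the 2080's 71 fps, and 60 fps is ~11% slower than
# the 2070 Super, per the margins quoted in the text.
def cost_per_frame(price_usd, avg_fps):
    """Dollars spent per average frame rendered."""
    return price_usd / avg_fps

print(round(cost_per_frame(499, 66), 2))  # RTX 2070 Super: ~7.56 $/fps
print(round(cost_per_frame(399, 60), 2))  # RX 5700 XT: ~6.65 $/fps
```

On those assumed numbers the 5700 XT comes out ahead on value, which lines up with our point that an 11% performance lead doesn't justify a 25% price premium.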
Dropping below 50 fps you'll find GPUs such as Vega 56 and 64 along with the GTX 1070 and 1660 Ti. Once we get down to the GTX 1660 you'll really need to reduce the quality settings for better performance. A lot of these GPUs averaged around 25-30 fps and obviously that's not an ideal situation on PC.
Jumping up to 4K hammers performance as you'd expect and now even the RTX 2080 Ti can't achieve 60 fps on average. That said, with Tim's optimizations the 2080 Ti should be good for a little over 70 fps on average in our test. We would have liked to do more benchmarking using tweaked graphics settings across the board, but by the time we had that information we were already 50 GPUs deep into testing.
This also means that with optimized quality settings you should be able to push near 60 fps at 4K with the GTX 1080 Ti or RTX 2080.
Medium Settings on Budget GPUs
To wrap up testing we ran a few GPUs with the medium quality preset while also adding more budget and older GPUs to the mix. Here we see a massive 80% performance improvement for the GTX 1660 when going from ultra to medium. The GTX 1060 saw a 75% performance increase and is now pushing 82 fps on average at 1080p.
The excellent value Radeon RX 570 saw a massive 109% performance uplift when downgrading the quality preset from ultra to medium. We also saw a big 78% increase for the R9 390, though it wasn't as staggering as the 570's jump. We see some old GPUs like the GTX 970 performing well with the medium quality preset. The R9 290 is also hanging in there quite comfortably with 67 fps on average.
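The uplift percentages in this section are simple ratios of the medium and ultra averages. Here's a quick sketch using the GTX 1060 figures, where we're reading "just under 50 fps" at ultra as roughly 47 fps, which is an assumption on our part:

```python
def uplift_percent(medium_fps, ultra_fps):
    """Frame-rate gain (%) from dropping the preset from ultra to medium."""
    return (medium_fps / ultra_fps - 1) * 100

# GTX 1060: ~47 fps at ultra (assumed from "just under 50 fps"),
# 82 fps at medium per our results.
print(round(uplift_percent(82, 47)))  # ~74%, in line with the ~75% we quoted
```

The same formula reproduces the other figures in this section, e.g. the RX 570's 109% jump from its 39 fps ultra result implies a medium average of roughly 81-82 fps.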
There's a relatively new GPU we've yet to mention because it was too slow using the ultra quality settings: the GTX 1650. With medium settings it averaged 65 fps, which is decent, though for less money the RX 570 was nearly 30% faster, hence our usual recommendation in favor of the budget Radeons.
Still, the GTX 1650 was much faster than older high-end GPUs such as the R9 280X and GTX 770. The oldies still average around 50 fps at 1080p, so that puts you in fully playable territory, which is great. There were loads of GPUs in this performance class, such as the R9 380X, 7970, 1050 Ti, 680, 1050 and 380.
For an average of 40 fps or better all you'll need is the Radeon HD 7950 or GTX 960.
That's how Borderlands 3 performs at launch using a range of GPUs with the ultra and medium quality settings. It's a shame the quality presets do so poorly in this title. You can achieve a lot by tweaking the settings manually, and doing so will lead to around a 50% boost over the ultra quality results shown here.
As we noted in the feature's opening, quad-core CPUs seem to be getting maxed out in this title, something we'll have to investigate further. As for memory, you'll also want 16GB of RAM for smooth performance; we often saw total system usage around 7GB when running Borderlands 3, with the game itself consuming around 3GB.
As for graphics memory, 4GB appears fine for 1080p. The 3GB GTX 1060 did well in our testing, but we have a high-end system that may be masking performance issues with that configuration. We got away with 4GB even at 1440p. Naturally for gaming at 4K you'll want at least 6GB of VRAM, although GPU horsepower becomes a larger issue at 4K and high-end GPUs have well over that amount of memory these days, so that should be a non-issue.