Borderlands 3 Graphics Benchmark: 60 Nvidia and AMD GPUs Tested

Too bad there’s no test for the 2080 Super.

Fortunately, Nvidia has the top spots sewn up and my 2080Ti is pretty much untouchable.

Great work!
 
Thank you for a great article, especially for including older graphics cards.
Just one thing, please: could you add CPU performance in future articles, now that Ryzen has become genuinely competitive?
Thanks in advance.
 
I'm interested in what settings could have been turned down to achieve 144 fps, because seeing the 2080 Ti at 1080p like that is a little disappointing. Lows of 84 don't seem right.
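For what it's worth, lows like that can be plausible even with a high average: the "1% low" figure benchmarks quote is usually derived from frame times, so a handful of slow frames drags it far below the average FPS. A minimal sketch of that calculation (the function name and the sample numbers are just illustrative, not the article's actual method):

```python
def percentile_low_fps(frame_times_ms, percentile=1.0):
    """Estimate the '1% low' FPS figure: take the slowest `percentile`
    of frames, average their frame times, and convert back to FPS."""
    if not frame_times_ms:
        raise ValueError("need at least one frame time")
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    count = max(1, round(len(worst) * percentile / 100))
    avg_worst_ms = sum(worst[:count]) / count
    return 1000.0 / avg_worst_ms

# Hypothetical run: ninety-nine 7 ms frames (~143 fps) plus one 12 ms spike
times = [7.0] * 99 + [12.0]
avg_fps = 1000.0 * len(times) / sum(times)   # ≈ 142 fps average
low_fps = percentile_low_fps(times, 1.0)     # ≈ 83 fps 1% low
```

With those made-up numbers the average sits near 142 fps while the 1% low lands around 83, which is how an average of ~144 and lows of ~84 can coexist without anything being "wrong" with the data.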
 
Nice rundown. Good job including "budget" GPUs like the 1050 Ti. Considering that the 1060 and 1050 series account for 30% of all systems identified in the Steam Survey, they should be front and center in any game testing.
 
The optimization is pretty disappointing. Even the RTX 2070 barely approaches 60 fps at 1440p. Considering it's still using the same cel-shaded technique as BL2, welp...
 
The framerates seem unusually low for a game with basically cel-shaded graphics, which I presume is far less graphically intensive than a more realistic style.
 
Techspot, will there be any reporting on the Denuvo DRM in Borderlands 3 being used as a keylogger, even when the game is not running, as found by pirates and crackers investigating why the cracked game will only run on certain CPUs? Or the constant uploading of data to Epic's servers?

https://imgur.com/a/ZV8RT1L
https://www.reddit.com/r/borderlands3/comments/d3ojqa/borderlands_3_requires_2_mbs_of_drm_uploading/

This kind of thing is why I refuse to use the Epic store.
I saw one comment saying it's only an issue on a 32-bit OS, but it definitely needs more thorough analysis. The constant upload is suspicious, and a keylogger is completely unacceptable.
 
Looks like anyone who bought this game on Epic is taking part in a beta test, given the performance the game is getting; a higher-resolution BL2 with some minor lighting and shadow updates should not be making the current lineup of high-end GPUs struggle.

Any way we could see what system resource usage looks like, to possibly identify some kind of bottleneck?
 
Looks like one of the ultra settings may be rendering something unnecessary in the distance, similar to Assassin's Creed Odyssey, where clouds on high caused a massive performance hit (it's fixed now, I think).
 
DX12 is starting to smell more and more like abandonware with its performance track record.
Nah, 'tis perfectly on track. DX11 didn't take off until long after its release, and the release of DX11.1, and games are still coming out on DX9.

The big issue with DX12 is simply what benefit it provides. That closer-to-the-metal coding takes more skill and more time, two things game companies don't have in abundance right now with rising costs, constant employee churn, and the pressure to release games after a few years of development.

And at the end of the day, DX11 offers plenty of graphical effects. DX12 just wasn't a very big jump like DX11 was from 9, which brought terrain deformation and better tessellation.

DX12 likely won't be mainstream until the end of the upcoming console generation.
 
Unreal Engine 4 doesn't seem well suited to open world games. Ark:SE uses UE4 and runs like pants too, and how many years old is it now? Are there any UE4 open world games that are actually well optimized?
 
Even if this is true, nothing is stopping devs from making their own plugins to improve performance. This being a AAA title, Gearbox has no excuse for not getting better performance. If Bethesda of all companies can add 64-bit support to the almost 20-year-old Gamebryo engine, Gearbox can improve the performance of their game.
 
Nah, 'tis perfectly on track. DX11 didn't take off until long after its release, and the release of DX11.1, and games are still coming out on DX9.

The big issue with DX12 is simply what benefit it provides. That closer-to-the-metal coding takes more skill and more time, two things game companies don't have in abundance right now with rising costs, constant employee churn, and the pressure to release games after a few years of development.

And at the end of the day, DX11 offers plenty of graphical effects. DX12 just wasn't a very big jump like DX11 was from 9, which brought terrain deformation and better tessellation.

DX12 likely won't be mainstream until the end of the upcoming console generation.
Heard the same line for DX10.
 
Even the 2080 Ti can't keep up at 4K in a very recently released game, despite its exorbitant, wastefully expensive price.

The 2080 Ti has been running away with flamboyant pricing in the name of playing 4K games at 60 fps average, never mind achieving a minimum 60 fps in all currently available games. But it couldn't keep up here.

This is why consumers shouldn't jump into purchasing the so-called fastest card just because it's the most expensive one.

On the other hand, I'm surprised to see the 5700 XT come very close to the more expensive cards' minimum FPS. And it's not even the flagship, yet it comes at a fraction of Nvidia's arrogant pricing for its flagships.

Thanks to wise Techspot readers, especially @Evernessince, I made a very sound purchase decision after a long time, and I am very happy with it. The Sapphire Pulse 5700 XT is as quiet as (if not quieter than) my previous Palit Super Jetstream 980 Ti.

What I aim for is a minimum of 60 fps at 1080p (even though I have a 4K monitor and a 4K TV), because the minimum 60 fps is the deciding factor for consistent, smooth performance from beginning to end. As an added bonus, the 5700 XT is able to run some games at 4K fluidly.

Coming back to Borderlands 3: my friend and I have been playing Borderlands 2 almost every day and still enjoy it. I think I may get BL3 when it's on sale next year.
 
Happy to see you are enjoying the card!

If you haven't played it already, there is also Borderlands: The Pre-Sequel. You can typically find it on sale for around $12.
 