The RTX 3080 hits 100+ fps in many top titles at 4K with max settings and RTX on

midian182

Staff member
Something to look forward to: We know that the RTX 3080 is a beast of a card, one that leaves the Turing flagship RTX 2080 Ti in the dust. Nvidia recently showed how the two compare in a 4K showdown on Doom Eternal. Now, the company has promised frame rates over 100 fps in other AAA games, all while in 4K with settings maxed and RTX on.

Nvidia yesterday posted a video that displayed the RTX 3080's ability to power Doom Eternal in 4K at around 150 fps with settings turned all the way up. That's very impressive, but what about games that aren't as well-optimized as id Software's shooter? Nvidia says we can still expect anywhere between 60 and over 100 fps, even at that high resolution.

In a Reddit Q&A, Nvidia's director of GeForce product management, Justin Walker, answered a question on whether the RTX 3080's 10GB of GDDR6X was enough to run next-gen AAA titles.

"We're constantly analyzing memory requirements of the latest games and regularly review with game developers to understand their memory needs for current and upcoming games," said Walker. "The goal of 3080 is to give you great performance at up to 4k resolution with all the settings maxed out at the best possible price."

Walker named Shadow of the Tomb Raider, Assassin's Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3, and Red Dead Redemption 2 as games that will hit that 60–100 fps range while in 4K with max settings, including any texture packs, and with RTX on where supported.

In a question regarding the RTX 3080's ability to handle 144Hz monitors, Walker named Doom Eternal, Forza 4, and Wolfenstein Youngblood as titles that can run at 144 fps while maxed out at 4K. He added that more demanding games such as Red Dead Redemption 2, Control, and Borderlands 3 would be closer to 60 fps.
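Those refresh-rate targets translate directly into per-frame rendering budgets; a quick sketch of the arithmetic (plain math, not a benchmark and nothing Nvidia-specific):

```python
# Frame-time budgets for the frame rates discussed above. To sustain a
# given fps, the GPU must finish each frame within 1000/fps milliseconds.

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available per frame at a given frame rate."""
    return 1000.0 / fps

for fps in (60, 100, 144):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):.2f} ms per frame")
# 60 fps leaves ~16.67 ms per frame; 144 fps leaves only ~6.94 ms,
# which is why 144 fps at 4K is limited to lighter titles.
```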

As to why the RTX 3080 didn't come with more than 10GB of GDDR6X, Walker said it was a matter of balancing price vs. performance. "Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance."

The $699 RTX 3080 launches on September 17, though we've heard stock levels could be limited, so you might struggle to buy one this year.

Yesterday brought news that an RTX 3070 Ti featuring 16GB of GDDR6 might be in the works. If true, don't expect to see one anytime soon.


 
If it can do Assassin's Creed Odyssey at 60+ fps, that's a huge improvement. The 2080 Ti was regularly in the 40s with that game.
 
Amazing how Borderlands 3 is one of the more demanding games here when it looks like crap.

Man, B3 was a performance nightmare at launch, and it took them many months to even iron out the major kinks of frame rate drops & game crashes.

I didn’t even purchase the game after watching the reviews... terrible coding all around.
 
It's still a very buggy game a year after release... I doubt it will be fixed at this point.
 
I’ve been wondering whether 8–10GB is enough VRAM for 4K. My 2080 can use most of its 8GB at 1440p, but game engines can use memory they don’t strictly need to cache things, so it’s hard to tell how much is actually required. If much bigger-capacity cards are coming, say, next year, then that’s a good enough excuse not to upgrade my 2080 now. Not that I really need to; at 1440p 144Hz my 2080 handles games well enough. But as always seems to be the case when I buy a new graphics card, I need a monitor too. So whilst 4K cards may be available at reasonable prices, reasonably priced 4K 144Hz monitors are not. I think I’ll wait until I can get both a 4K 144Hz monitor and a GPU of 3080-level performance or more to power it for less than £2,000 combined before I pull the trigger on 4K.
 
The 3080 does not have NVLink; Nvidia did this because of the huge price jump from the 3080 to the 3090.

Super sad, because some folks like adding a second card when the next generation comes out and prices drop, to extend the life of it.
 
I’ve been wondering whether 8–10GB is enough VRAM for 4K. My 2080 can use most of its 8GB at 1440p, but game engines can use memory they don’t strictly need to cache things, so it’s hard to tell how much is actually required. If much bigger-capacity cards are coming, say, next year, then that’s a good enough excuse not to upgrade my 2080 now. Not that I really need to; at 1440p 144Hz my 2080 handles games well enough. But as always seems to be the case when I buy a new graphics card, I need a monitor too. So whilst 4K cards may be available at reasonable prices, reasonably priced 4K 144Hz monitors are not. I think I’ll wait until I can get both a 4K 144Hz monitor and a GPU of 3080-level performance or more to power it for less than £2,000 combined before I pull the trigger on 4K.
It's fine for 4K. Take Doom Eternal: it reserves 9.1GB of VRAM but never uses all of it, and most 4K games actually use 2–6GB. On top of that there is new, better compression and NVCache, and the memory is technically QDR rather than DDR.
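The reserved-versus-used distinction in the comment above can be sketched as a toy pool allocator. This is a purely hypothetical model of how an engine might grab memory up front and sub-allocate from it (the numbers are made up, not measurements from any real game):

```python
# Toy model of why monitoring tools can over-report VRAM "usage":
# engines often reserve a large pool up front and sub-allocate from it,
# so the driver reports the whole pool as allocated even when most of
# it is not actually touched. Illustrative only.

class VramPool:
    def __init__(self, reserved_mb: int):
        self.reserved_mb = reserved_mb   # what monitoring tools report
        self.used_mb = 0                 # what the engine actually touches

    def alloc(self, mb: int) -> None:
        """Sub-allocate from the reserved pool."""
        if self.used_mb + mb > self.reserved_mb:
            raise MemoryError("pool exhausted")
        self.used_mb += mb

pool = VramPool(reserved_mb=9100)        # engine reserves ~9.1 GB up front
pool.alloc(2500)                         # e.g. resident textures
pool.alloc(1500)                         # e.g. buffers, render targets
print(f"reserved: {pool.reserved_mb} MB, in use: {pool.used_mb} MB")
# The tool-visible figure (9100 MB) stays fixed even though only
# 4000 MB is actually in use.
```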
 
The 3080 does not have NVLink; Nvidia did this because of the huge price jump from the 3080 to the 3090.

Super sad, because some folks like adding a second card when the next generation comes out and prices drop, to extend the life of it.
Multi-GPU died when Petersen left.
The only reason the 3090 has a bridge is that it's technically a Titan made for content creators. There's zero reason to need a second one, and the bandwidth of PCIe 4 means the scaling would be garbage for the 3000 series.
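The PCIe scaling point can be put in rough numbers. The sketch below is back-of-envelope arithmetic from public spec figures (PCIe 4.0 link rate and the RTX 3080's advertised memory configuration), not a measurement:

```python
# Back-of-envelope comparison of PCIe 4.0 x16 bandwidth against the
# RTX 3080's local memory bandwidth, showing why card-to-card traffic
# over PCIe is a poor substitute for a dedicated NVLink bridge.

# PCIe 4.0: 16 GT/s per lane with 128b/130b encoding.
pcie4_lane_gbs = 16 * (128 / 130) / 8      # ~1.97 GB/s per lane
pcie4_x16_gbs = pcie4_lane_gbs * 16        # ~31.5 GB/s for an x16 slot

# RTX 3080: 19 Gbps GDDR6X on a 320-bit bus.
gddr6x_gbs = 19 * 320 / 8                  # 760 GB/s local bandwidth

print(f"PCIe 4.0 x16:  {pcie4_x16_gbs:.1f} GB/s")
print(f"RTX 3080 VRAM: {gddr6x_gbs:.0f} GB/s "
      f"(~{gddr6x_gbs / pcie4_x16_gbs:.0f}x the PCIe link)")
```

Any multi-GPU scheme forced through the PCIe link would be working with roughly 1/24th of the bandwidth each card has to its own memory.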
 
Definitely. Still, there is something to it: Nvidia is clearly claiming its share of gamers' wallets before the big two consoles land. The new consoles suddenly don't look as exciting.

Depends on price. If a $500-ish *system* offers a good gaming experience, what‘s not to like about it? Check what kind of gaming PC $1,000 gets you from an OEM.
 
It's fine for 4K. Take Doom Eternal: it reserves 9.1GB of VRAM but never uses all of it, and most 4K games actually use 2–6GB. On top of that there is new, better compression and NVCache, and the memory is technically QDR rather than DDR.
Well, it’s clearly going to be fine for 4K now. But what about in a couple of years? I usually keep a GPU for two years minimum. We've had 8GB flagship cards for quite a while now, seven years I think? Was the 290X the first 8GB card? Can’t remember. 10GB is definitely an improvement, but it feels small.
 
In a couple of years the 4000 series will be released; make the choice then. You're also not taking into account NVCache and compression, as well as the faster transfer rates, though I can understand why it may seem small.
It's a card for the here and now; at most you'll get 1–4 years out of whatever you buy.
Also, by then we will probably be gaming at 8K on the high end, so there is that.
 