Nvidia Resizable BAR Tested, Benchmarked

"It’s worth noting that in order to enable/disable resizable BAR with a Radeon or GeForce graphics card, you need to reboot the system, enter the BIOS, and toggle it on or off there. So that’s not exactly a practical solution and we’d argue doing so means the performance gains are no longer free, they come at the expense of your time and energy"

This is the best reason not to enable it.
Who has ±2 minutes these days?
 
How come a 3080 was getting ~100 fps at 4K, and now it's getting nearly 150 fps with supposedly the same settings? That's almost 50% more.
And all other outlets get around 100 fps at 4K high settings. It looks like you are running at 4K medium, not high settings.

The change of CPU doesn't justify that; it's at 4K.
And even all the other results are around 10% or more higher than the results from two months ago at 4K, when the CPU change would have had little to no effect.

https://www.techspot.com/review/2099-geforce-rtx-3080/
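For what it's worth, a quick check of the gap being questioned, using the round figures from the comment above (assumed values, not the actual benchmark numbers):

```python
# Rough sanity check of the claimed 4K discrepancy.
# The fps values are the round figures from the comment, not measured data.
old_fps = 100  # ~100 fps in the original RTX 3080 review
new_fps = 150  # ~150 fps reported in this retest
increase_pct = (new_fps - old_fps) / old_fps * 100
print(f"Increase: {increase_pct:.0f}%")  # -> Increase: 50%

# At 4K the GPU, not the CPU, is normally the bottleneck, which is why
# the commenter suspects a settings difference rather than the CPU swap.
```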
 
"It’s worth noting that in order to enable/disable resizable BAR with a Radeon or GeForce graphics card, you need to reboot the system, enter the BIOS, and toggle it on or off there. So that’s not exactly a practical solution and we’d argue doing so means the performance gains are no longer free, they come at the expense of your time and energy"

This is the best reason not to enable it.
Who has ±2 minutes these days?
I have. I did it already. Simple stuff.
 
"It’s worth noting that in order to enable/disable resizable BAR with a Radeon or GeForce graphics card, you need to reboot the system, enter the BIOS, and toggle it on or off there. So that’s not exactly a practical solution and we’d argue doing so means the performance gains are no longer free, they come at the expense of your time and energy"

This is the best reason not to enable it.
Who has ±2 minutes these days?

Not to mention that everyone agrees you should never reboot your computer for any reason whatsoever. You lose all the extra FPS cached in your system.
 
Not surprising that AMD made their cards with this feature in mind. Nvidia just panicked, enabled it without proper support, and got much worse results.

AMD does this feature much better than Nvidia. Even better, with AMD it works in every game, not just a few like DLSS or RT.
 
I prefer to simply go for the most VRAM.
My 3090 FTW3 is 24GB DDR4.
Easier to just "brute force" my way forward.
I ain't got time for overclocking and memory gimmicks.

Last time I checked, GPUs don't use DDR4 memory. Seems you also don't have time to be technically correct :p

It's 24 GB of GDDR6X memory, and in a non-BAR-enabled setup your system is accessing it in 256 MB chunks.
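You can see that aperture size for yourself on Linux. A minimal sketch, assuming a hypothetical GPU at PCI address 0000:01:00.0 (a typical slot for a discrete card; yours may differ): each line of the sysfs resource file holds a region's start, end, and flags in hex, and BAR1 on GeForce cards is the VRAM aperture.

```python
# Minimal sketch: print PCI BAR sizes from sysfs (Linux).
# 0000:01:00.0 is an assumed address -- find yours with `lspci | grep VGA`.
RESOURCE = "/sys/bus/pci/devices/0000:01:00.0/resource"

MIB = 1024 ** 2

with open(RESOURCE) as f:
    for index, line in enumerate(f):
        if index > 5:  # entries after the first six are ROM/other windows
            break
        start, end, _flags = (int(field, 16) for field in line.split())
        if end > start:  # unused BARs read as all zeros
            print(f"BAR{index}: {(end - start + 1) // MIB} MiB")
```

With Resizable BAR off, BAR1 should report 256 MiB; with it enabled, something close to the card's full VRAM.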
 
If Techspot wasn't biased, they would be slamming AMD for marketing this as an exclusive tech.

Also, I do find it amusing that Techspot still tests GPUs like a 3080 at 1080p. We all know it's because that's the only place Radeon can gain any ground on Nvidia. Nobody is buying a 3080 or a 6800 XT, plugging it into a 1080p monitor, and turning off RT. But that's exactly what Techspot tests and then weights into the conclusion. Lol.
 
If Techspot wasn't biased, they would be slamming AMD for marketing this as an exclusive tech...
For someone who goes into every GPU article to make absurd comments about TS being biased, you sure like to spend a lot of time here. I can't even imagine how you must feel spending hours on a forum you dislike so much.
 
Nobody is buying a 3080 or a 6800 XT, plugging it into a 1080p monitor, and turning off RT...
To be brutally honest here, I was looking forward to buying a better screen.
Fell by the wayside though.
 
If Techspot wasn't biased, they would be slamming AMD for marketing this as an exclusive tech. Also, I do find it amusing that Techspot still tests GPUs like a 3080 at 1080p...

I'm confused: how can the test results be biased? If Techspot was biased the way you have described, they would have tested at 1080p only, and not at all three popular resolutions (not to mention that if the 1080p data were missing, they would get a lot of heat for not including those results... among others, from me :) ).

So now everyone (including you) can refer to the data set they prefer or need... freedom of choice, it's great, isn't it?

Oh, and "lol", of course, that is obligatory, I almost forgot :)
 
I'm confused: how can the test results be biased? If Techspot was biased the way you have described, they would have tested at 1080p only...
FYI - Attempts to untangle SB rants will pretty much peg the "waste of time" meter. Just consider the moniker.
 