BioShock Infinite Tested, Benchmarked

April 1, 2013, 3:00 PM

With three years having passed since BioShock 2 and a new console generation on the horizon, BioShock Infinite has taken the opportunity to mix things up. Although it's still a first-person shooter published by 2K Games with similar concepts and themes, the third installment doesn't follow the same story; it's set decades before the previous entries in a floating city called Columbia.

Along with DX11 effects, folks playing on PC can look forward to higher resolution textures and a healthy range of customization. Infinite comes with six graphical presets from "very low" to "ultra" that should hopefully cover a broad performance spectrum, not to mention individual control over settings like anti-aliasing, texture detail and filtering, dynamic shadows, post-processing, and so on.

As the cherry on top, the developer has fully embraced widescreen gaming with what it calls "horizontal plus" widescreen support, so the wider you go, the more you'll see of Columbia’s gorgeous vistas. In that same vein, it should be noted that there's also multi-monitor support for AMD Eyefinity, Nvidia Surround and Matrox TripleHead2Go. Plenty to see for sure, and we're eager to dig in.

Read the complete article.




User Comments: 23

TomSEA TomSEA, TechSpot Chancellor, said:

BioShock Infinite also comes with its own benchmarking tool. I run two MSI GTX 660 Ti's in SLI, and at 2560 x 1440 resolution on the ultra setting I was getting 60-90 FPS depending on the scene.

As the benchmarking in the article shows, you can get some really nice FPS with a mid-range machine and card.

Rage_3K_Moiz Rage_3K_Moiz, Sith Lord, said:

Have you guys seen any difference in performance with the alternate post-processing option enabled? The game certainly looks quite different with it to me.

LNCPapa LNCPapa said:

I occasionally hit the memory limits of my 2GB 680 SLI setup running 2560x1440 at ultra during the benchmark. It makes for some seriously stuttery moments during the benchmark. Watching the memory usage on my G15 plugin shows it maxed out as well. If I drop it down just one level on the quality settings it runs much smoother and VRAM usage maxes out around 1.6-1.7GB for me.

JC713 JC713 said:

It is interesting that this game isn't very CPU intensive. Usually you'll see a 5-10 FPS increase from raising the clock speed from 2.5GHz to 4.5GHz. Also, it's odd how OCing the CPU made the minimum FPS drop.

hahahanoobs hahahanoobs said:

Also it is odd how OCing the CPU made the minimum FPS drop.

Those aren't min framerates, they're frame times, and lower is better.

JC713 JC713 said:

Those aren't min framerates, they're frame times, and lower is better.

My bad, didn't see that.

hahahanoobs hahahanoobs said:

My bad, didnt see that.

It's cool.

Guest said:

Hmmm, those 6970 results seem strange to me. It performs worse than a 5870?

hammer2085 said:

Have you guys seen any difference in performance with the alternate post-processing option enabled? The game certainly looks quite different with it to me.

http://www.youtube.com/watch?v=4oIusxRznG4

Guest said:

As always, excellent graphical comparison from TechSpot. I'd love seeing this game on a PC, currently own it on the 360 and looks marvelous.

Regarding the game itself, I highly recommend it to everyone. Especially to those who like utopia-themed environment.

Twixtea said:

Sorry if this is a dumb question but what causes performance drop when measured frame time in milliseconds compared to the regular FPS benchmark? I'm not really understanding the difference.
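For reference, the two measures describe the same thing from opposite ends: frame time is simply the reciprocal of frame rate, expressed in milliseconds per frame rather than frames per second. A quick sketch (helper names are ours, not from any benchmarking tool):

```python
# Frame time (ms per frame) and frame rate (frames per second) are reciprocals:
# frame_time_ms = 1000 / fps, so 60 fps corresponds to ~16.7 ms per frame.

def fps_to_frame_time_ms(fps):
    """Convert a frame rate in fps to the time one frame takes, in ms."""
    return 1000.0 / fps

def frame_time_ms_to_fps(ms):
    """Convert a per-frame time in ms back to an equivalent frame rate."""
    return 1000.0 / ms

print(fps_to_frame_time_ms(60))    # ~16.67 ms per frame
print(frame_time_ms_to_fps(25.0))  # 40.0 fps
```

The catch, as discussed below in the thread, is that benchmarks usually report an *average* frame rate but a *percentile* frame time, so the two numbers aren't direct conversions of each other.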

Guest said:

No GTX 690 ?

Deejay214 Deejay214 said:

Why no GTX 690 ?

Skidmarksdeluxe Skidmarksdeluxe said:

I'm really not interested in frame rate & frame time numbers. As long as I can have smooth gameplay and an enjoyable gaming experience with most of the eye candy turned up @ 1920x1080, I'm more than satisfied. I'll definitely get this game as soon as it hits bargain basement price. I'm sure my GTX 670 will be up to the task. If not, I want my money back. (for both game & card)

1 person liked this | Guest said:

Come on, TechSpot. You measure average framerate and then 99th percentile frametimes and you are shocked when there's a big difference.

Steven Walton obviously doesn't understand frametimes testing: it's not an average, it's a threshold under which 99% of the frames were produced. Much like minimum framerate, it's there to give an idea how each card performs in the worst-case scenario in the test. If you were doing average frametimes, or 50th percentile frametimes, it would match the framerate when converted.
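The threshold point can be illustrated with a toy example (all numbers here are invented for illustration): a handful of slow frames barely moves the average fps, but they dominate the 99th-percentile frame time.

```python
def percentile(values, p):
    """Nearest-rank percentile: smallest sample >= p% of all samples."""
    ordered = sorted(values)
    k = max(0, int(round(p / 100.0 * len(ordered))) - 1)
    return ordered[k]

# Mostly smooth ~16.7 ms frames with a few 40 ms stutter spikes.
frame_times_ms = [16.7] * 97 + [40.0] * 3

avg_ms = sum(frame_times_ms) / len(frame_times_ms)
avg_fps = 1000.0 / avg_ms                 # ~57.5 fps -- spikes barely register
p99_ms = percentile(frame_times_ms, 99)   # 40.0 ms -- equivalent to 25 fps

print(avg_fps, p99_ms)
```

So a card can post a healthy average fps while its 99th-percentile frame time tells a much uglier story about its worst moments, which is exactly why the two charts diverge.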

1 person liked this | slh28 slh28, TechSpot Paladin, said:

Come on, TechSpot. You measure average framerate and then 99th percentile frametimes and you are shocked when there's a big difference.

Steven Walton obviously doesn't understand frametimes testing: it's not an average, it's a threshold under which 99% of the frames were produced. Much like minimum framerate, it's there to give an idea how each card performs in the worst-case scenario in the test. If you were doing average frametimes, or 50th percentile frametimes, it would match the framerate when converted.

I believe the point Steve was trying to make is that there's a much bigger fluctuation in fps in this game compared to other games, hence the statement:

It isn't uncommon to see a gap between frame time performance and average frames per second, but we generally observe a smaller 5 - 10 fps reduction with the former. With BioShock Infinite the margins were massive, anywhere from 10 to 30fps.

amstech amstech, TechSpot Enthusiast, said:

I see a triple core or two on there, but one chip that's still widely used for games that I wouldn't mind seeing is the 720 X3/740 X3. You don't even have to unlock/overclock it. I would send you mine, but it's still doing a nice job pushing my 570 in my HTPC for 1080p gaming.

Staff
Steve Steve said:

Come on, TechSpot. You measure average framerate and then 99th percentile frametimes and you are shocked when there's a big difference.

Steven Walton obviously doesn't understand frametimes testing: it's not an average, it's a threshold under which 99% of the frames were produced. Much like minimum framerate, it's there to give an idea how each card performs in the worst-case scenario in the test. If you were doing average frametimes, or 50th percentile frametimes, it would match the framerate when converted.

I believe the point Steve was trying to make is that there's a much bigger fluctuation in fps in this game compared to other games, hence the statement:

It isn't uncommon to see a gap between frame time performance and average frames per second, but we generally observe a smaller 5 - 10 fps reduction with the former. With BioShock Infinite the margins were massive, anywhere from 10 to 30fps.

slh28 you are correct, thank you.

It shouldn't be a surprise though as I said virtually that in the article.

From the conclusion...

"It isn't uncommon to see a gap between frame time performance and average frames per second, but we generally observe a smaller 5 - 10 fps reduction with the former. With BioShock Infinite the margins were massive, anywhere from 10 to 30fps."

Rage_3K_Moiz Rage_3K_Moiz, Sith Lord, said:

I've already seen it in action. I'm just wondering if the performance hit is higher on some cards compared to others.

brkbeatjunkie brkbeatjunkie said:

In your screenshot comparison, the high settings show way more AA around edges (i.e. trees and leaves) compared to ultra, which is quite aliased IMO.

Why does ultra have inferior/less anti-aliasing?

Guest said:

This is extremely biased towards Nvidia. They intentionally left post-processing at "normal", which is DX10, instead of maxing it out to "Ultra", which is DX11, because Ultra post-processing is significantly faster on AMD cards. Shame on you, TechSpot!

This is not the first time you've intentionally manipulated the settings in favor of Nvidia.

Staff
Steve Steve said:

I guess we have been caught red-handed once again, or should I say green-handed.

JC713 JC713 said:

I've already seen it in action. I'm just wondering if the performance hit is higher on some cards compared to others.

Well, post-processing has been used for a while by companies like Nvidia and AMD, so as the tech progresses and drivers mature, the hit will become more and more minor.
