Titanfall's performance is still iffy but this benchmark should answer some of your questions

Matthew DeCarlo


With frantic combat and fluid mechanics, Titanfall is precisely the sort of game you'd want to play on max quality with no hiccups -- perhaps even across multiple graphics cards and monitors. That's easier said than done without support for SLI and CrossFire, one of several issues that have affected many players and prevented us from completing TechSpot's usual performance analysis.

On the bright side, Respawn, Nvidia and AMD are working on updates and we're ready to test them as soon as they're available, though there's no solid estimate of when that will be. In the meantime, you should be able to determine approximately where your rig stands with these results of GeForce cards running Titanfall at 1920x1200 on high with 4xAA and 16xAF.

At those settings, the GTX 580 and GTX 760 comfortably exceeded 60fps, while the GTX 650 Ti Boost and GTX 750 Ti managed 40fps and 45fps respectively. Those numbers aren't too surprising considering the minimum specs call for an HD 4770 or 8800 GT, so most modern cards should be able to handle the game in some capacity, even if it means reducing certain settings. Stay tuned for our full breakdown.


 
Unfortunately this is only an average. Even with a 670 WF3 I get dips down to 40-50fps during a lot of action, or on that dreaded jungle level. *shudders*
 
I play at max settings on my 650 Ti Boost, i5-3570K and 8GB RAM. Very good game, love it all, but it is quite badly optimized: I get around 60fps, but it used up to 6GB of RAM at times, and during heavy titan fights I dropped to about 30fps.
 
Is "high" the max settings for Titanfall, or is there an ultra graphical option?
That seems like an odd assortment of gpus to test on a new game like Titanfall. I understand testing for older systems, but a gtx 480 instead of a 780 or even a 770?
 
Is "high" the max settings for Titanfall, or is there an ultra graphical option?
That seems like an odd assortment of gpus to test on a new game like Titanfall. I understand testing for older systems, but a gtx 480 instead of a 780 or even a 770?
There is high (which is maximum) for everything except texture resolution which can go from high to insane.
 
Is "high" the max settings for Titanfall, or is there an ultra graphical option?
That seems like an odd assortment of gpus to test on a new game like Titanfall. I understand testing for older systems, but a gtx 480 instead of a 780 or even a 770?
There is high (which is maximum) for everything except texture resolution which can go from high to insane.
What resolution do you play at?
 
I play at max settings on my 650 Ti Boost, i5-3570K and 8GB RAM. Very good game, love it all, but it is quite badly optimized: I get around 60fps, but it used up to 6GB of RAM at times, and during heavy titan fights I dropped to about 30fps.
Yeah, it is a port after all.
 
Excellent. I have a GTX 760 SC and play at 1080p, so the performance should be awesome (and by the time I play this, the game and drivers should be perfectly optimized).
 
Is "high" the max settings for Titanfall, or is there an ultra graphical option?
That seems like an odd assortment of gpus to test on a new game like Titanfall. I understand testing for older systems, but a gtx 480 instead of a 780 or even a 770?
The reason for the GTX 480 is probably to compare it directly to the GTX 750 Ti, since Nvidia claimed they had similar performance, and to see if the GTX 750 Ti can really outperform the Xbox One, as Nvidia also claimed.
 
Is "high" the max settings for Titanfall, or is there an ultra graphical option?
That seems like an odd assortment of gpus to test on a new game like Titanfall. I understand testing for older systems, but a gtx 480 instead of a 780 or even a 770?

It's a small snapshot of our normal 20+ GPU lineup; we have of course tested all of the GeForce 700 series cards. We are waiting for an AMD vsync fix before we show all the results in an upcoming article.

The game is capped at 60 fps - how is this benchmark even relevant?

How is it relevant? If the game is capped at 60fps, how did we just show you performance above 60fps from actual multiplayer gameplay? ;) Vsync locks the frame rate to your monitor's refresh rate, so if that exceeds 60Hz you will see more than 60fps with vsync turned on.
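To make that point concrete, here's a minimal sketch in Python, purely illustrative and not the game's actual renderer logic; the 90fps and 120Hz figures are assumptions for the example. It shows why results above 60fps are possible on a high-refresh monitor even with vsync on.

# Rough illustration (not game code): with vsync on, frames are shown on the
# monitor's refresh ticks, so the displayed rate is capped by the refresh
# rate, not by a fixed 60fps. Simplified: real vsync without triple
# buffering actually snaps to divisors of the refresh rate.
def displayed_fps(render_fps, refresh_hz, vsync=True):
    """Return the frame rate the player actually sees (hypothetical helper)."""
    if not vsync:
        return render_fps               # uncapped: whatever the GPU can push
    return min(render_fps, refresh_hz)  # vsync: capped at the refresh rate

# A GPU rendering ~90fps shows 60fps on a 60Hz panel with vsync on,
# but the full 90fps on a 120Hz panel.
print(displayed_fps(90, 60))    # 60
print(displayed_fps(90, 120))   # 90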
 
I don't know what it is, but it feels unoptimized. GTX 660 SC, no AA, certain settings on medium, and I STILL get frame rate drops to about 45fps. This is on an i7-2600K at 4.4GHz. Something's not right.
 
I'm curious how this will run on my FX-8320 and GTX 770.
I have a GTX 770 and it plays very well with all settings at the highest limit and 1080p (native). However, I adjusted anti-aliasing down a notch, same as the texture filtering, just to make sure I don't get crazy frame rate drops... I'm not watching my FPS the entire time, but I can't tell that any drop in FPS occurs on my system. I btw have 16GB of RAM, so the game eating up 6GB isn't a problem because I have plenty left.
 
The beta was alright on a GTX 460 with an i7 920 at stock settings. By alright I mean high textures, 1680x1050, AA off, anisotropic high, and effects and shadows on medium. It played fluidly and was a good trade-off. I'd get a different card, but it needs to be short and have the power sockets on the back due to my case, and preferably a closed, blower-style fan design that exhausts air out the back of the case, as the graphics card sits right at the bottom.
 
I'm running an i5-4670, 16GB RAM and a GTX 760 with a Samsung Evo SSD at 2560x1080 (LG ultrawide), 4x MSAA, everything on high, 16x anisotropic, and it runs very well. Smooth. Not sure of the frame rate, however; something I need to check. What I have noticed is that my HUD is almost entirely missing no matter if I set it to large or small... I will most likely drop to 1080p to regain my HUD and increase performance until they patch it.
 
I can max everything with my 2600K at 4.0GHz, GTX 670 and 32GB of 1866MHz RAM apart from the AA, which I had to drop to 8x when I found out texture quality had an insane setting, to keep the game at a solid 1080p60.
 
I have a 2500k with a Radeon 290, and I have everything maxed as high as it can go with no performance issues whatsoever. The frame rate is very fluid.
 
I am getting around 50fps in most areas with my 9600 GSO at 1600x900 on medium, with no AF but 4x AA. It definitely feels like it needs a bit of work, but it's fun nonetheless.
 
I have a GTX 770 and it plays very well with all settings at the highest limit and 1080p (native). However, I adjusted anti-aliasing down a notch, same as the texture filtering, just to make sure I don't get crazy frame rate drops... I'm not watching my FPS the entire time, but I can't tell that any drop in FPS occurs on my system. I btw have 16GB of RAM, so the game eating up 6GB isn't a problem because I have plenty left.

What kind of processor do you have? The GTX 770 should be overkill but I'm wondering how the FX-8320 does.
 
Is anybody else getting connection issues when using the Insane texture resolution? Every time I bump my textures from High to Insane, it loads slowly, and then as soon as I pop into the game, it says that the connection was lost. I've got an i7-920, 6GB RAM, and GTX 670 2GB.
 
Is anybody else getting connection issues when using the Insane texture resolution? Every time I bump my textures from High to Insane, it loads slowly, and then as soon as I pop into the game, it says that the connection was lost. I've got an i7-920, 6GB RAM, and GTX 670 2GB.

Probably a VRAM issue. The developer has stated that Insane textures require a 3GB frame buffer, so that could be your problem with the 2GB GTX 670.
 