Splinter Cell: Blacklist Tested, Benchmarked

August 27, 2013, 3:24 AM

Equipped with his iconic night vision goggles and a new counter-terror agency, Sam Fisher has returned to foil another anti-US plot in Ubisoft's sixth Splinter Cell game. Titled Blacklist, the latest entry is a sequel to 2010's Splinter Cell: Conviction and attempts to find a better balance between stealth and action while reintroducing some franchise favorites.

Like Conviction, Blacklist was built using LEAD, a heavily modified version of Unreal Engine 2.5 with Havok physics that Ubisoft seems to prefer over UE3. PC gamers can look forward to a typical array of graphics options including TXAA antialiasing, soft shadows, horizon-based ambient occlusion and advanced DX11 tessellation -- all of which we plan to test with nearly two dozen graphics setups and a handful of processors...

Read the complete article.




User Comments: 26

Maximum Payne Maximum Payne said:

Why didn't you post screenshots at different settings (medium/high/ultra, effects on/off) so we could see whether there's actually a big difference?

lmike6453 said:

I enjoy these benchmarks, but how come the 670 4GB edition is never tested?

1 person liked this | Alpha Gamer Alpha Gamer said:

When I realized this game was running on a nine-year-old engine with just some new bells and whistles, I thought: "no way Steve will do a performance article on this". And here we are, detailed as always. Thanks a lot, Steve.

My two cents: not only in this game, but most of the time, FXAA adds so much blurriness to the textures that I don't think it ever pays off.

3 people like this | Lionvibez said:

I enjoy these benchmarks, but how come the 670 4GB edition is never tested?

I think it's because that amount of memory on a 670 is a waste; the GPU isn't fast enough to make use of it.

Secondly, I have a feeling it will produce the same numbers as a 2GB 670.

If you own this card you will have to look at the 670 2GB results and use your imagination for the rest.

3 people like this | Steve Steve, Staff, said:

I enjoy these benchmarks, but how come the 670 4GB edition is never tested?

Because it's a pointless product.

When I realized this game was running on a nine-year-old engine with just some new bells and whistles, I thought: "no way Steve will do a performance article on this". And here we are, detailed as always. Thanks a lot, Steve.

My two cents: not only in this game, but most of the time, FXAA adds so much blurriness to the textures that I don't think it ever pays off.

If they didn't add in all the fancy DX11ness we wouldn't have bothered.

I think it's because that amount of memory on a 670 is a waste; the GPU isn't fast enough to make use of it.

Secondly, I have a feeling it will produce the same numbers as a 2GB 670.

If you own this card you will have to look at the 670 2GB results and use your imagination for the rest.

You are 100% correct on all counts.

1 person liked this | JC713 JC713 said:

I miss these benchmarks. Good to see one again. It is interesting to note the CPU performance: overclocking really helps. More importantly, the AMD X6 1100T used to be a popular choice and performs at almost half the speed of the i7-4770K. I guess it shows that it's time for an upgrade.

hahahanoobs hahahanoobs said:

lol@X6 1100T performance

1 person liked this | Guest said:

Just FYI, you should not recommend an i3 for playing the game in DX11.

I have an i3-2100 and this game does not make use of Hyperthreading. Consequently, the two cores are constantly pegged at 100% in DX11 mode, no matter the graphic settings. There is terrible stuttering that doesn't show up in frames-per-second analysis, indicative of long frametimes. Some less-CPU-intensive levels are playable, but most are not, and the worst stuttering occurs when looking at the SMI area of the plane (easily testable area). Not asking you guys to go back and test again, or even change the review, but this is a major issue for playability.

Switching to DX9 solves the issue, but blocks access to Ultra shadows, HBAO, Tessellation, TXAA and MSAA.

lmike6453 said:

Because it's a pointless product.

If they didn't add in all the fancy DX11ness we wouldn't have bothered.

You are 100% correct on all counts.

Both of you guys are incorrect. The core clock specification alone is 66MHz higher than the base 2GB 670. It's fine, I know that they are close, but still... there are so many video card variations tested that I figured I'd give it a shot.

Lionvibez said:

Both of you guys are incorrect. The core clock specification alone is 66MHz higher than the base 2GB 670. It's fine, I know that they are close, but still... there are so many video card variations tested that I figured I'd give it a shot.

lol, and 66MHz is going to do what, give you 0.8-2 extra fps?

Now you are just reaching, bro.

I have an idea: you should buy one and send it in so they can test it and add it to the graph.

lmike6453 said:

lol, and 66MHz is going to do what, give you 0.8-2 extra fps?

Now you are just reaching, bro.

I have an idea: you should buy one and send it in so they can test it and add it to the graph.

Wow... I'm only replying to correct you for other readers. The core clock spec is just one spec that's different. How does 2GB of extra vid card RAM not matter either? And I truly am asking you, since you said that "the core clock is not fast enough to make use of it".

1 person liked this | cliffordcooley cliffordcooley, TechSpot Paladin, said:

Wow... I'm only replying to correct you for other readers.
The only thing I know for sure is that they can't be bothered with testing every iteration of every card. At some point the consumer will need to estimate their own card's potential from the ones tested. Do you honestly think there will be a groundbreaking difference between the 2GB and 4GB models?

Let me make an observation for the other readers as well: use the review as a guideline, not concrete science. If you read the review and don't know roughly where your card will fit in, you are not trying hard enough.

1 person liked this | Adhmuz Adhmuz, TechSpot Paladin, said:

@ Guest with the i3-2100

The review tested with the newer-gen i3-3220. It might not seem like a big difference, but it's 200MHz faster with a better architecture; you'd be surprised how much that can take a game from unplayable fps to playable.

@ The 670 4GB owner

The reason for that card is to SLI it with another one; by itself it's just an overpriced 670. 66MHz faster, lol, yeah you tell us.

As for the game itself, I was lucky enough to do some beta testing for it and saw how unimpressively generic a Tom Clancy game it is, but aren't they all? Anyway, don't spend $60 on this game. It's absurd that they feel they can ask $10 more than the rest of the games on PC, and use a nine-year-old engine to boot...

lmike6453 said:

The only thing I know for sure is that they can't be bothered with testing every iteration of every card. At some point the consumer will need to estimate their own card's potential from the ones tested. Do you honestly think there will be a groundbreaking difference between the 2GB and 4GB models?

Let me make an observation for the other readers as well: use the review as a guideline, not concrete science. If you read the review and don't know roughly where your card will fit in, you are not trying hard enough.

Sorry that it bothered you or disrupted the thread. No, I didn't think it would be groundbreaking; however, I thought it would be worth testing against the base model.

ikesmasher said:

How does 2GB of extra vid card RAM not matter either?

Just because it has the extra memory doesn't mean it uses it.

Lionvibez said:

Wow... I'm only replying to correct you for other readers. The core clock spec is just one spec that's different. How does 2GB of extra vid card RAM not matter either? And I truly am asking you, since you said that "the core clock is not fast enough to make use of it".

I know, but there is nothing to correct in my post.

I will let the rest of the community tell you why 4GB of RAM is a waste on that card. Also, I said the GPU is not fast enough to use all the memory; your reply misquotes that as if I said the clock speed was not fast enough. You brought up clock speed in your second post, I never mentioned anything about it.

When I said the GPU isn't fast enough, I meant that architecture, or the whole 600 series. Does the 680 do any better with 4GB of RAM?

As someone else also posted, for SLI there is value: 4GB makes sense there because both GPUs copy the same contents into memory, so you aren't doubling the usable amount of RAM in that setup.

Sorry if I came off sounding hostile.

5 people like this | Steve Steve, Staff, said:

Just FYI, you should not recommend an i3 for playing the game in DX11.

I have an i3-2100 and this game does not make use of Hyperthreading. Consequently, the two cores are constantly pegged at 100% in DX11 mode, no matter the graphic settings. There is terrible stuttering that doesn't show up in frames-per-second analysis, indicative of long frametimes. Some less-CPU-intensive levels are playable, but most are not, and the worst stuttering occurs when looking at the SMI area of the plane (easily testable area). Not asking you guys to go back and test again, or even change the review, but this is a major issue for playability.

Switching to DX9 solves the issue, but blocks access to Ultra shadows, HBAO, Tessellation, TXAA and MSAA.

Obviously we didn't encounter this or we would have said so. Look at the frame rates, our Core i3 was clearly performing very well.

When I said the GPU isn't fast enough, I meant that architecture, or the whole 600 series. Does the 680 do any better with 4GB of RAM?

The answer is no. The GTX 680 cannot utilize 4GB of memory, even at 2560x1600. The only resolution where the extra memory starts to show performance benefits is the triple-monitor resolution of 7680x1600, and there even 3-way GTX 680 SLI can't provide playable performance in the latest games.

Both of you guys are incorrect. The core clock specification alone is 66MHz higher than the base 2GB 670. It's fine, I know that they are close, but still... there are so many video card variations tested that I figured I'd give it a shot.

As far as I am aware, the Nvidia spec says that both the 2GB and 4GB versions of the GTX 670 run a core clock of 915MHz. Either way, a 66MHz speed bump does not warrant including the card; I assume you just have a factory-overclocked card. Use your imagination and add the extra frame or two.

I am glad you are keen for us to add more cards, and thank you for reading the article and commenting. Please understand that the reason we didn't include the card is that we feel it's a pointless product, and more importantly the GTX 670 2GB was included, which should give you a very strong guideline to go by.

1 person liked this | amstech amstech, TechSpot Enthusiast, said:

The VRAM thing has been completely spun out of control by the millions of noob PC gaming/modding enthusiasts. I see these kiddies everywhere on tech sites/forums recommending 3GB/4GB VRAM GPUs for 1080p gaming and it drives me nuts. (Yes, I know there are games that can use more at this resolution, not counting mods -- very, very few, with specific settings that don't make a visual difference anyway -- and I understand future-proofing.)

It's plausible to need 4GB of VRAM at 1440p/1600p and above, and even then 2GB does VERY well. I thought this review covered all the bases and loved the CPU frequency chart. The comments about an old game engine are hilarious; engines can evolve. The Corvette is old too, but the latest version, the C7 Stingray, shows that things improve over time.

Remember WoW when it first came out?

My laptop can't max it out anymore at 900p, and my laptop is quick (i5-2430M, 144-shader 1GB GT 550M @ 600MHz core, 6GB DDR3).

JC713 JC713 said:

@Steve, do you think you could include the AMD Athlon X4 750K in the benchmarks as well? That is the king of <$100 CPUs IMO.

Guest said:

Dude with the i3-2100 here. Okay, after hours of testing in DX11, I found the issue. The game exe only sets affinity for cores 0 and 2 instead of All Processors. I can't figure out why, and I have to change it every time I launch the game now, but it's smooth now - using all 4 "cores" at 80-90%.

Sorry about jumping to conclusions about this being the case for all systems with i3s. Noticed some other bugs in the game interface, too - hope a polished patch is coming.
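For anyone following along, the "cores 0 and 2" restriction is just a processor affinity bitmask: bit i set means the OS may schedule the process on logical processor i. A minimal sketch of how such masks are built (the helper name is mine, purely for illustration):

```python
def affinity_mask(cores):
    """Build a processor affinity bitmask from logical processor indices."""
    m = 0
    for c in cores:
        m |= 1 << c  # set bit c: allow scheduling on logical processor c
    return m

# What the game reportedly sets: cores 0 and 2 only.
print(bin(affinity_mask([0, 2])))        # 0b101
# What an i3 (2 cores + Hyper-Threading = 4 logical processors) should get.
print(bin(affinity_mask([0, 1, 2, 3])))  # 0b1111
```

On Windows, a mask can also be applied at launch time with the built-in `start` command, e.g. `start "" /affinity F <path-to-game-exe>` in a batch file (F being the hex mask for all four logical processors), which would at least save re-setting affinity in Task Manager on every launch.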

Steve Steve, Staff, said:

Dude with the i3-2100 here. Okay, after hours of testing in DX11, I found the issue. The game exe only sets affinity for cores 0 and 2 instead of All Processors. I can't figure out why, and I have to change it every time I launch the game now, but it's smooth now - using all 4 "cores" at 80-90%.

Sorry about jumping to conclusions about this being the case for all systems with i3s. Noticed some other bugs in the game interface, too - hope a polished patch is coming.

We definitely didn't have that issue, did you install the day 1 patch?

cliffordcooley cliffordcooley, TechSpot Paladin, said:

Dude with the i3-2100 here. <snip> I can't figure out why, and I have to change it every time I launch the game now, but it's smooth now - using all 4 "cores" at 80-90%.
Could it be that the i3 is not a 4-core CPU? I mean, two cores with hyper-threading is not four cores.

Guest said:

Tried reinstalling the patch today, just to make sure nothing went wrong when I installed it on day 1, but it just told me the game was already patched. Also, it looks like the game is resetting affinity back to just 2 cores whenever a mission is started - even though I already changed it after launching the game.

I opened a thread on Ubi's Forum...

LukeDJ LukeDJ said:

Dude with the i3-2100 here. <snip> I can't figure out why, and I have to change it every time I launch the game now, but it's smooth now - using all 4 "cores" at 80-90%.
Could it be that the i3 is not a 4-core CPU? I mean, two cores with hyper-threading is not four cores.

Yes, hence the inverted commas around "cores". His problem is that the game isn't taking advantage of his hyper-threading, a problem which Steve didn't encounter.

Btw, great review as always

Gorge09 said:

Hi, I have an i5-3210 and have the exact same problem: both the DX9 and DX11 .exe only use cores 0 and 2, and I have to change it every time I launch the game in order to play smoothly. I really don't get it.

Also, I still get some occasional lag when running the DX11 version, but since my graphics card isn't that great (HD 7670M; it's a laptop), DX9 is good enough.

Guest said:

Yep -- this issue affects all hyper-threaded dual-cores, mobile and desktop. The stuttering isn't noticeable on the first level because it's not very CPU-heavy, but it's definitely noticeable in the SMI area of the Paladin (basically the game's menu).

Ubi Support has forwarded the issue to the dev team but has no more information at this time. And today's patch doesn't include a fix for the CPU affinity issue. I still have to change to All Cores in Task Manager every time a new area loads.

