Splinter Cell: Blacklist Tested, Benchmarked

Julio Franco

Read the full article at:
[newwindow=https://www.techspot.com/review/706-splinter-cell-blacklist-benchmarks/]https://www.techspot.com/review/706-splinter-cell-blacklist-benchmarks/[/newwindow]

Please leave your feedback here.
 
Why didn't you post screenshots at different settings like medium/high/ultra, with effects on/off, so we can see whether there's actually a big difference?
 
When I realized this game was running on a 9-year-old engine with just some new bells and whistles, I thought: "no way Steve will do a performance article on this." And here we are, detailed as always. Thanks a lot, Steve.
My two cents: not only in this game, but most of the time, FXAA adds so much blurriness to the textures that I don't think it ever pays off.
 
I enjoy these benchmarks, but how come the 670 4GB edition is never tested?

Because it's a pointless product.

When I realized this game was running on a 9-year-old engine with just some new bells and whistles, I thought: "no way Steve will do a performance article on this." <snip>

If they didn't add in all the fancy DX11ness, we wouldn't have bothered.

I think it's because that amount of memory on a 670 is a waste; the GPU isn't fast enough to make use of it.

Secondly, I have a feeling it will produce the same numbers as the 2GB 670.

If you own this card, you will have to look at the 670 2GB results and use your imagination for the rest.


You are 100% correct on all counts.
 
I miss these benchmarks; good to see one again. It is interesting to note the CPU performance: overclocking really helps. But more importantly, the AMD X6 1100T used to be a popular choice, and it performs at almost half the level of the i7-4770K. I guess it shows that it is time for an upgrade.
 
Just FYI, you should not recommend an i3 for playing the game in DX11.

I have an i3-2100, and this game does not make use of Hyper-Threading. Consequently, the two cores are constantly pegged at 100% in DX11 mode, no matter the graphics settings. There is terrible stuttering that doesn't show up in frames-per-second analysis, indicative of long frametimes. Some less CPU-intensive levels are playable, but most are not, and the worst stuttering occurs when looking at the SMI area of the plane (an easily testable area). I'm not asking you guys to go back and test again, or even change the review, but this is a major issue for playability.

Switching to DX9 solves the issue, but blocks access to Ultra shadows, HBAO, Tessellation, TXAA and MSAA.
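
To illustrate why stutter like that can hide behind a healthy-looking average, here's a tiny sketch (the frametimes are fabricated, purely to show the shape of the problem):

[code]
# Fabricated frametimes in ms for ~1 second of play: mostly fast frames,
# plus a few long stalls of the kind a pegged dual-core produces.
frametimes_ms = [10.0] * 55 + [150.0, 95.0, 100.0]

avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
worst_ms = max(frametimes_ms)

print(f"average: {avg_fps:.0f} fps")      # ~65 fps -- looks perfectly playable
print(f"worst frame: {worst_ms:.0f} ms")  # 150 ms -- a very visible hitch
[/code]

An fps counter built from one-second averages reports the ~65 fps and never shows the three frames that each took a tenth of a second or more.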
 
Because it's a pointless product.

If they didn't add in all the fancy DX11ness, we wouldn't have bothered.

You are 100% correct on all counts.

Both of you guys are incorrect. The core clock specification alone is 66MHz higher than the base 2GB 670's. It's fine, I know they are close, but still... there are so many video card variations tested that I figured I'd give it a shot.
 
Both of you guys are incorrect. The core clock specification alone is 66MHz higher than the base 2GB 670's. It's fine, I know they are close, but still... there are so many video card variations tested that I figured I'd give it a shot.

Lol, and 66MHz is going to do what, give you 0.8-2 extra fps?

Now you are just reaching, bro.

I have an idea: you should buy one and send it in so they can test it and add it to the graph.
 
Lol, and 66MHz is going to do what, give you 0.8-2 extra fps?

Now you are just reaching, bro.

I have an idea: you should buy one and send it in so they can test it and add it to the graph.

Wow... I'm only replying to correct you for other readers. The core clock spec is just one spec that's different. How does 2GB of extra video card RAM not matter either? And I truly am asking you, since you said that "the core clock is not fast enough to make use of it".
 
Wow... I'm only replying to correct you for other readers.

The only thing I know for sure is that they can't be bothered with testing every iteration of every card. At some point the consumer will need to estimate their own card's potential from the ones tested. Do you honestly think there will be a groundbreaking difference between the 2GB and 4GB models?

Let me make an observation for the other readers as well: use the review as a guideline, not concrete science. If you read the review and don't know roughly where your card will fit in, you are not trying hard enough.
 
@ Guest with the i3-2100

The review tested with the newer-gen i3-3220. It might not seem like a big difference, but it's 200MHz faster on a better architecture, and you'd be surprised how often that is the difference between playable and unplayable frame rates.

@ The 670 4GB owner

The reason for that card is to SLI it with another one; by itself it's just an overpriced 670. 66MHz faster, lol, yeah, you tell us.

As for the game itself, I was lucky enough to do some beta testing for it and saw how unimpressively generic a Tom Clancy game it is, but aren't they all? Anyway, don't spend $60 on this game; it's absurd that they feel they can ask $10 more than the rest of the games on PC, using a 9-year-old engine to boot...
 
The only thing I know for sure is that they can't be bothered with testing every iteration of every card. At some point the consumer will need to estimate their own card's potential from the ones tested. Do you honestly think there will be a groundbreaking difference between the 2GB and 4GB models?

Let me make an observation for the other readers as well: use the review as a guideline, not concrete science. If you read the review and don't know roughly where your card will fit in, you are not trying hard enough.

Sorry that it bothered anyone or disrupted the thread. Nope, I didn't think it would be groundbreaking... I just thought it would be worth testing against the base model.
 
Wow... I'm only replying to correct you for other readers. The core clock spec is just one spec that's different. How does 2GB of extra video card RAM not matter either? And I truly am asking you, since you said that "the core clock is not fast enough to make use of it".

I know, but there is nothing to correct in my post.

I will let the rest of the community tell you why 4GB of RAM is a waste on that card. Also, I said the GPU is not fast enough to use all that memory; your reply misquotes that as if I said the clock speed was not fast enough. You brought up clock speed in your second post; I never mentioned anything about it.

When I said the GPU isn't fast enough, I meant that architecture, or the whole 600 series. Does the 680 do any better with 4GB of RAM?

As someone else also posted, there is value for SLI, since memory doesn't double in that setup; both GPUs copy the same contents into memory, so 4GB per card makes sense there.
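
To put the mirroring point in toy terms (this is my simplified understanding of alternate-frame-rendering SLI, not anything vendor-documented):

[code]
def effective_vram_gb(per_card_gb, num_cards):
    """Under alternate-frame rendering, each GPU keeps its own copy of the
    same textures and buffers, so memory does not pool across cards."""
    return per_card_gb  # independent of num_cards

print(effective_vram_gb(2, 2))  # 2 -- two 2GB 670s still give a 2GB budget
print(effective_vram_gb(4, 2))  # 4 -- the one case where the 4GB card helps
[/code]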

Sorry if I came off sounding hostile.
 
Just FYI, you should not recommend an i3 for playing the game in DX11. <snip> There is terrible stuttering that doesn't show up in frames-per-second analysis, indicative of long frametimes.

Obviously we didn't find this, or we would have said so. Look at the frame rates; our Core i3 was clearly performing very well.

When I said the GPU isn't fast enough, I meant that architecture, or the whole 600 series. Does the 680 do any better with 4GB of RAM?

The answer is no. The GTX 680 cannot utilize 4GB of memory, even at 2560x1600. The only resolution where the extra memory starts to show performance benefits is the triple-monitor resolution of 7680x1600, and there even 3-way GTX 680 SLI can't provide playable performance in the latest games.

Both of you guys are incorrect. The core clock specification alone is 66MHz higher than the base 2GB 670's. It's fine, I know they are close, but still... there are so many video card variations tested that I figured I'd give it a shot.

As far as I am aware, the Nvidia spec says that both the 2GB and 4GB versions of the GTX 670 run a core clock speed of 915MHz; I assume you just have a factory-overclocked card. Either way, a 66MHz speed bump does not warrant including the card. Use your imagination and add the extra frame or two.

I am glad you are keen for us to add more cards, and thank you for reading the article and commenting. Please understand that we didn't include the card because we feel it's a pointless product, and more importantly, the GTX 670 2GB was included, which should give you a very strong guideline to go off.
 
The VRAM thing has been completely spun out of control by the millions of noob PC gaming/modding enthusiasts. I see these kiddies everywhere on tech sites/forums recommending 3GB/4GB VRAM GPUs for 1080p gaming, and it drives me nuts. (Yes, I know there are examples of games that can use more at this resolution, not counting mods: very few, and only with specific settings that don't make a visual difference anyway. And I understand future-proofing.)
It's plausible to need 4GB of VRAM at 1440p/1600p and above, and even then 2GB does VERY well. I thought this review covered all the bases, and I loved the CPU frequency chart. The comments about an old game engine are hilarious; engines can evolve. The Corvette is old too, but the latest C7 Stingray shows that things improve over time.

Remember WoW when it first came out?
My laptop can't max it out anymore at 900p, and my laptop is quick (i5-2430M, 144-shader 1GB GT 550M @ 600MHz core, 6GB DDR3).
 
Steve, do you think you guys could include the AMD Athlon X4 750K in the benchmarks as well? That is the king of sub-$100 CPUs, IMO.
 
Dude with the i3-2100 here. Okay, after hours of testing in DX11, I found the issue. The game exe only sets affinity for cores 0 and 2 instead of All Processors. I can't figure out why, and I have to change it every time I launch the game, but it's smooth now, using all 4 "cores" at 80-90%.

Sorry about jumping to conclusions about this being the case for all systems with i3s. I noticed some other bugs in the game interface, too; I hope a polished patch is coming.
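
For anyone else hitting this, here's a rough sketch of how you could reset the affinity from a script instead of doing it by hand in Task Manager every launch. It's Python with the psutil package, and the exe name is a guess on my part; check Task Manager for the real one:

[code]
# pip install psutil
import psutil

GAME_EXE = "Blacklist_game.exe"  # hypothetical name; check Task Manager for yours

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == GAME_EXE.lower():
        # The game pins itself to logical cores 0 and 2, i.e. one thread per
        # physical core on a 2C/4T i3. Open it back up to every logical core.
        proc.cpu_affinity(list(range(psutil.cpu_count())))
        print(f"Affinity reset for PID {proc.pid}")
[/code]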
 
Dude with the i3-2100 here. Okay, after hours of testing in DX11, I found the issue. The game exe only sets affinity for cores 0 and 2 instead of All Processors. <snip>

We definitely didn't have that issue. Did you install the day one patch?
 
Dude with the i3-2100 here. <snip> I can't figure out why, and I have to change it every time I launch the game, but it's smooth now, using all 4 "cores" at 80-90%.
Could it be that the i3 is not a 4-core CPU? I mean, two cores with Hyper-Threading is not four cores.
 
I tried reinstalling the patch today, just to make sure nothing went wrong when I installed it on day one, but it just told me the game was already patched. Also, it looks like the game resets affinity back to just two cores whenever a mission is started, even though I already changed it after launching the game.

I opened a thread on Ubi's Forum...
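
Since the game keeps clawing the affinity back at every mission load, a looping version of the earlier sketch might hold it in place (same caveat about the exe name being a guess):

[code]
# pip install psutil
import time
import psutil

GAME_EXE = "Blacklist_game.exe"  # hypothetical name, as before
ALL_CORES = list(range(psutil.cpu_count()))

while True:
    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() == GAME_EXE.lower():
            try:
                if proc.cpu_affinity() != ALL_CORES:
                    proc.cpu_affinity(ALL_CORES)  # undo the game's reset
            except psutil.NoSuchProcess:
                pass  # game exited between the scan and the call
    time.sleep(5)  # mission loads take far longer than this poll interval
[/code]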
 
Dude with the i3-2100 here. <snip> I can't figure out why, and I have to change it every time I launch the game, but it's smooth now, using all 4 "cores" at 80-90%.
Could it be that the i3 is not a 4-core CPU? I mean, two cores with Hyper-Threading is not four cores.
Yes, hence the inverted commas. His problem is that the game isn't taking advantage of his Hyper-Threading, a problem Steve didn't encounter.

Btw, great review as always :)
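
If anyone wants to see the logical/physical split on their own machine, psutil reports both counts:

[code]
import psutil

print(psutil.cpu_count(logical=True))   # 4 on an i3-2100: logical processors
print(psutil.cpu_count(logical=False))  # 2: actual physical cores
[/code]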
 