Call of Duty: Black Ops II Tested, Benchmarked

It doesn't matter who you are or what GPU brand you prefer: pretending that a re-release of the 7970 under a special name, the 'GHz Edition', is just a normal 7970 is ridiculous.
I do, however, agree with the GTX 570 testing/results, at least for 1080p/1200p and 1680x1050. A GTX 570, although now a mid-range GPU, can still keep up with a 580, or damn near, with a modest overclock, and it would be nice (I also have a GTX 570 in my HTPC) to see where it stacks up in newer games compared to a 560 Ti and 580.
 
"Meanwhile, Nvidia’s flagship was on average 5% faster than the Radeon HD 7970, with brief bright moments where it could be as much as 30% faster. And this is why the 7970 was cut down from $549 to it's current $449.
Fast forward to the present day and it'd appear that AMD is desperate to claim the bragging rights of offering the single fastest GPU money can buy"

Your words from the 7970 GHz edition review. (I'm assuming the Steve writing the article is you).
 
The 7000 series cards have a serious problem with DX9: black artifacts in many DX9 games.

In some games it happens rarely, in others more often, and in some, if you find the hot spots, it happens continuously. This bug has existed since the series was released. Since it still isn't fixed, I'm afraid it's a hardware issue. I sent my first 7950 in for RMA, they found it faulty, and now the replacement does the same thing. I wouldn't recommend anyone invest this much money in a 7000 series AMD card.

If the next official drivers don't include a solid fix, I'll send the card back for credit and go Nvidia, since no other options exist. AMD's finances are so bad that I wouldn't in the slightest expect them to confirm the issue and recall the cards.

Incidentally, I like that you test every new CoD. You could copy and paste the results from all the previous CoDs. The technical side of the game remains completely stagnant.
 
"Meanwhile, Nvidia’s flagship was on average 5% faster than the Radeon HD 7970, with brief bright moments where it could be as much as 30% faster. And this is why the 7970 was cut down from $549 to it's current $449. Fast forward to the present day and it'd appear that AMD is desperate to claim the bragging rights of offering the single fastest GPU money can buy"

Your words from the 7970 GHz edition review. (I'm assuming the Steve writing the article is you).

I am certain I said we were using the latest drivers, not those from six months ago when that review was published. Back then the margin was just 5% on average; now it's less with the new Cat 12.11 drivers.

Regardless of all that, your previous comments are still ridiculous. You are claiming that the 7970's performance was poor because on average it was just 5% slower than Nvidia's GTX 680, based on our findings six months ago. You have a very strange way of defining poor performance. As I said previously, the 7970 is now around 15% cheaper than the GTX 680, so even if it were still 5% slower it would still be better value. However, as it stands now it's slightly faster at 2560x1600.
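
To make the value point concrete, here is a rough back-of-the-envelope sketch of the performance-per-dollar comparison. The ~15% price gap and ~5% performance gap are the figures quoted above; the $500 GTX 680 price is only an assumed placeholder:

gtx680_price, gtx680_perf = 500.0, 1.00   # assumed ~$500 GTX 680 price (placeholder), baseline performance
hd7970_price = gtx680_price * 0.85        # ~15% cheaper, the figure stated above
hd7970_perf = 0.95                        # even assuming the 7970 were still 5% slower
value_ratio = (hd7970_perf / hd7970_price) / (gtx680_perf / gtx680_price)
print(f"7970 performance per dollar vs GTX 680: {value_ratio:.2f}x")  # ~1.12x

Under those assumptions the 7970 still comes out roughly 12% ahead on performance per dollar, which is the point being made here.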

The 7000 series cards have a serious problem with DX9: black artifacts in many DX9 games.

Never ever heard that one before :S I have been using a Radeon HD 7870 in my gaming system since day one. I have not upgraded it to something more powerful because all the games I play in my free time are actually DX9 games, StarCraft II for example, and I have never seen the “black artifacts” that you speak of.

Incidentally, I like that you test every new CoD. You could copy and paste the results from all the previous CoDs. The technical side of the game remains completely stagnant.

That's not exactly the case here, if you read the article. While the game is not overly demanding and certainly not the best looking game of the year (or even last year), it has been upgraded to support DX11 and a handful of effects, making it around 20 – 30% more demanding than the previous title.
 
ghasmanjr,

You are arguing semantics/naming. AMD could have called the HD7970 GHz the "HD7980", etc. It really doesn't matter. It's an official SKU that's priced to compete directly against the GTX680, the HD7970 is priced against the GTX670, and the HD7950 V2 / Boost is priced against the GTX660Ti.

Using your logic,

The GeForce4 Ti 4600 was an overclocked Ti 4400 card, and the 4400 was an overclocked Ti 4200 card.

The GeForce 6800 Ultra Extreme was just an overclocked GeForce 6800 Ultra, and the 6800U was just an overclocked 6800GT card.

You get the picture of how flawed your argument is.

Other websites have already tested the HD7970 GHz against after-market GTX680 cards (such as the Asus TOP at 1137MHz, the Gigabyte GTX680 SOC at 1137MHz, and even a GTX680 with a 1202MHz core clock), and the HD7970 GHz didn't get destroyed either:

http://www.xbitlabs.com/articles/gr...680-super-overclock-windforce-5x_6.html#sect2

In fairness, if you start talking about $475-550 GTX680s, you should test them against after-market 7970 GHz cards like the Asus Matrix HD7970:

http://www.hardwarecanucks.com/foru...70-3gb-matrix-platinum-edition-review-20.html

---

As far as COD: BO2 goes, the DX11 features seem almost non-existent. The game's performance is a bit low given the level of graphics, to be honest. Perhaps once the PS4 / next Xbox hit the scene, they will finally build a brand new engine for COD games with DX11 from the ground up, with tessellation, high resolution textures, global illumination, etc.

The TXAA filter is also extremely ugly in this game, blurring textures.
 
ghasmanjr,

The HD7970 GHz was already the fastest single-GPU card as early as June 2012 when the Catalyst 12.7 drivers were released. That gap has grown even more, up to 15% now at 2560x1600:
http://www.hardwarecanucks.com/foru...12-11-never-settle-driver-performance-17.html

Steve is right that the HD7970 / 7970 GHz is better value than the GTX680 today. A 1GHz HD7970 can now be purchased for $380 with 3 free games. At 1GHz it equals the GTX680, and you get free games and higher overclocking headroom:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814202008

Things change as prices change and new drivers are released. The GTX670/680 were the better buys until early summer, but since then AMD cards have reclaimed the price/performance and single-GPU performance crowns. That's how the GPU industry rolls though, always trading blows. Deal with it.
 
Nvidia lovers... it's time to accept it... the 7970 wins all the battles, but in the conclusions you put an Nvidia 660 first... hmm

I feel that Nvidia will win this when 310.xx comes out of beta. AMD's latest 12.11 beta only improves performance in BO2 with CrossFire. If they do not up the performance on single cards, then they will be in for a fight.
 
Fortunately for this ATI fanboy, the GTX 690 was not included in this test.
Perhaps TechSpot should invest in one?

Why is that? Crossfire 7970s are faster and cheaper, but then so is a pair of GTX 680 SLI cards. The GTX 690 is a terrible buy, unless for some reason you must have a pair of GPUs on a single card.

You're kidding right?

There are plenty of reviews on Anandtech and other places showing the 690 outperforming 7970s and dual 680s.

Whether it's because you have a bias for AMD, or a bias against dual-GPU cards, you've completely invalidated your and your site's credibility with this post.

The 690 is a terrible buy, yet you don't have a card to test or reference to prove this?

Okay...

***.
 
You're kidding right?

There are plenty of reviews on Anandtech and other places showing the 690 outperforming 7970s and dual 680s.

Whether it's because you have a bias for AMD, or a bias against dual-GPU cards, you've completely invalidated your and your site's credibility with this post.

The 690 is a terrible buy, yet you don't have a card to test or reference to prove this?

Okay...

***.

I'm not kidding because I do know what I am talking about. I have to question whether you know what you are talking about... how is the 690 faster than a pair of 680 SLI cards when that is all it is, just slightly underclocked?

Again, at 2560x1600 the Radeon HD 7970 is faster than the GeForce GTX 680 (on average) and two 7970s are faster than the 690, on average.
 
I'm not kidding because I do know what I am talking about. I have to question whether you know what you are talking about... how is the 690 faster than a pair of 680 SLI cards when that is all it is, just slightly underclocked?

Again, at 2560x1600 the Radeon HD 7970 is faster than the GeForce GTX 680 (on average) and two 7970s are faster than the 690, on average.

http://www.anandtech.com/show/5805/...view-ultra-expensive-ultra-rare-ultra-fast/11

That right there shows you're wrong about the 7970s.

I was wrong about the 680s and was referring to a test where two 690s outperformed three 680s; I'll try to find the link.

But with where drivers are today, I bet the 690s are even better.

Regardless, it's not a categorically scientific performance analysis when one of the main variables (the 690) is left out of the experiment. It's simply not a thorough analysis.

If it's a matter of not being able to acquire or afford one, I've got one sitting in a box you can test, if you're willing to sign and take financial responsibility should you fry it.

But until true benchmarks from the 690 are added in, this article is incomplete. And no subjective opinion (one not based on facts or current benches with current drivers), even from a tech journalist, is going to change that truth.

I've got no particular loyalty to either brand and own both Nvidia and AMD GPUs, but to toss off Nvidia's boutique/flagship card with the insouciance that it's not worth it because of its configuration, or because of your personal opinion, definitely discredits this article as incomplete and, worse, discredits this site's journalistic integrity.
 
You cannot hand-pick results from a single game, and if you are going to, don't do it from an old article. The 7970 is much faster in BF3 with the Cat 12.11 drivers.

But all that aside, because I don't care to argue that with you: the 690 costs more than a pair of 680 SLI cards and in most cases is slower, or at best offers the same performance. How is that a good buy?

I do not see how the article is incomplete because it doesn't include the GeForce GTX 690 :S This is a guide to show gamers who want to play Black Ops II what kind of hardware they need for playable performance, and they don't need the 690, so why must it be included? The GTX 680 is already overkill for this game, so again, why must the 690 be included?
 
http://www.anandtech.com/show/5805/...view-ultra-expensive-ultra-rare-ultra-fast/11

That right there shows you're wrong about the 7970s.

I was wrong about the 680s and was referring to a test where two 690s outperformed three 680s; I'll try to find the link.

But with where drivers are today, I bet the 690s are even better.

Regardless, it's not a categorically scientific performance analysis when one of the main variables (the 690) is left out of the experiment. It's simply not a thorough analysis.

If it's a matter of not being able to acquire or afford one, I've got one sitting in a box you can test, if you're willing to sign and take financial responsibility should you fry it.

But until true benchmarks from the 690 are added in, this article is incomplete. And no subjective opinion (one not based on facts or current benches with current drivers), even from a tech journalist, is going to change that truth.

I've got no particular loyalty to either brand and own both Nvidia and AMD GPUs, but to toss off Nvidia's boutique/flagship card with the insouciance that it's not worth it because of its configuration, or because of your personal opinion, definitely discredits this article as incomplete and, worse, discredits this site's journalistic integrity.
I'm not kidding because I do know what I am talking about. I have to question whether you know what you are talking about... how is the 690 faster than a pair of 680 SLI cards when that is all it is, just slightly underclocked?

Again, at 2560x1600 the Radeon HD 7970 is faster than the GeForce GTX 680 (on average) and two 7970s are faster than the 690, on average.

I will concede that, according to that same article, the 7970 did beat both the 680 and 690 in two games: Crysis: Warhead and Metro 2033. Both are games that use Crytek's engine, and both are games that everyone inside the industry agrees are taxing mostly due to poor optimization rather than graphics engine sophistication.

Again, the point isn't about which card is the best. The point is, you can't cavalierly leave out one of the top cards on the market with the insouciance that it doesn't deserve inclusion because it's dual-GPU. That same Anandtech review concedes that a single 690 will achieve 95-96% of the performance of two 680s.

I have one driving three Asus VG278H monitors at a bezel-corrected 6050x1080 on a Z68 and 2700K with 16GB of 2100MHz RAM, and I get on average 80-100fps in most games. The system gets taxed running in 3D, but I still maintain 35+ frames, or even better with AA off.

Bottom line, this article and evaluation are incomplete, and no subjective opinion, even from the tech journalist who authored the article, is going to change that truth. Your dual-GPU bias is like MotorTrend writing an article on fuel-efficient cars and leaving the BMW 335d out of the evaluation because it's a diesel when the rest of the cars are hybrids. It's not an objective evaluation to help the reader make an informed decision on what hardware to purchase.

No matter what you think, until we all actually see the benches for the 690 added in, your opinion and this article are invalid.

All you have to do to fix that is test the 690 and post the results with the rest. Until then, this is an invalid evaluation and article to anyone who's a professional.
 
You cannot hand-pick results from a single game, and if you are going to, don't do it from an old article. The 7970 is much faster in BF3 with the Cat 12.11 drivers.

But all that aside, because I don't care to argue that with you: the 690 costs more than a pair of 680 SLI cards and in most cases is slower, or at best offers the same performance. How is that a good buy?

I do not see how the article is incomplete because it doesn't include the GeForce GTX 690 :S This is a guide to show gamers who want to play Black Ops II what kind of hardware they need for playable performance, and they don't need the 690, so why must it be included? The GTX 680 is already overkill for this game, so again, why must the 690 be included?

How can you know this unless you bench it next to the 690 with the new 310.54 drivers?

How is it a good buy? Well, it's a good buy for anyone who wants SLI performance but only has a single slot available, or wants SLI power in a Shuttle form factor. It's a good buy for people who can't afford a truly liquid-cooled rig but want SLI performance without the extra heat and noise.

Are those valid enough reasons why it's a good buy over two 680s?

ETA: I don't disagree with you on your last point, but going by that logic, almost the whole article is redundant because it's a port from a console build. My old GTX 275s in SLI can maintain 60+fps on that engine.
 
How can you know this unless you bench it next to the 690 with the new 310.54 drivers?

How is it a good buy? Well, it's a good buy for anyone who wants SLI performance but only has a single slot available, or wants SLI power in a Shuttle form factor. It's a good buy for people who can't afford a truly liquid-cooled rig but want SLI performance without the extra heat and noise.

Are those valid enough reasons why it's a good buy over two 680s?

I know because I have two GTX 680 cards and those drivers did little to improve performance. The 310.61 drivers are the latest now anyway. As I said previously, I don't wish to argue this further with you; I have said all I want to and you have said plenty, and it's there for everyone to read.
 
I know because I have two GTX 680 cards and those drivers did little to improve performance. The 310.61 drivers are the latest now anyway. As I said previously, I don't wish to argue this further with you; I have said all I want to and you have said plenty, and it's there for everyone to read.

Steve, that still does not correlate, because that's your personal experience. Unless the actual 690 card is benched, conjecture is still conjecture and not fact.

Your bias against the 690 affected the objectivity of the evaluation and leaves the article incomplete and invalid. And with all due respect, you crossing your arms and storming off, refusing to discuss it further, does not negate that truth.

Thanks for the heads up on the new drivers. I've been distracted lately, playing the hell out of my PS Vita.

It's too bad that, in this particular case, personal bias on hardware discredits such a long-standing site's credibility.
 
Are you considering it solely to play Black Ops II or for a wide range of games? If Black Ops II is all you want to play, the GeForce GTX 660 will work just fine. It was roughly 20% slower than the Ti version in Modern Warfare 3 and just 14% slower on average across the 16 games that we tested it with in our original review.

Based on those numbers, you should see around 50fps at 2560x1600 and over 60fps at 1920x1200 when playing Black Ops II with the GeForce GTX 660.
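
If you want to sanity-check that estimate, here is the rough arithmetic as a minimal sketch. The ~20% scaling factor is the Modern Warfare 3 figure mentioned above, while the GTX 660 Ti frame rates used as the starting point are assumed placeholders rather than figures from this article:

# Rough GTX 660 estimate scaled from GTX 660 Ti numbers (Ti figures are placeholders)
gtx660ti_fps = {"2560x1600": 63, "1920x1200": 78}   # assumed placeholder Ti results
scaling = 0.80   # GTX 660 was roughly 20% slower than the Ti in Modern Warfare 3
for resolution, fps in gtx660ti_fps.items():
    print(f"{resolution}: roughly {fps * scaling:.0f} fps estimated for the GTX 660")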

Thanks for deleting my comments at my request, Steve. So here it is: I wanna pick up a card solely for Black Ops 2. I'm trying to decide within the Nvidia budget/mainstream lineup, anywhere between the GTX 650 Ti and the 660 Ti. According to Tom's Hardware the GTX 650 Ti is good for no more than 1080p / max settings / 4x MSAA. Would an overclocked version with the latest drivers allow me to play at 8x MSAA smoothly, or is that a gamble versus the GTX 660?

Your thoughts please and thanks,

Prog
 