AMD and Nvidia are at it again. The two reigning champs in the market for video game graphics have been fighting since late last month when some performance issues on the PC version of Watch Dogs kicked up a fresh controversy. And given that AMD is still talking about the issue publicly, it doesn't look like things are going to settle down anytime soon.

Are you one of the people perplexed by all the sound and fury emanating from PC gaming forums? Don't worry: I am, too. To help us all get up to speed, I prepared a handy guide to the main talking points here.

Have they ever been at peace with one another?

Not really, no. They're sort of like the Coke vs. Pepsi of video games. That comparison is all the more relevant considering that some of their other competitors, like Intel, have captured a much larger portion of the overall graphics market by appealing to PC users who don't need to play serious games and thus don't care as much about spending upwards of $300 for the best graphics card imaginable. Something similar happened when Pepsi and Coke locked horns so intensely that they didn't notice other, smaller competitors had started making little things called energy drinks.

Is there a substantial difference between their cards?

It depends on who you ask. Last year when we polled our readers, the Kotaku community seemed to overwhelmingly favor Nvidia cards. That doesn't say anything about performance, mind you—just people's preferences. But market share could be a significant issue here, since Nvidia has been beating out its closest competitor specifically in the PC realm in recent years. Here's a quick description of Nvidia's current, enviable position from the financial site The Motley Fool:

NVIDIA has benefited from the growing PC gaming market, with revenue from its GeForce gaming GPUs rising by 15% in fiscal 2014. This growth came during a continuing decline in the PC market as a whole, with NVIDIA specializing in one of the few areas that have remained immune to the PC sales slump. NVIDIA's share of the discrete GPU market has also been on the rise, with the company now commanding around 65% of the market. NVIDIA was nearly even with rival AMD back in 2010 in terms of market share, but the gap has been widening each year.

What does that have to do with anything?

Well, each company's influence in the PC gaming market rises and falls depending on the worth that individual game developers give to it. So if a company like, say, Ubisoft thinks that it should form some special partnership with Nvidia because lots of PC gamers use its cards over AMD tech, the company's executives would probably feel more inclined to form such a special partnership if they were convinced that keeping Nvidia happy would guarantee them the rapt attention of 65 percent of PC gamers.

As an aside, when we tested GPU performance on Watch Dogs a month ago, we didn't see huge discrepancies in benchmarks suggesting the game was one-sided in Nvidia or AMD's favor. That said, after it was discovered that the PC version of the game could have shipped with considerably better graphics than it did, the situation raises the question of whether these so-called "optimizations" are hurting PC gaming in general. You can check out our full Watch Dogs benchmarks here.

Whenever AMD and Nvidia butt heads, vocal critics trot out rhetoric about innovation, corporate bullying, even monopolization. Nvidia's relative success doesn't mean that AMD users should suffer just because they're not enjoying the fruits of some special partnership, however.

Ok, so then what happened recently to kick the hornet's nest?

So speaking of Ubisoft, the company finally released its new open-world game Watch Dogs last month. Once it came out, it was met with an outcry from many irate gamers experiencing problems with the game's PC version. Many of these were online connectivity issues, but breakdowns of relevant versions of the game also suggested that it wasn't always running nearly as well as it could—particularly on PCs that were equipped with AMD hardware rather than Nvidia cards.

Late last month, AMD's Robert Hallock took Nvidia to task for this, blaming his rival's GameWorks program for deliberately undermining AMD products and thus effectively disenfranchising gamers using its graphics cards. Even for a corporate rivalry, the language here was incredibly strong; Hallock called GameWorks "a clear and present threat to gamers." From the original Forbes story:

"Gameworks represents a clear and present threat to gamers by deliberately crippling performance on AMD products (40% of the market) to widen the margin in favor of NVIDIA products," Hallock told me in an email conversation over the weekend. But wait, it stands to reason that AMD would be miffed over a competitor having the edge when it comes to graphical fidelity and features, right? Hallock explains that the core problem is deeper: "Participation in the Gameworks program often precludes the developer from accepting AMD suggestions that would improve performance directly in the game code—the most desirable form of optimization."

Holy crap, I'm a gamer. Am I being threatened?

I don't know, maybe! I'm a gamer, and since I created a Kinja account people have been saying some pretty scary stuff to me. I'm ok though, thanks for asking.

Oh wait, you're asking about AMD! Well, maybe in some long-term sense if GameWorks really does end up stifling innovation and impeding performance the way AMD is warning us it will, the entire industry will stagnate and we'll all be cast back to the stone age of chunky pixels and bloops.

But no, I think you're fine for now. The thing we do need to keep track of, however, is how many games that come out for the PC exhibit radically different performance levels on AMD or Nvidia cards. GameWorks, the program at the center of the current spat, is less than a year old. We don't have enough material to sink our teeth into yet, let alone start making dramatic predictions. And seeing as Watch Dogs took five rocky years to make it to market and doesn't look as good as many expected it to, it's not clear if its problems were an anomaly or symptoms of something truly systemic.

Ok, back up a sec. What is GameWorks exactly?

It's a new-ish program Nvidia came up with to work more closely with PC game developers than it had previously, in an effort to spiff up the performance of big games on PC rigs using the company's tech. Nvidia announced it last October, around the same time that Batman: Arkham Origins came out—one of the first games the company highlighted as benefiting from the new program. As I said in a recent story on a deal Nvidia just inked with Ubisoft, GameWorks "was designed to put the company closer to the entire development process of a given game—giving a company like Ubisoft access to a more robust set of tools so they can 'bring an enhanced gameplay experience to [...] PC players,' as Ubisoft VP Tony Key put it in today's press release."

So basically: Nvidia gives a game developer an enhanced set of tools to help a game look better. In exchange, Nvidia gets to join in on the marketing for a highly anticipated game like Watch Dogs and thus spread its "The Way It's Meant To Be Played" mantra.

Just to be clear, in case the name confuses you: GameWorks is a developer-side initiative. That's part of what's been making PC gamers so upset: many feel that their experience as average customers is being negatively affected by high-level corporate deal-making in which they have no say. So if you're a PC gamer who happened to buy a high-end AMD card a few months before Watch Dogs dropped, you're out of luck.

Ok, so AMD is upset that Nvidia is getting some kind of special treatment from top-tier game developers like Ubisoft that it's not receiving?

Well, AMD is framing it as a clash of two different ideals: the open-source ethic that AMD says it embodies, versus a ruthless competitive spirit that's driving Nvidia to make unfair backroom deals with no regard for the state of the industry as a whole. Here's how AMD's Robert Hallock described Mantle, the company's graphics API designed to make it easier for game developers to render visual effects (particularly on PCs with AMD graphics cards), in an email to me this week:

Operating in the spirit of sharing and transparency positively impacts the industry by creating an environment of mutual learning. If game developers are open and honest with us about their challenges, we can design hardware/software solutions that directly address development issues. Mantle is a great example of this. And, if we are open and honest about the form and functions of those solutions, they may be analyzed and investigated by developers to learn more about the nature of our hardware or to discover a previously unthought-of technique. In contrast, if one party is not playing with all cards on the table, then problems can certainly be solved, but the nature of the solution is indecipherable and nothing is learned—the industry has not moved a step forward.

That's the idealistic part. As for the Nvidia-is-terrible portion of its current messaging, AMD's Richard Huddy doubled down on the company's previous charges in a recent interview with Maximum PC. You can watch the whole interview here:

The relevant discussion comes around the 30-minute mark, when Huddy starts talking about how GameWorks forces ISVs (independent software vendors) into contracts in which they must use code provided by Nvidia that hampers the performance of AMD hardware.

Now, to be fair: Mantle doesn't favor Nvidia cards, either. Huddy's critique here is that Nvidia is prompting game developers to use software that not only runs less gracefully on AMD cards, but is harder to work around because the company black-boxes relevant parts of its code. Meanwhile, Huddy insists—like Hallock did above—that Mantle is, at the very least, an open-source technology by comparison.

The key quote, to me, is when Huddy says: "We are running code in a benchmark which is harming us and this code has been written by Nvidia, and their contract is stopping the ISVs from changing it. This is not equitable."

Oh, snap! So what does Nvidia have to say for itself?

Starting with the original Forbes story, a number of Nvidia spokespeople have repeatedly denied AMD's assertions, calling them "mysterious" at best. When I reached out to an Nvidia spokesperson this week, he responded by saying: "Hi...boy, if AMD spent as much time working on their drivers and actually making investments in gaming than they did talking about us, then maybe their customers would not be stuck with sub-par gaming experiences in today's cutting-edge titles."

These are some serious fighting words. So who's in the right here?

It's hard to say, honestly. Tech-heavy analyses by writers at places like ExtremeTech, Forbes (which ran the story that started the recent kerfuffle), and Digital Foundry have broken down the issue to show that Watch Dogs' performance really is subpar with AMD cards. But that's different from concluding that Nvidia is to blame for AMD's problems. Digital Foundry, for instance, simply suggested: "a particular rethink is required in the way that AMD graphics hardware handles this game."

Wait, has AMD ever entered into exclusive contracts with big game companies?

It has. You know that Mantle program I was just talking about? Well, the company has formed plenty of partnerships with top-tier game developers to optimize their work for AMD's tech. When I asked an AMD representative for a full picture, she told me that "the list is hundreds long" but offered up some of the most recognizable examples:

  • Battlefield Hardline
  • Dragon Age: Inquisition
  • Plants vs. Zombies: Garden Warfare
  • Murdered: Soul Suspect
  • The Banner Saga
  • Dyad
  • Guacamelee!
  • Tales from Space: Mutant Blobs Attack
  • Civilization: Beyond Earth (Mantle support)
  • Sniper Elite III
  • Lichdom
  • Star Citizen
  • Thief
  • Deus Ex: Human Revolution
  • Dirt 3
  • Dirt Showdown
  • Far Cry 3
  • Far Cry 3 Blood Dragon
  • Sleeping Dogs
  • DMC Devil May Cry
  • Hitman Absolution
  • Tomb Raider
  • Battlefield 4
  • Crysis 3
  • BioShock Infinite

That's a lot of games.

It sure is! And that's not even all of them. Plus, when I asked Nvidia for a similar list of all GameWorks-powered games, a representative said that it's already such a far-reaching program that "it would be impossible for us to maintain a current list."

The funny part about this is that while many people seem to enjoy playing up the rivalry between the two companies, once you start to reflect on the two you'll notice plenty of similarities. I mean, just look at Nvidia's recent demo showing off a wolf's hair in The Witcher 3. Then compare that to AMD's "TressFX Hair" tech that it promoted with Tomb Raider to show off the best, most next-gen version of Lara's lovely locks. They're going after similar things here, often in similar ways.

Couldn't you argue that AMD is just doing the same thing as Nvidia then?

It's sort of a "he said, she said" problem, isn't it? AMD insists that its current initiatives are more ethical and beneficial to the game industry as a whole. In the statements Nvidia has fired back at AMD in recent weeks, company representatives keep suggesting that AMD is playing the same game that it is—only not as well.

So are these companies constantly at war with one another?

Yes. The ongoing rivalry is pretty much the final battle in Godzilla, in that it looks gigantic and expensive but moves at a bewilderingly tedious pace.

But I'd actually argue that in spite of all the strong rhetoric being thrown about here, the two companies have a great deal invested in playing nice with one another—as long as both keep putting out popular graphics cards that PC gamers want to buy. Deliberately hampering the performance of one another's products would bifurcate the market in a way that could be disastrous for both companies.

Early on in that Huddy interview I referenced, he talks about how the entire industry has centralized around a small cluster of companies like AMD and Nvidia, leaving smaller and less successful competitors to slouch into obsolescence. The same thing happened in the console market, leaving gamers with three main choices (the Xbox, PlayStation, and Wii U currently) after companies like Sega and Atari bowed out of the hardware game.

This is one of the things that makes the game industry so much fun to watch: the entire ecosystem is still so young that it's not clear whether this kind of centralization is the natural order of things or a historical anomaly.

But I'm talking about market forces as if they're mad gods we must bow down to. The important thing to remember here is that this is a consumer-facing entertainment industry. We, the average gamers, have the power to tell companies what we want. Why else would we be buying and installing these graphics cards in the first place? Because we're just waiting for something better than Nvidia and AMD to come along?

Fuck that. That's not the thinking that brought us Mario, or Princess Peach, or Nazi murder simulators, or the insane mirror image of myself I've made in Tomodachi Life and feel strangely drawn to. If you don't like something that's happening in games (on your PC or otherwise), you have a voice now more than ever. So keep your eyes peeled for more games with problems like Watch Dogs has had, and let us know about them.

Ok, but I play my games on consoles. Does this matter to me?

For the time being, it sounds like everything is copacetic on the console front. An AMD representative affirmed in an emailed statement that Huddy's comments were only about the GameWorks program, which, again, is a PC-centric game development platform.

As I mentioned earlier, AMD tech is in all three of the major current-gen systems. Major developers understandably optimize their work for the PlayStation and Xbox consoles. As for the Wii U? Well, that's a whole other story.