Nvidia's recent 526.47 driver is causing headaches for Modern Warfare 2 players

Jimmy2x

What just happened? The Modern Warfare 2 official launch has already left many PC users dealing with crashes and in-game bugs. A tweet from the PC version's lead studio, Beenox, identifies Nvidia's most recent driver package as the cause of some of these errors. The tweet recommends that users stick with Nvidia driver versions 516.59 or 522.25.

Nvidia's latest Game Ready Driver package claims to provide the best possible gaming experience for all major new releases. The package's release notes also state that the release is Windows Hardware Quality Labs (WHQL) certified, which is typically a sign that a driver package has successfully passed Microsoft's software testing suite and is ready for widespread release. Despite the claim and certification, the latest driver package appears to cause several Modern Warfare 2 PC issues.

Some issues, such as in-game parties causing matches to crash, have already been acknowledged and are under investigation by Infinity Ward's development teams. The tweet from Beenox corroborates several other Reddit posts and reports citing instability and crashing since the game's launch. Nvidia later confirmed the issue in an email to PC Gamer, stating, "yes, there is a bug filed for this game, and we are working on a hotfix."

Nvidia's email appears to contradict its standard driver release language, which claims "the best possible gaming experience for all major new releases." The email said this behavior "...is exactly why our latest Game Ready Driver wasn't promoted or recommended for this title in the first place." However, most would argue that the latest addition to the Modern Warfare lineup, without a doubt, qualifies as a major new release.

Nvidia is currently working on a fix to resolve the identified bug, but has not provided additional information or a timeline for when it will be ready. However, Team Green is usually pretty quick when it comes to these things and should release an updated package soon.

Until a fix is released, users running the newest driver and looking to play Modern Warfare 2 should follow Beenox's guidance and revert to Game Ready Driver version 516.59 or version 522.25. Nvidia also provides detailed rollback and removal instructions on its product info page for users who need them.
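For anyone unsure which driver they are currently running, the check itself is simple. The sketch below is one illustrative way to do it in Python, assuming the nvidia-smi utility that ships with the driver is available on the system PATH; the version strings are the ones cited by Beenox and in this article.

```python
import subprocess

# Versions Beenox recommends sticking with, and the release flagged as problematic.
RECOMMENDED = ("516.59", "522.25")
PROBLEM_RELEASE = "526.47"

def installed_driver_version() -> str:
    """Ask nvidia-smi for the installed Nvidia driver version (one line per GPU)."""
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip().splitlines()[0].strip()

if __name__ == "__main__":
    version = installed_driver_version()
    if version == PROBLEM_RELEASE:
        print(f"Driver {version} is the release flagged for Modern Warfare 2 issues; "
              f"consider rolling back to {RECOMMENDED[0]} or {RECOMMENDED[1]}.")
    elif version in RECOMMENDED:
        print(f"Driver {version} is one of the versions Beenox recommends.")
    else:
        print(f"Driver {version} installed; it is not one of the versions mentioned here.")
```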


 
Oh, but it's usually AMD's drivers that are cr@p?! I've used almost every graphics card/GPU from ATI/AMD since the 3D Rage Pro in 1998 and never had a problem until the 5700 XT black screens (on the blower models), period. Same goes for the cr@pload of builds I've made for myself and my customers. I'm not saying the problems don't exist, but in my experience they're heavily inflated, and come primarily from clueless people switching from Nvidia to AMD without using DDU or properly cleaning out whatever Nvidia/Intel driver they used before the switch... it's always simpler to blame others than our own ignorance. (I use/build whatever offers the best bang for the buck, wherever it comes from... but even saying this, I'll of course be called an AMD fanboy... whatever...)
 
It looks like you've been holding this in for a while. AMD/ATI cards were always notorious for driver issues. My last AMD card was missing textures in Borderlands; I think it was a 4870. Admittedly, that was a long time ago, but it put me off buying AMD cards. I might give them another chance this gen, now that Nvidia is screwing customers left and right.
 

Nvidia has been having a lot of driver issues lately while AMD has started to clean up their act :)
 
Correct me if I'm wrong, but Nvidia's 5xx series drivers are intended for photo and movie editing roles, at 10-bit color depth, being labeled as "Studio Drivers". They were never "game ready drivers" in the first place. Their "game ready drivers" are (or at least were) limited to 8-bit color.

But yes, that was the last time I checked. I concede that it's possible Nvidia "ran out" of 4xx numbers to attach to gaming drivers.
 
I've never heard of drivers being tied to color depth.
I have a 3090 hooked to an OLED and I get 12-bit, 10-bit, and 8-bit options.
 
As I said, my information is not recent. At their release, Nvidia "Studio drivers" were designed to allow "30-bit color".

Now, if you look at monitor specs, you'll see some capable of "16.7 million colors", while others offer "1 billion colors". 16.7 million corresponds to an 8-bit color depth, while 1 billion corresponds to a 10-bit panel.
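If you want to sanity-check where those two figures come from, it's just per-channel RGB arithmetic; a quick sketch, nothing vendor-specific:

```python
# Total colors = (levels per channel) ** 3 for a standard three-channel RGB panel.
for bits_per_channel in (8, 10):
    levels = 2 ** bits_per_channel
    total_colors = levels ** 3
    print(f"{bits_per_channel}-bit per channel: {levels} levels -> {total_colors:,} colors")

# Output:
# 8-bit per channel: 256 levels -> 16,777,216 colors       (the "16.7 million colors" spec)
# 10-bit per channel: 1024 levels -> 1,073,741,824 colors  (the "1 billion colors" spec)
```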

I just endured almost a half hour of video "experts" droning on about the topic. I wish I had more to tell you.

But, in my abstention to pursue this further, feel free to cook up a few search terms on your own, and, "let 'em rip", so to speak.

The 10-bit vs. 8-bit driver split was true as of the GTX 1xxx series, and they couldn't be installed on anything "less" than a GTX 1050 Ti.

It's very true that things have almost certainly changed since then.

BTW, it's believed that the average human can only see about a million colors. Go figure.
 
Male or female? Because women can see more colors than men can.
Well, sexual dimorphism extends beyond that. Women can hear higher sound frequencies than men, but men have better eyesight than women.

I think it's so men can "see game", and women can "hear babies cry better". Hey, before you call me sexist, evolution provided those distinctions, I only humbly observe and interpret them.

(The same conclusion was drawn in a recent paid study. I knew it 30 years ago when I was selling hi-fi equipment, and nobody paid me to figure it out. Rats).

Some people have four types of cones in their eyes (one extra), and those gifted individuals can see far more than a million colors. The condition / situation / mutation / gift is called tetrachromacy. I do not know if there is an uneven distribution of it between sexes.

Wiki might help: https://en.wikipedia.org/wiki/Tetrachromacy

If you have some time to spare, burn, or waste, you could research hearing range and light wave length acceptance differences between man and various animals. We all know about "dog whistles", but there's way more to it than that.
 

Women can distinguish between very similar shades and colors better than men; their brains are just wired to do a better job at it. I've seen it theorized it might have been an adaptation to their "gatherer" role in antiquity, allowing them to spot the ripest fruits and such amidst the undergrowth. It also might have let them serve as the earliest medics, noticing slight discolorations in people indicating possible sickness. Women always seem to notice first when a kid has "bad color".
 
It also had something to do with knowing which berries would kill you. Berry number one good, berry number two... bad!
 
All drivers can have issues. The funny thing is Nvidia even said that they didn't promote the driver for MWII, but if they knew of any possible issues with CoD they should have said so. No idea if they knew anything, just saying.
 
Or when Mr. Tiger is hunting you.
As Mrs. Tiger was heard to snarl at her mate, "I hope you can take leftover human for dinner again. You're too old, slow, lazy, and fat to bring home a decent meal these days." "Yes dear, yes dear, three bags full..." he growled in reply. "Feed it to the cubs, I'm going down to the waterhole."

OK, so it's not anthropomorphism's finest literary endeavor. So what?

 


If I was that lion, I wouldn't be sucking on the bottle. :0
 
It's not a lion, it's a liger. (male lion x female tiger)

I take your point though. There do seem to be two other more plentiful sources of milk in the area.

In hybrid designation, the male parent is always placed first. I wonder how long it will be before the truly "woke, woke", will be bellyaching about that.
 
People treating Nvidia and AMD drivers as equivalent because of an issue with one game are being dishonest.

Nvidia has occasional driver issues with games, while AMD has driver issues mostly unrelated to games, such as black screens, signal loss, glitches with videos and in the OS, etc., which are far worse.

 
Women can distinguish between very similar shades and colors better than men..their brains are just wired to do a better job at it. I've seen it theorized it might have been an adaptation to their "gatherer" role in antiquity, allowing them to spot the ripest fruits and such amidst the undergrowth.
Let's face some other facts (which are "off topic" as well, and I accept full responsibility for that). First of all, "human history" extends back over a million years. The only way "humans" could have originally discovered "which berries could be eaten, or not" is by trial and error. IE, "cave brother eat that berry, cave brother die". Even that required the development of language skills, in order to transmit it to future generations.

FWIW, I've had two years of photographic training, and an AAS degree to prove it. And again FWIW, I can reliably match an image to its source in density, contrast, and color value. Could a woman see colors between the ones I perceive? Dunno. But the last female photographer wannabe I saw in action printed an entire wedding portfolio so ungodly cyan as to be unsalable. But she seemed thrilled with the result. :eek: So, whether or not she could see between the colors that I could is pretty much moot, or "academic", if you prefer.
It also might have let them serve as the earliest medics, noticing slight discolorations in people indicating possible sickness. Women always seem to notice first when a kid has "bad color".
Well, the tribesmen were out hunting, while "tribette" watched the baby. IMO, that places her "first at the scene of the accident", in a manner of speaking. But it certainly doesn't automatically confer "better color perception" upon the anguished mother.

As far as, "the medical profession" goes, that has been male dominated throughout history, leading to such "advanced techniques", as leeching, bleeding, and dancing around the patient mumbling various chants and incantations to a variety of "gods".

Which is not to say ancient medical discoveries didn't contribute to modern medicine. But the plain, unvarnished truth is Aspirin wasn't synthesized until 1897.


Going back to the topic raises the overarching question: if we (men or women) can't see a billion colors, why the hell are we being sold those "virtues" as a matter of course?

As a point of interest, Photoshop itself provides 10-bit color depth, while Photoshop Elements is limited to 8 bits, particularly in filters, which carry the file extension "8bf", which should be a dead giveaway. Back in the day (when you could still steal, or "crack", if that's your preferred term), all the pirates simply had to have Photoshop. The lingering question that remains: was it a matter of its capabilities, or something the greedy had to have, either for the thrill of the crime or just for social status?

My original question still stands: "have game ready drivers become 10-bit or not?" At one point they weren't, and the 4xx vs. 5xx series designation separated them from the Studio versions. Simple question, you would think, but nobody seems able or willing to answer it.

(I emboldened those two paragraphs, so nobody has to wade through the rest of my typical "TL;DR" rantings).
 
Game drivers have been doing 10-bit colour for multiple years now; that's literally why HDR gaming monitors exist in the first place, and have for a long time in the tech space.

Studio drivers are simply supposed to be more stable for PCs you leave on for hours while rendering. They are almost the same as game drivers but carry more support for art and AI applications, especially Nvidia's own.
 
Ligra ...LION TIGER hybrid...
Well, no. It's "liger" (male LI-on x female ti-GER). This cross is subject to gigantism.
When crossed the other way, you get a "tigon" (male TIG-er x female li-ON). Oddly enough, this cross is subject to dwarfism.

We do make a sexual distinction in English for both species, IE 'tigress" & "Lioness", but I've never heard "ligra" used on a female result of the cross. Which is not to say it hasn't been done.
 