Nvidia ends driver support for Fermi-based GPUs, 32-bit operating systems

midian182

Staff member

Nvidia has announced that it has officially ended mainstream graphics driver support for its Fermi-based GeForce GPUs. The change is effective immediately, with all Fermi products moving to legacy support status. Additionally, the company said it will end GeForce driver support for 32-bit operating systems before the end of the month.

Fermi GPUs were part of Nvidia’s GeForce 400 and 500 series lines, including the infamous GTX 480 and GTX 580; you can see the full list of cards here. The architecture has been around for about eight years—the same amount of time that Fermi's predecessor, Tesla, existed before being moved to legacy status in March 2014. Nvidia notes that critical security updates for Fermi will be available through to January 2019.

“Effective April 2018, Game Ready Driver upgrades, including performance enhancements, new features, and bug fixes, will be available only on Kepler, Maxwell, and Pascal series GPUs. Critical security updates will be available on Fermi series GPUs through January 2019,” the company writes.

Additionally, Nvidia is making good on a promise it made last year to stop releasing GeForce graphics card drivers for 32-bit operating systems. That support is also set to end this month.

“Game Ready Driver upgrades, including performance enhancements, new features, and bug fixes, will be available only on 64-bit operating systems. Critical security updates will be provided for 32-bit operating systems through January 2019.”

Most people, especially gamers, are likely to be using 64-bit operating systems these days, but those who aren't might see this as another reason to consider upgrading.


 
The end of a long support era. I can't imagine too many people use Fermi, and those that do will be fine on the current driver for older games. Same goes for 32-bit.

What is amazing is that official driver support for the 5000/6000 cards ended in 2015, three years earlier, with only a single beta released for Windows 10 compatibility.

I had a pair of 550 Tis in SLI, good times.
 
I still have that old toaster (480) in my closet. It was the hottest, loudest, and most power-hungry card I've owned. It made summer days in my brick house miserable...but at least it performed well in whatever game was out at the time.
 
I had a GTX 570. That was a good card. I got big overclocks out of mine that put it level with, or sometimes ahead of, a GTX 580 in most games. It was generally a tiny bit faster than an HD 6970, which cost more, in the biggest games of the day.
I had that card and loved it, though mine used to run a little on the hot side temperature-wise. ;)
 
I had a GTX 570. That was a good card. I got big overclocks out of mine that put it level with, or sometimes ahead of, a GTX 580 in most games. It was generally a tiny bit faster than an HD 6970, which cost more, in the biggest games of the day.

I had that card as well, and enjoyed it for several years. If it had more VRAM I would probably have kept it longer, but it was time to upgrade, and in Oct of 2016 I finally decided to get a GTX 1060 6GB :D
 
My GTX 480s in SLI are still running fine, connected to the 65" TV, and can still churn out some decent FPS in titles that aren't too old. They play World of Tanks just fine at 1080p.
 
I swapped my GTX 480 SLI setup, which ran on a 1,000-watt dual-rail Corsair PSU, for an AMD R9 290X CrossFire setup, which required an 850-watt single-rail Corsair PSU, which means that compared to the R9 290X they are not that power hungry. Unlike my setup now, the SLI was flawless; almost every game gave me a 2x boost, while CrossFire doesn't work in most titles and uses more power. The R9 290X is still a good card, but not so much when the two have to work together.
 
It's actually kind of sad because it's the end of an era. I remember what a beast the GTX 470/480 series was at Metro 2033. Time's cruel crawl will come for Kepler as well, sooner rather than later I'm afraid.
 
It's actually kind of sad because it's the end of an era. I remember what a beast the GTX 470/480 series was at Metro 2033. Time's cruel crawl will come for Kepler as well, sooner rather than later I'm afraid.
Could get a good 60 FPS in SLI with everything maxed out.
 
I swapped my GTX 480 SLI setup, which ran on a 1,000-watt dual-rail Corsair PSU, for an AMD R9 290X CrossFire setup, which required an 850-watt single-rail Corsair PSU, which means that compared to the R9 290X they are not that power hungry. Unlike my setup now, the SLI was flawless; almost every game gave me a 2x boost, while CrossFire doesn't work in most titles and uses more power. The R9 290X is still a good card, but not so much when the two have to work together.
Crossfire has never been as good as SLI.

That being said, don't think of it as being as good now. SLI was in its prime during the Fermi days; Nvidia was pouring resources into it, and dual GPU was all the rage. Today, SLI support has waned, many devs don't support it properly, and many modern game engines don't scale nearly as well as they used to.

I loved my 550 Ti SLI, but I'll probably never run dual GPU again, as fun as it was.
 
IDK, to me it just seems a little premature to completely stop supporting them with updates that could improve performance. Fermi was a very successful architecture, and I'm sure there are some non-enthusiast types out there who still have them in their old gaming PCs and try to get the most out of them.

I understand it's been eight years, but I'd have been happier to see Nvidia just round it to an even ten years of support. I had both a GTX 480 and 580s in SLI. I know how powerful they are even compared to today's tech. Now I realize Kepler isn't far behind.

While I have a 1080Ti in my main rig, I'm still using a GTX 670 in a secondary rig and was hoping to see it supported for years to come. I'll be lucky to get two more years of updates.
 