The History of the Modern Graphics Processor, Part 4: The Coming of General Purpose GPUs

By Graham Singer on April 16, 2013, 10:05 PM

Until the advent of DirectX 10, there was little point in adding undue complexity by enlarging the die area to expand vertex shader functionality, in addition to boosting the floating point precision of pixel shaders from 24-bit to 32-bit to match the requirement for vertex operations. With DX10's arrival, vertex and pixel shaders came to share a large degree of common functionality, so moving to a unified shader architecture eliminated a great deal of unnecessary duplication of processing blocks. The first GPU to utilize this architecture was Nvidia's iconic G80.
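
To see why unifying the two pools pays off, consider a toy scheduling model -- a minimal Python sketch, with unit counts and workloads that are purely illustrative assumptions rather than the G80's actual configuration:

# Toy model of fixed vs. unified shader scheduling (illustrative numbers only).

def cycles_fixed(vertex_jobs, pixel_jobs, v_units=8, p_units=24):
    # Split design: each pool can only drain its own job type,
    # so the busier pool gates the frame while the other sits idle.
    v_cycles = -(-vertex_jobs // v_units)   # ceiling division
    p_cycles = -(-pixel_jobs // p_units)
    return max(v_cycles, p_cycles)

def cycles_unified(vertex_jobs, pixel_jobs, units=32):
    # Unified design: any unit takes any job type, so the whole
    # mixed workload spreads across all the units.
    return -(-(vertex_jobs + pixel_jobs) // units)

# A geometry-heavy frame leaves the fixed pixel pool mostly idle:
print(cycles_fixed(3200, 800))    # 400 cycles
print(cycles_unified(3200, 800))  # 125 cycles

With a skewed workload, the split design idles whichever pool is underused; a unified pool keeps every unit busy, which is why eliminating the duplicated, specialized blocks was a win.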

Four years in development and $475 million produced a 681 million-transistor, 484mm² behemoth -- launched first as the 8800 GTX flagship and the 8800 GTS 640MB on November 8, 2006. An overclocked GTX, the 8800 Ultra, represented the G80's pinnacle and was sandwiched between the launches of two lesser products: the 320MB GTS in February and the limited-production GTS 640MB/112 core on November 19, 2007.

Aided by the new Coverage Sampling anti-aliasing (CSAA) algorithm, Nvidia saw its GTX demolish every competitor in outright performance. Despite that success, the company dropped three percentage points of discrete graphics market share in the fourth quarter -- points AMD picked up on the strength of OEM contracts.

User Comments: 21

LNCPapa said:

I've owned so many of the cards mentioned in these 4 parts... and it feels like it's been such an awesome trip to get to where we are now. I've loved almost every high end (for the time) GPU I've purchased, regardless of brand.

TomSEA, TechSpot Chancellor, said:

Terrific series of articles. Really brought back some fun memories. And I believe you're spot on, Graham -- upgrading is becoming less and less of a necessity, as the power of the new cards will handle anything game developers are going to put out for quite a while.

Guest said:

I love rounding errors... 100.2% total market share :-S

Burty117, TechSpot Chancellor, said:

Aawww man, I was enjoying these, but they were all a very good read, I have to say. I did notice the GTX 285 wasn't mentioned, but I guess it didn't need to be as it was only a tad faster than a 280.

Do you plan on doing any other articles like this, such as for chipsets, CPUs, display technology or sound cards? Although I have to admit, GPU firms have (had?) definitely got the more interesting tale to tell xD

fraggleki said:

That 8800 GTX brings back so many good memories! First card I ever bought on my own. Also, the 4870 X2 was my second card, and it was great, but it got too hot and eventually died.

The 6970 is my new card, but I'm thinking about switching to Nvidia again!

SantistaUSA said:

Back in my younger days (I'm 34 now) I was so jealous that my rich friend had a nice setup with the Voodoo video cards; those graphics blew me away back then. Nowadays I have an EVGA GTX 570 (in use with an i7 3770K), and so far it plays games pretty well at 1080p.

Great article!

Guest said:

I've been really enjoying these articles, some of my favourite pieces as of late.

Puiu said:

Really good article, but I remember the AMD 4000 series having a much bigger impact than what I got from reading this. It might be just me.

VitalyT said:

Whichever way things go in the coming months and years

I thought you could do better than that: having written such a comprehensive graphics history, yet without the will to push the envelope and make future predictions.

VitalyT said:

There is so much more to it today that the article didn't go over. I don't follow all of it, but here's what I have heard about:

1. The coming support of HDMI 2.0 and its competition with DisplayPort;

2. MS announcing there will be no DirectX 12;

3. All the buzz about 4K and its support;

4. The battle of the oncoming consoles, the Xbox 720 vs. the PS4.

There is so much one could at least try to predict in the world of graphics...

cliffordcooley, TechSpot Paladin, said:

There is so much one could at least try to predict in the world of graphics...
But the article is about history, not predictions of the future. Besides, DBZ doesn't have access to the Iranian time machine.

JC713 said:

Once again great article.

I love rounding errors... 100.2% total market share :-S

Don't hate.

amstech, TechSpot Enthusiast, said:

My 8800 GTS and GTX 280 were some of my favorite cards. The 512-bit bus on that 280 is still remarkable to this day.

dividebyzero, trainee n00b, said:

There is so much one could at least try to predict in the world of graphics...

As Clifford noted, the series is about history, and not about making pronouncements regarding future history.

I could, I suppose, expound upon...

2. MS announcing there will be no DirectX 12

...except that the company doing the announcing is AMD and not Microsoft, and the guy doing the announcing is a marketing guy. I suppose you could take the concept and run with it, but I seem to remember that the last time AMD marketing guys expounded on APIs they managed to invent the internal confusion engine...

Richard Huddy (then of AMD) talks of doing away with the API (and not just DirectX) on March 16, 2011:

'The funny thing about introducing shaders into games in 2002,' says Huddy, 'was that we expected that to create more visual variety in games, but actually people typically used shaders in the most obvious way. That means that they've used shaders to converge visually, and lots of games have the same kind of look and feel to them these days on the PC. If we drop the API, then people really can render everything they can imagine, not what they can see - and we'll probably see more visual innovation in that kind of situation.'

...and two days later, Neal Robison (also of AMD at the time) announces that the API is alive, well, and the way forward:

'The bottom line for us is that we support open standards, such as OpenCL and DirectCompute, we feel this to be a way to move the whole industry forward.'

Guest said:

Love this series.

LinkedKube, TechSpot Project Baby, said:

I sold my 295 and my 8800 Ultra not even 3 months ago. Kind of miss those things.

Footlong said:


Good memory. I think the market is different now, since the PS4 and the next Xbox (no name given yet) are going to be x86. The APIs are all Shader Model 5, with the PS4 using OpenGL and OpenCL. There are people saying XNA2 is almost identical to DX 11.1. This means, at least to me, that games are entering murky and troubled waters. DX11 games require a lot of processing power. Open-world games like Far Cry 3 and Crysis 3 strain GPUs like the HD 7950 and GTX 660 Ti; those are the middle ground for desktop, but faster than what the consoles can deliver.

Microsoft needs to sell the new Xbox. Creating a new API to sell more of Windows "Blue" or 9 means the market will split again: those who have the new DX12 and those who don't. The base will be Shader Model 5, in a different API sure, but that leaves us in the same situation we're in today. All of this, of course, with one problem: the stakes are much higher, and developing games that fully use a good implementation of DX11 is too expensive.

On a side note: The Witcher 2 showed a lot of critics that DX 9.0c is pretty good at making a graphically demanding game.

cliffordcooley, TechSpot Paladin, said:

Good memory.
Tell me, why did you quote so many comments?

Better yet, why did you quote my comment? Your reply seems to be irrelevant to my comment.

Footlong said:

Good memory.
Tell me, why did you quote so many comments?

Better yet, why did you quote my comment? Your reply seems to be irrelevant to my comment.

It was an accident! My bad :p

yukka, TechSpot Paladin, said:

Loved these articles. Seems like a good opportunity to list my graphics card history:

I had a Matrox Mystique, then bought an Orchid Righteous 3D (yes, it clicked) 4MB 3dfx card, primarily to play Quake. After that I upgraded to a Creative Voodoo2, then tried out a TNT2 which kept crashing (but I was trying to play Hidden and Dangerous, so everything crashed with that game). Took it back and got a Voodoo3 3000. Used that for ages, then got an Nvidia GeForce2 MX. Great card. Sort of upgraded to a Radeon 9200, then got a 9700 Pro when prices dropped. After that I had a Radeon X1950 Pro, then an X850 XT Platinum. Used that for ages before getting a new PC with a Radeon 4850 in it, which I upgraded to a GTX 460 1GB (overclocked model) a couple of years ago now. Still cooking with gas on that one. Don't think I left anything out; I'd totally forgotten a few of those before reading these articles.

darkzelda said:

Great article... Wondering how big the jump in performance is going to be with the new video cards. Games like Crysis 3 are very demanding, so if that's the standard for the next AAA games, there should be a great performance jump in the next cards.
