Intel: GeForce GTX 280 is only 14x faster than Core i7-960


Matthew DeCarlo

Staff

In a peculiar attempt to dismiss claims made by Nvidia, Intel yesterday argued that its CPU technology is, at worst, only 14 times slower than the graphics company's GPUs. The unusual admission comes as the Santa Clara-based chipmaker looks to downplay Nvidia's claims that its GPUs outperform conventional Intel processors by a factor of 100.

In a paper titled "Debunking the 100x GPU vs CPU Myth," Intel suggests that application kernels run up to 14 times faster in certain circumstances on an Nvidia GeForce GTX 280 than on an Intel Core i7-960. On average, Intel says, the figure is more along the lines of 2.5 times. Naturally, Nvidia quickly published a rebuttal of its own.

In a blog post, spokesman Andy Keane pointed out that Intel used Nvidia's last-generation GPU rather than Fermi. Keane also notes that Intel presumably ran unoptimized code on the GTX 280, and that it's not even clear how the workloads were compared between the GPU and CPU.

The Nvidia staffer went on to acknowledge that not all applications run 100 times quicker on GPUs, but he cited many developers who have achieved that kind of performance, and more. At least seven organizations cite speed-ups of over 100x, and one claims 300x.


 
Obviously the only resolution to this is to have representatives from nVidia and Intel duke it out in a cage match with a fight to the death! ;)
 
TomSEA said:
Obviously the only resolution to this is to have representatives from nVidia and Intel duke it out in a cage match with a fight to the death! ;)
I think I would pay to see that :D
 
TomSEA said:
Obviously the only resolution to this is to have representatives from nVidia and Intel duke it out in a cage match with a fight to the death! ;)

LOL I can only imagine what a sad display that would be. They'd have to air the fight on Comedy Central.
 
Somehow, I'm picturing the fight to the death scene on Vulcan between Spock and Kirk, only with 2 supernerds who can barely lift the weapons, wheezing and stumbling around trying to hit each other... Even have the dramatic music from the scene playing in my brain as they flail around...

Thanks, TomSEA... I may need therapy or hypnosis or something to get this scene out of my head!
 
Well, in certain circumstances, I drive faster than Lewis Hamilton. (Namely, while he sleeps.) But I'm not going to run out and publish that, or did I?
 
I'm sure Lewis Hamilton drives faster than you in his dreams. Maybe you mean in your dreams? :p
 
Somehow, I'm picturing the fight to the death scene on Vulcan between Spock and Kirk, only with 2 supernerds who can barely lift the weapons, wheezing and stumbling around trying to hit each other...
...until supernerd1.0 arbitrarily decides that the testing environment is flawed and presents his argument ALL IN CAPS that the fight should take place in a virtual Battlestar Galactica episode. Tantrums and hilarity ensue over whose futuristic space opera of super-advanced technology that still requires manually aimed small arms is the dominant form.
Pilot to air on Fox in the new season.
 
Vrmithrax said:
Somehow, I'm picturing the fight to the death scene on Vulcan between Spock and Kirk, only with 2 supernerds who can barely lift the weapons, wheezing and stumbling around trying to hit each other... Even have the dramatic music from the scene playing in my brain as they flail around...

Thanks, TomSEA... I may need therapy or hypnosis or something to get this scene out of my head!

Actually, I picture the Kirk vs. Gorn fight! *Way more dramatic! High-speed, action-intensive scenes! - end sarcasm*
 
Oh, oh, I just bought a Core i7 machine. Does this mean I made the wrong choice (again)? *sigh* I can't win at this stuff.
 
Somehow, I'm picturing the fight to the death scene on Vulcan between Spock and Kirk, only with 2 supernerds who can barely lift the weapons, wheezing and stumbling around trying to hit each other... Even have the dramatic music from the scene playing in my brain as they flail around...
Is this the ST:TOS battle over Spock's Vulcan trollop fiancée...?

"Jim... if you don't get him to Vulcan, he'll die!" "Why, Bones?"

After dealing with that b****, you'd think the next line would be, "Oh death, where is thy sting?"

I don't get it, how can you possibly compare a graphics processor vs. a... data processor?
Do Nvidia GPUs allow PS/2 mouse and keyboard, or are they USB only...?
Because they get paid trash loads of money to do it and you don't; that's why you can't see it.
OK then, it is logical to assume by this you mean that if any one of us were paid identical money, that would improve our vision...? :rolleyes:
 
OK then, it is logical to assume by this you mean that if any one of us were paid identical money, that would improve our vision...? :rolleyes:

I guess if we were paid by Nvidia, to be more specific. And yes, if we were paid identical money by the respective company, we might feel compelled to see it the same way.
 
I don't get it, captain... Sorry, my English isn't good enough to get practical jokes or irony... if that's what you intended...

What's the point of the discussion if one processor cannot work without the other? And I don't mean these two chips specifically; I mean the logic processor (or whatever it's called) is useless without a graphics processor and vice versa, so who cares which one is more powerful?

Pointless publication...
 
The article isn't dealing with the CPU versus the graphics processor. It means the CPU versus the GPGPU: in essence, using the parallel nature of graphics processing (cores/shaders) to do the same type of computation that is undertaken traditionally by the CPU alone.
A primer here.
While CPUs and GPGPUs exist in tandem, both are optimized for their own particular branch of coding. Parallel computation tasks (like Folding@Home, SETI@Home, Milkyway@Home etc.) lend themselves well to the many-core GPGPU but would bog down a 2-, 3-, 4- or 6-core CPU (see the sketch at the end of this post).

As for who cares which one is more powerful... it's more a case of utilising both to increase productivity.

BTW: 2.98 petaflops means that the system is theoretically capable of undertaking 2,980,000,000,000,000 FLoating-point Operations (calculations) Per Second.
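
For anyone wondering what a "parallel computation task" actually looks like, here's a minimal CUDA sketch (a hypothetical illustration of the general GPGPU pattern, not code from Intel's or Nvidia's benchmarks): the same element-wise addition a CPU would do in one long serial loop, split across thousands of GPU threads, one element per thread.

#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

// Each GPU thread computes exactly one output element; on a GTX 280-class
// card, tens of thousands of these threads are in flight at once.
__global__ void vecAdd(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main(void)
{
    const int n = 1 << 20;                  // 1M elements
    const size_t bytes = n * sizeof(float);

    // Host (CPU) buffers.
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device (GPU) buffers, plus host-to-device copies.
    float *da, *db, *dc;
    cudaMalloc((void **)&da, bytes);
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);  // implicitly syncs
    printf("hc[0] = %.1f\n", hc[0]);                    // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}

The CPU version of this is a plain for loop touching one element (or a handful, with SSE) per step, which is exactly why embarrassingly parallel workloads like the @Home projects map so well onto GPUs, while branchy, serial code still favours the CPU.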
 
I smell a new CPU coming to the market soon :D, the only problem is it uses 375 watts.
 
Sounds frightening!

With Intel's new chips already slated for 65W and 95W, nVidia lacking an x86 license, and IBM's new POWER7 already released, there aren't a whole lot of players left in the market.
 
Vicbowling, you made a horrible decision. Give me your whole i7 system and I will trade you a GTX 280.

Now you will be able to run Windows 7 14x faster.
 
Vicbowling, you made a horrible decision. Give me your whole i7 system and I will trade you a GTX 280.

Now you will be able to run Windows 7 14x faster.
I would be careful with this. If Windows runs 14x faster, it may also crash in 1/14th the time.
 