The Best Graphics Cards: Full AMD and Nvidia GPU Comparison with Latest Drivers

The 7950 Boost is beating the GTX 680 in some games because it has the hardware to do so once the drivers are good. What's not to believe?

The 7950 Boost's specs are roughly the same as the 7970's, and it even beats the GTX 680 in the major specs:


7950 Boost cores: 1792 (680 = 1536)

7950 Boost transistors: 4.3 billion (680 = 3.54 billion)

7950 Boost memory bus: 384-bit (680 = 256-bit)

7950 Boost VRAM: 3 GB (680 = 2 GB)

7950 Boost bandwidth: 240 GB/s (680 = 192.2 GB/s)

Both have 32 ROPs and both are the same generation, but the core clock and memory clock are higher on the 680 than on the 7950 Boost.
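The bandwidth figures in that list aren't magic, by the way; they follow directly from bus width and effective memory data rate. A minimal sketch of that arithmetic, assuming the commonly quoted reference memory clocks (5.0 GT/s for the 7950 Boost, ~6.0 GT/s for the GTX 680):

```python
# Rough sketch: GDDR5 bandwidth = (bus width in bytes) x (effective transfer rate).
# Memory clocks below are the commonly quoted reference figures, not measured values.

def bandwidth_gbs(bus_width_bits: int, effective_gtps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return (bus_width_bits / 8) * effective_gtps

print("HD 7950 Boost:", bandwidth_gbs(384, 5.0), "GB/s")    # 240.0
print("GTX 680:      ", bandwidth_gbs(256, 6.008), "GB/s")  # ~192.3
```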

It's very much possible that a 7950 Boost beats a GTX 680 in some games with good drivers as of now. Do some research before speaking blindly, and don't fall for Nvidia's marketing show-offs and traps, buddy.
 
Jagen,

"I have been trying to decide between a EVGA GTX 680 Classified 4GB in 3 way sli, 7970 GHz edition in tri fire or 2 GTX 690's. ill be playin on either 3 1920x1080 or 3 2560x1440 screens."

The Classy is a huge waste of money. Its cooling is nothing special at all, and 4GB is a total waste for 1080p. At 3x 2560x1440, no three GPUs will be enough, because SLI/CF doesn't scale perfectly (based on all the reviews I've seen, the scaling for the third GPU is often 40-50% at best). If you look at the GTX 680's performance at 2560x1440/1600, it's already weak, which means three such cards cannot drive three 2560x1440 monitors maxed out in modern games. On top of that, you're wasting money on the Classy, since neither 4GB nor the Classified designation will help a memory-bandwidth-starved GPU at high resolutions. And honestly, I wouldn't get HD 7970 Tri-Fire either. If you're ready to spend this much at the end of the generation, why did you spend ten months waiting? HD 8000/GTX 700 are around the corner. Either wait until spring 2013 for GTX 780 SLI or get GTX 670 Tri-SLI. The GTX 680 isn't worth the money over the GTX 670, and the GTX 680 Classified is a huge rip-off since the voltage is locked, so why pay extra for it?

I wasn't even aware you could still buy the EVGA Classified. Either way, given how poorly Tri-SLI and Tri-CF scale, I bet GTX 780 SLI would be as fast as or faster than those setups. If you waited this entire generation to upgrade, why not wait until March-April?
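To put the scaling point in rough numbers, here's a quick sketch using the ballpark factors above (roughly 85% effective for a second card, 40-50% for a third); these are the figures quoted in the post, not measured results:

```python
# Back-of-the-envelope multi-GPU scaling, using the ballpark factors from the post
# (2nd card ~85% effective, 3rd card ~45%). Not measured data.

def effective_gpus(scaling_per_card):
    """Sum of per-card scaling factors = effective single-GPU multiples."""
    return sum(scaling_per_card)

two_way   = effective_gpus([1.0, 0.85])        # roughly 1.85x a single card
three_way = effective_gpus([1.0, 0.85, 0.45])  # roughly 2.3x a single card

print(f"2-way: ~{two_way:.2f}x, 3-way: ~{three_way:.2f}x")
# The third card adds only ~0.45x for a full card's worth of money, heat and power.
```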

dividebyzero,

"and locally the Palit GTX670 Jetstream (GTX680 perf.) was cheaper than the HD 7950 Boost"

In what country is a GTX670 cheaper than an HD7950? Romania?
 
Yes, I say 5-6 games because in some PhysX games you can offload the work to the CPU, such as Borderlands 2, etc. (As I also posted before, my friend plays BL2 with an i3 2100 + 6870 at high PhysX and gets 45-50 fps on most occasions; a good Intel CPU will serve the cause.)
 
Even in Arkham Asylum (which uses an early version of PhysX) you can offload the work to the CPU with a config change, and the same goes for some other PhysX titles.
 
In what country is a GTX670 cheaper than an HD7950? Romania?
You might need to find someone from Romania to answer that. As for New Zealand, the Palit card was (and continues to be) very good value for money. At the time I bought mine, only the big three tier-one AIBs had 7950 Boost cards available. PowerColor, HIS and Sapphire have since become more widespread, but except for the Vapor-X (when it's available), none of them are that compelling, since the cheaper cards are reference designs or reduced-BoM boards.
[Attached image: Playtech receipt]

PC Pacific is another fairly trustworthy e-tailer here (i.e. no bait advertising or bait-and-switch tactics), and the prices are again similar.
I have been trying to decide between an EVGA GTX 680 Classified 4GB in 3-way SLI, 7970 GHz Edition in tri-fire, or two GTX 690s. I'll be playing on either 3x 1920x1080 or 3x 2560x1440 screens
3x 1920x1080 (5760x1080) won't be a problem. Here's a selection of 5760x1080 benches with single cards (as you can see, the 680 and 7970 are pretty evenly matched), here's a selection with triple 670 cards, and another here.
VRAM isn't cumulative. For example, three 4GB cards still only net 4GB of usable VRAM, so the increased pixel count takes a toll on the framebuffer pretty quickly unless the image quality is reduced. Here's a video of DiRT 3 at 7680x1440 running maxed-out game image quality on quad 670s; it's a safe assumption that dropping the image quality would allow for three cards, since scaling from triple to quad in both SLI and CrossfireX is pretty sad.
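As a rough illustration of why pixel count eats into a fixed framebuffer, here's a sketch that counts only the resolution-dependent render targets, assuming a hypothetical deferred-style setup (five 32-bit targets plus a 4xMSAA colour/depth pair); real games add texture data and plenty of other buffers on top of this, so treat the numbers as a floor:

```python
# Rough sketch: resolution-dependent render-target memory (a floor, not a total).
# Assumed layout for illustration: 5 full-screen 32-bit targets (G-buffer + depth)
# plus a 4xMSAA colour+depth pair at 8 bytes per sample.

def rt_memory_mb(width, height, gbuffer_targets=5, msaa_samples=4):
    pixels = width * height
    gbuffer = pixels * 4 * gbuffer_targets   # 32-bit full-screen targets
    msaa    = pixels * 8 * msaa_samples      # colour + depth per MSAA sample
    return (gbuffer + msaa) / 1024**2

for w, h in [(1920, 1080), (5760, 1080), (7680, 1440)]:
    print(f"{w}x{h}: ~{rt_memory_mb(w, h):.0f} MB in render targets alone")
```

Everything in that list scales linearly with pixel count, which is why 7680x1440 chews through a framebuffer that felt roomy at 1080p before a single texture is loaded.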
7950 Boost cores: 1792 (680 = 1536)
Of course, you can't directly compare one architecture with another. I'll illustrate for you: the GTX 580 has 512 cores/shaders, the HD 5750 has 1008 cores/shaders. By your reckoning the HD 5750 should have 197% of the performance of a GTX 580. Of course, the 580's shaders run at twice the clock of the core, so in that event the 580 and the HD 5750 must have equal performance! Oops.
To quote your good self:
...do some research before speaking blindly...
You might have included die size and power requirement in your comprehensive breakdown...just to prove that the laws of physics validate the "can't have your cake and eat it too" adage.
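For what it's worth, a short sketch of the cross-architecture point above: even theoretical FP32 throughput needs the shader clock and issue rate factored in, and it still doesn't map linearly onto game performance (the HD 5870's much bigger on-paper number didn't make it faster than a GTX 580 in games). Figures below are the reference specs:

```python
# Theoretical FP32 throughput = shaders x 2 ops/clock (FMA) x shader clock.
# Reference specs; shows why raw shader counts aren't comparable across vendors.

def gflops(shaders, shader_clock_mhz, ops_per_clock=2):
    return shaders * ops_per_clock * shader_clock_mhz / 1000

print("GTX 580:", gflops(512, 1544), "GFLOPS")   # ~1581; shaders run on a 2x 'hot clock'
print("HD 5870:", gflops(1600, 850), "GFLOPS")   # ~2720, yet it loses to the 580 in games
```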
 
That site is pure fraud, fooling foolish customers. Oh gawd! I've seen tons of online stores selling the 7950 (though ASUS always charges higher prices for both AMD and Nvidia cards), and I've never seen such a gigantic price for a 7950.

Here are the true prices (Newegg): ASUS GTX 680 = $530
ASUS HD 7950 = $369

And only fools would compare the GTX 580 and the 5750, which have totally different architectures and shader counts...

Graphics Core Next (GCN) is a totally ground-up architecture for AMD's 7-series cards, not the VLIW4 architecture AMD has been using since 2006 with the 2900 XT/Pro. See AMD's own GCN material or Hardware Canucks' "Graphics Core Next: From Evolution to Revolution", etc.

(That's the only reason I said do some research before speaking like the Nvidia trolls.)

 
Fraud: you can see their site claims all deliveries are free, when actually they're folding those costs, and more, into the prices.

And reputable online stores like Newegg and Flipkart never display flash ads or put Playtech posters on the side; any experienced buyer will say that site is not trustworthy.
 
That site is pure fraud, fooling foolish customers. Oh gawd! I've seen tons of online stores selling the 7950 (though ASUS always charges higher prices for both AMD and Nvidia cards), and I've never seen such a gigantic price for a 7950... Fraud: you can see their site claims all deliveries are free, when actually they're folding those costs, and more, into the prices.
Hey, Einstein, prices are in New Zealand dollars. There was actually a pretty big hint at the beginning of my post...
As for New Zealand, the Palit card was (and continues to be) very good value for money.
If that wasn't enough, the receipt clearly shows that the retailer is in New Zealand, and the currency is New Zealand dollars (NZD).
not the VLIW4 architecture AMD has been using since 2006 with the 2900 XT/Pro
HD 2900XT is VLIW5, so is the HD 3000 series, the HD4000 series, the HD5000 series, and most of the HD 6000 series. VLIW4 was used only with the Cayman/Antilles GPU (HD 6950/6970/6990). Your technical knowledge is only rivalled by your ability to interpret a Playtech receipt and parse a simple sentence.

We do tend to pay through the nose for hardware in general, but if you could work out how to read a map, you'd see that the country isn't exactly at the centre of world commerce. And of course you can bleat like a deranged sheep about Newegg all you like, but the fact remains that Newegg doesn't ship internationally, and even if it did, USPS Priority Express shipping is upwards of US$60 to New Zealand.

Feel free to carry on posting, you're dropping the average IQ of the AMD fanboy and troll contingent with every post. Play your cards right and Blue Falcon might pay you to stay away. ;)
 
It is interesting when a review uses a GHz Edition 7970 and a boosted 7950, but all stock GTX GPUs, and then says the AMD cards are reference GPUs. How about we not go by what AMD says they are, but use common sense and logic: they are re-released GPUs, re-released for a reason. The original 7970 gets beaten by a 680 (you don't hear many Radeon guys talking about that), so weeks later a magical 'GHz Edition' is released.
But it's a stock 7970 :makesjerkoffmotionwithhands:
Plenty of previous comments have addressed this, but I get personal satisfaction in pointing out how ridiculous it looks when a company needs to re-release its flagship and SLASH its price to properly contend. The comment above mine about a boosted 7950 beating a stock GTX 680 is nothing short of comical. What about a boosted 7950 vs a boosted 670? It will lose. Forget about a boosted 680.
Then the big argument was bandwidth, bandwidth! Hmm, where is the advantage at 1600p or lower? Wait, there is none.
I won't argue that right now the Radeons are an incredible value, and I am trying my hardest not to sound like an Nvidia fanboy; I have nothing against AMD.

You have seen from this review that the GTX 680, whose guaranteed minimum boost is 1058 MHz but whose actual boost speed varies from 1058 to 1110 MHz, loses on average by 7% at 1080p and 11% at 1600p to an HD 7970 at 1050 MHz. At the same clocks, the HD 7950 is 3-6% slower than the HD 7970 (around 3-4% in the majority of games).

http://hexus.net/tech/reviews/graphics/34761-amd-hd-7950-vs-hd-7970-clocks/?page=3

Taking an average of 4% slower at the same clocks, an HD 7950 (1050 MHz) will beat a GTX 680 by about 3% at 1080p and 7% at 1600p. It's that simple.

From this review you can see that the HD 7970 (925 MHz) is 1% slower than the GTX 680, which in turn is 7% slower than the HD 7970 GHz at 1080p. So an extra 125 MHz on the HD 7970 yields roughly an 8% average performance gain. The GTX 680 boosts to around 1084 MHz if you take the average of its minimum and maximum boost clocks ((1058 + 1110) / 2).

Assuming similar scaling for the GTX 680 (which is rarely the case, as the HD 7970 scales better), a 3% average gain would require roughly a 50 MHz higher clock. So you are looking at a GTX 680 at about 1140 MHz, or a GTX 670 at about 1200 MHz, to match an HD 7950 at 1050 MHz.

HD 7970 (1200 MHz) >= HD 7950 (1260 MHz) >= GTX 680 (1350 MHz).

This has been confirmed by users on overclock.net who own both the HD 7970 and the GTX 680 and have posted benchmarks at the same clocks. The gap is significant in demanding games like BF3, Metro 2033, MoH Warfighter, Sleeping Dogs, Skyrim with mods and The Witcher 2.
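For anyone following along, here's the arithmetic from the post above laid out as a back-of-the-envelope sketch. The inputs are the review deltas quoted (7%/11% leads for the 7970 at 1050 MHz, a ~4% clock-for-clock gap to the 7950, ~8% from 125 MHz), and linear clock scaling is assumed throughout, so treat the outputs as estimates, not benchmarks:

```python
# Back-of-the-envelope version of the argument above. All inputs are the deltas
# quoted from the reviews / the Hexus clock-for-clock test, not new measurements.

avg_680_boost = (1058 + 1110) / 2   # average of min and max boost clocks
print(f"Average GTX 680 boost clock: ~{avg_680_boost:.0f} MHz")   # ~1084 MHz

# HD 7970 @ 1050 MHz leads the GTX 680 by ~7% (1080p) and ~11% (1600p);
# the HD 7950 is assumed ~4% slower than the 7970 clock-for-clock.
for res, lead_7970 in {"1080p": 1.07, "1600p": 1.11}.items():
    advantage_7950 = lead_7970 * (1 - 0.04) - 1
    print(f"{res}: estimated HD 7950 (1050 MHz) vs GTX 680: {advantage_7950:+.0%}")

# Clock-equivalence step: +125 MHz on the 7970 gave ~8%, so a ~3% gap corresponds
# to roughly 3/8 of 125 MHz on top of the 680's average boost clock.
extra_mhz = 0.03 / 0.08 * 125
print(f"GTX 680 would need roughly {avg_680_boost + extra_mhz:.0f} MHz to match")
```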

BTW, your last statement is the biggest joke; your post and its tone are clearly those of a fanboy.
 
Hi div by 0,
Why not read the full Hardware Canucks article, "Graphics Core Next: From Evolution to Revolution", on why GCN is a ground-up architecture, unlike Kepler, which is practically a refined Fermi? AMD needs to keep improving on the driver side to exploit the full potential of GCN, and the gains won't be game-specific; nearly every game's fps will improve (as with 12.11) thanks to more optimal use of GCN. Let's see what the future holds, buddy.
 
Well, using a Gigabyte 650 Ti 2GB OC at 1032 MHz (a 12% factory overclock) against what appears to be a reference HIS 1GB card is extremely deceptive.

That Gigabyte is $175 at the Egg today, while you're hard pressed to find any 2GB 650 Ti for under $160, and none have rebates. Meanwhile, I can get plenty of nicely overclocked 7770s for $110-125 after rebate. If you want to compare the more generic GTX 650 1GB, those are $140-150, which would be a fair challenge.

To claim that the Gigabyte handily beats a 7770 at the $150 price point is utterly deceitful. I'd rather see it up against this XFX Core Edition FX-785A-ZNL4 Radeon HD 7850 1GB 256-bit GDDR5, which today is $155 after a $20 rebate.
 
One card you never see is a 2560MB GTX 570. (Some people forget that 570s are 320-bit.)
I'd love to see its SLI results at 1440p/1600p. Edit: now that I think of it, there were 2GB Cypress XTs as well; those in CrossfireX would still perform nicely.
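A small aside on the 320-bit point: the bus width is exactly why the GTX 570 shows up in the odd 1280MB/2560MB sizes. A quick sketch of that arithmetic, assuming the usual one GDDR5 chip per 32-bit channel:

```python
# Why a 320-bit card ends up with 1280 MB or 2560 MB: the bus is split into
# 32-bit GDDR5 channels, each assumed to have one memory chip attached.

def vram_mb(bus_width_bits, chip_mb):
    channels = bus_width_bits // 32        # one 32-bit channel per chip
    return channels * chip_mb

print("GTX 570, 128 MB chips:", vram_mb(320, 128), "MB")   # 1280
print("GTX 570, 256 MB chips:", vram_mb(320, 256), "MB")   # 2560
print("GTX 580, 128 MB chips:", vram_mb(384, 128), "MB")   # 1536, for comparison
```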
 
Well, using a Gigabyte 650 Ti 2GB OC at 1032 MHz (a 12% factory overclock) against what appears to be a reference HIS 1GB card is extremely deceptive.

That Gigabyte is $175 at the Egg today, while you're hard pressed to find any 2GB 650 Ti for under $160, and none have rebates. Meanwhile, I can get plenty of nicely overclocked 7770s for $110-125 after rebate. If you want to compare the more generic GTX 650 1GB, those are $140-150, which would be a fair challenge.

To claim that the Gigabyte handily beats a 7770 at the $150 price point is utterly deceitful. I'd rather see it up against this XFX Core Edition FX-785A-ZNL4 Radeon HD 7850 1GB 256-bit GDDR5, which today is $155 after a $20 rebate.

I really cannot believe I have to say this again, but it seems I do: NO graphics cards in this article were overclocked! They were all tested according to the AMD and Nvidia specifications.

Also, do you think that at 1680x1050 the 2GB frame buffer gives the GTX 650 Ti any kind of advantage over the Radeon HD 7770 1GB?

All our data for these two cards was recorded at 1680x1050; we skipped the 1920x1200 and 2560x1600 resolutions for these low-end cards. I can tell you that at this resolution, using the quality settings that we did, there isn't a single frame of difference between a card with 1GB and one with 2GB of memory.
 
Excellent and very nicely written ...

Can anyone tell me which of the above cards will work best in a Dell Vostro 200?

thanks
 
Yeah, AMD is great except for the micro-stuttering, frame latency and image cheats. That's why they can't sell cards to professionals.
 
Yeah, AMD is great except for the micro-stuttering, frame latency and image cheats. That's why they can't sell cards to professionals.

Why can't we dislike posts? I am going to have to look into that.
 
One of the stupidest things I have ever heard. I had a GTX 660 until a month ago and it was impossible to play: BF3 ran at 30 fps on any settings, and that's not even the worst of it, the game was crashing every 10 minutes because of Nvidia's drivers. In AC3 I had 30 fps, but that 30 fps felt like 15, stuttering all over the place. And as for image quality, the picture on AMD cards is much sharper; here is a clip to confirm it... In the end I was forced to return the card because it was useless and went back to my 6850...
 
It kind of looks like PhysX might be on for the Nvidia side of that video. I'm not 100% sure, but something looked different, especially around the clouds of smoke. Either way, I don't think the few data points you guys point out on a much larger chart can really show any sort of trend in either direction. Many people have no such issues with ATI or Nvidia, so it leads me to believe that there are other issues introduced by other pieces of hardware or software. I hear the same complaints being tossed around by both sides these days, and how their side doesn't have these issues.
 
It kind of looks like PhysX might be on for the Nvidia side of that video. I'm not 100% sure, but something looked different, especially around the clouds of smoke. Either way, I don't think the few data points you guys point out on a much larger chart can really show any sort of trend in either direction. Many people have no such issues with ATI or Nvidia, so it leads me to believe that there are other issues introduced by other pieces of hardware or software. I hear the same complaints being tossed around by both sides these days, and how their side doesn't have these issues.

Actually, many people on Battlelog are complaining about Nvidia drivers; here are just some of them:
http://goo.gl/bQtLd
http://goo.gl/qSUyp
http://goo.gl/Hl3rl
http://goo.gl/QTbUr
http://goo.gl/REzbC
http://goo.gl/dOfhS
http://goo.gl/tsvLZ
http://goo.gl/Dcfvb
http://goo.gl/aCqjQ
http://goo.gl/6pWvD
http://goo.gl/Jjb9V

That problem has existed for almost a year now.
The fault is in Nvidia's nvwgf2um.dll, the Direct3D user-mode driver library, which makes the game crash. Try googling it, you will be amazed.
I actually believed the reviews that said BF3 works better on Nvidia cards.
What a sucker I was...
As for image quality, it's clear as day that the AMD picture is sharper, but you can't see it in every game. In AC3 there is no difference, I can tell you that, but in BF3 the difference is more than obvious; it's like the edges are dissolving or something...
As for the smoke in that video, it looks blurry and foggy (just like other items in the video); I don't believe that has anything to do with PhysX...
When you play an FPS, image quality doesn't really matter that much; you need frame stability and smoothness (and I didn't get any of that). But when you stop and look around, you see the difference in image quality...
 
Darth Shiv said:

And PhysX does not affect gameplay - just eye candy.

You do realise how dumb that comment is?

High-performance graphics cards are all about eye candy, PhysX or not.

Gameplay is largely dependent on the game itself, provided you have a computer powerful enough to run it smoothly.

Sadly, a lot of current releases may look really pretty, but I'd probably prefer the challenge of Jet Set Willy. Pretty graphics do not equal gameplay; they merely enhance the immersion experience... sometimes.
 