Far Cry 3 Tested, Benchmarked

Back to quoting the same review that's using an old Nvidia driver not fully optimized for HDAO?
Technically, the four HDAO benches they ran are all unplayable, and of those four, the highest res/IQ setting (25x16 / HDAO + 8xMSAA) actually favours the GTX 680 by a solitary fps.

GameGPU tested with the 310.70 beta and 12.11 beta 11, yet the HD 7970 GHz was 10% faster than the GTX 680 at 1080p. 44 fps at 1080p on the HD 7970 GHz is definitely playable. At the higher 1600p resolution with 8x MSAA settings it's not playable. In the resolution that counts most, and at the highest IQ settings, the HD 7970 GHz was the clear winner.

We've talked about this before, and it is very true here, that GTX 680 SLI was smoother, i.e. no stutter, no choppiness, no lag, no "micro stutter" as we played at this setting. Yet, with 7970 GHz Edition CrossFire we definitely felt it stuttering. It doesn't show up in the framerate, but it was very blatant and obvious as we moved about the game at this setting on 7970 GHz Edition CrossFire.

Read the HardOCP preview properly. At 1080p the HD 7970 GHz CF was faster than and as smooth as GTX 680 SLI. At 1440p the HD 7970 GHz CF had stuttering. It's probably a driver issue which should be resolved soon. The HD 7970 GHz was faster as a single GPU.
 

A 7950 OC whips the *** off a 680 OC... you pay less and you get more with AMD GPUs.

Even without the 12.11 beta in the test, it's still great performance from the 7950 OC.
 
http://blogs.amd.com/play/2012/12/10/far-cry-3-in-depth/comment-page-1/#comment-15228

(See the Far Cry 3 bench on AMD's website with the latest drivers... the 7970 is the winner, and the 7950 (800MHz) without its boost clock beats the GTX 670 (915-980MHz), which already has a boost clock on all models, while also giving good competition to the GTX 680.) The 7970 GHz scores the highest.

Did you really just link us to AMD's blog to prove that AMD graphics cards are faster in Far Cry 3? Also, I will say it again: we used the latest drivers. It is possible the portion of the game we tested slightly favors the Nvidia cards at the moment.
 
GameGPU tested with the 310.70 beta and 12.11 beta 11.
And? You're obviously quoting me out of context, since I was referring to your PCGH assertion. I'll recapitulate:
AMD cards run better with HDAO in Far Cry 3. Look at PCGH...
Back to quoting the same review that's using an old Nvidia driver not fully optimized for HDAO?
See what you did there? If you want to argue what I post then feel free to do so. Arguing a point I didn't make and editing my quote to present a false impression just makes you look like a trolling fool.
yet the HD 7970 GHz was 10% faster than the GTX 680 at 1080p
Only at the less demanding settings. If you read the graph (from GameGPU, since that's the only source you've linked to that has the latest drivers for each vendor) for 19x10 w/ 8xMSAA + HDAO, you'll see that both cards post 30 fps. And the fact remains that as the game IQ increases, the 7970's advantage evaporates regardless of whether the resolution is 19x10 or 25x14... an odd situation for a game developed in partnership with AMD, which kind of puts you at odds with what you've said:
At the higher 1600p resolution with 8x MSAA settings it's not playable. In the resolution that counts most, and at the highest IQ settings, the HD 7970 GHz was the clear winner.
BTW, I game at 2560x1440, so that tends to "count the most" with me.
44 fps at 1080p on the HD 7970 GHz is definitely playable
From Hilbert at G3D:
As such we say 40 FPS for this game should be your minimum, while 60 FPS (frames per second) can be considered optimal
I don't know about you, but I don't toss $400+ on a card looking for the bare minimum in playability. If you do, then fine, in which case you could amend your statement to:
"In the resolution that counted most for me in this particular instance, and at the cherry-picked IQ settings that I deem playable, the HD 7970 GHz was the clear winner by 3-4 frames per second"... and I doubt anyone would argue the point.
Personally, I'd aim for 60 fps, and Steve's review (19x12), as well as ComputerBase's and Hilbert's (2xMSAA), are likely more representative of what people will actually use. That is to say, moderate 2-4xMSAA without HBAO/HDAO.
Read the HardOCP preview properly. At 1080p the HD 7970 GHz CF was faster than and as smooth as GTX 680 SLI. At 1440p the HD 7970 GHz CF had stuttering
With 2560x1440 IPS panels rapidly becoming mainstream in pricing and availability, I'd think it likely that dropping $800+ on graphics cards won't be commensurate with gaming at 1920x1080 long term.
It's probably a driver issue which should be resolved soon.
No, it's a design issue with all dual-graphics setups. Lessening microstutter via the driver is game specific and invariably results in lowered framerates at the expense of equality in frame rendering between two GPUs. While both AMD and Nvidia have improved microstutter performance (and Nvidia card users can ameliorate it with adaptive v-sync if so desired), it is still far from being resolved. From ComputerBase's microstutter analysis for the GTX 690 / 680 SLI / 7970 / 7950 CFX:
[Image: ComputerBase microstutter frame-time graph]

And unfortunately the review deals with the here and now... if we were dealing with what could happen, who's to say that Nvidia's drivers wouldn't also improve? After all, FC3 is an AMD-sponsored game, and it stands to reason that Nvidia would be playing catch-up in optimizing for it.
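A rough way to see why average fps can't capture microstutter (the numbers below are invented purely for illustration, not taken from ComputerBase's data): two GPUs in alternate-frame rendering can deliver frames in a short/long pattern that averages out to the same fps as a perfectly steady single card.

```python
# Illustrative only: steady single-GPU frame times vs. an AFR pair
# alternating short/long frames. All values in milliseconds.
smooth = [16.7] * 60          # single GPU: even ~16.7 ms frames
stutter = [8.0, 25.4] * 30    # dual GPU: alternating 8 ms / 25.4 ms frames

def avg_fps(frame_times_ms):
    # average framerate over the whole run
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

def frame_time_spread(frame_times_ms):
    # worst jump between consecutive frames, in ms (a crude stutter metric)
    return max(abs(a - b) for a, b in zip(frame_times_ms, frame_times_ms[1:]))

# Both runs average ~60 fps, but the AFR run swings ~17 ms frame to frame,
# which is exactly the choppiness an fps counter hides.
print(round(avg_fps(smooth), 1), round(frame_time_spread(smooth), 1))
print(round(avg_fps(stutter), 1), round(frame_time_spread(stutter), 1))
```

This is why sites that publish frame-time plots (as ComputerBase does) can show stutter that never appears in an fps bar chart.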
 
This stupid madness about MSAA is a scam!!! And this bullshit about the high end is false! Business criminals, corporate cheaters' bullshit! All the VGA makers and the game developers ARE MAFIA!!! Liars, cheaters, criminals! The stupid loser user buys, buys, buys... buys the new bullshit products, runs the new bullshit high-end software... pathetic, insane...
Far Cry 3 looks only minimally better than the first and second parts of this series...
Minimal visual difference, but it needs hundreds of times the hardware power... haha, and I'm the stupid loser who believes this... at least think about the criminals at Ubisoft and Nvidia... disgusting...
 
Crysis 2 looks better and is better optimised, pardon, optimized CORRECTLY and without cheating... Nvidia and Ubisoft are CHEATERS!!!
 
Far Cry 3 looks... minimal visual difference, but it needs hundreds of times the hardware power... haha, and I'm the stupid loser who believes this... at least think about the criminals at Ubisoft and Nvidia... disgusting...
Maybe the "stupid loser" should note that it's Far Cry 3: Optimized for AMD Radeon. AMD partnered with Ubisoft to develop the game. Easy mistake to make though; after all, it was only mentioned in the line immediately above your post as well as prominently displayed in the conclusion of the article itself.
 
??? Stoned much ??? I am so confused by post #56; what is the point you're trying to get across?
 
My reliable old 6850 is really struggling with these newer games.

I shall have to retire her to my brother's rig soon.
 
I would love to see that 920 @ 3.5GHz to see how close it gets to the top scorer. It's amazing what this old guy can still do, right up there at just 2.66GHz!
 
From 3.5GHz to 4.5GHz the i7 gained only 5 fps. In the same clock range the FX gained 13 fps. This comment: "It's interesting to note that the FX-8350 at 4.5GHz was only able to match the Core i7-3770K at 3.5GHz." is very biased and unsuitable for a serious tech site. Combined with the driver "choice" towards better Nvidia performance (or worse AMD performance), it makes this test less trustworthy. Very, very biased against AMD. :-(

For the CPU part, the i7 scales poorly with clock speed; the FX scales much better. Make them both run at 5GHz and see the FX outperform the i7 (if the i7-3770K would ever run at 5GHz).
And the comedy comment of the day goes to...

Try reading the graphs again: the i7-3770K at 3.5GHz gets 70fps while the FX-8350 at 4.5GHz gets... 70fps.

I was an AMD guy until I got my i7 920, which is still my CPU... some AMD guys just can't face reality, it seems.
 
From 3.5GHz to 4.5GHz the i7 gained only 5 fps. In the same clock range the FX gained 13 fps. This comment: "It's interesting to note that the FX-8350 at 4.5GHz was only able to match the Core i7-3770K at 3.5GHz." is very biased and unsuitable for a serious tech site. Combined with the driver "choice" towards better Nvidia performance (or worse AMD performance), it makes this test less trustworthy. Very, very biased against AMD. :-(

For the CPU part, the i7 scales poorly with clock speed; the FX scales much better. Make them both run at 5GHz and see the FX outperform the i7 (if the i7-3770K would ever run at 5GHz).
And the comedy comment of the day goes to...

Try reading the graphs again: the i7-3770K at 3.5GHz gets 70fps while the FX-8350 at 4.5GHz gets... 70fps.

Yeah dude, you totally missed his point. Using a 1GHz jump in clock speed for each processor, the FX-8350 scales better: a 13fps gain from a 1GHz jump (3.5GHz to 4.5GHz), whereas the i7 manages just 5fps from the same 1GHz boost (which, imo, for the price you pay for it, is lame).

The FX CPUs are designed to run at very high clock speeds, so comparing them to an Intel chip designed to be the most powerful on the market while running at fairly low clocks seems stupid to me. The prices of both the top i7 chips you have included are well above the 8350's. AMD themselves have said the 8350 is meant to compete with the Core i5 anyway...

As he said, run the FX-8350 at 5GHz+, as it'll easily do it, then compare that with Intel. You'll likely notice how the FX CPU outperforms the i7 & i5 in a gaming environment; there are countless reviews and benchmarks that prove this. Yes, the single-core performance isn't as good, but the multi-core performance is better, especially with a good overclock (last I checked, the FX CPUs also hold the world record for overclocking, despite being "a failure" in the eyes of many).
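For what it's worth, the two claims in this argument aren't incompatible. A back-of-envelope sketch using only the numbers quoted in this thread (70 fps for the i7 at 3.5GHz gaining 5 fps per GHz; 70 fps for the FX at 4.5GHz, so roughly 57 fps at 3.5GHz, gaining 13 fps per GHz) with a naive linear extrapolation, which real games won't follow exactly, shows the FX still only barely catching the i7 even at 5GHz:

```python
# Naive linear extrapolation of fps vs. clock speed, using the
# figures cited in this thread. Purely illustrative.
def projected_fps(base_fps, gain_per_ghz, base_ghz, target_ghz):
    # assumes fps grows linearly with clock speed past the measured point
    return base_fps + gain_per_ghz * (target_ghz - base_ghz)

i7_at_5ghz = projected_fps(70, 5, 3.5, 5.0)   # i7-3770K projection
fx_at_5ghz = projected_fps(57, 13, 3.5, 5.0)  # FX-8350 projection
print(i7_at_5ghz, fx_at_5ghz)
```

In other words, better scaling per GHz doesn't overcome the per-clock deficit unless that scaling advantage holds over a much wider clock range than anyone has measured here.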
 
Yeah dude, you totally missed his point. Using a 1GHz jump in clock speed for each processor, the FX-8350 scales better: a 13fps gain from a 1GHz jump (3.5GHz to 4.5GHz), whereas the i7 manages just 5fps from the same 1GHz boost (which, imo, for the price you pay for it, is lame).

The FX CPUs are designed to run at very high clock speeds, so comparing them to an Intel chip designed to be the most powerful on the market while running at fairly low clocks seems stupid to me. The prices of both the top i7 chips you have included are well above the 8350's. AMD themselves have said the 8350 is meant to compete with the Core i5 anyway...

As he said, run the FX-8350 at 5GHz+, as it'll easily do it, then compare that with Intel. You'll likely notice how the FX CPU outperforms the i7 & i5 in a gaming environment; there are countless reviews and benchmarks that prove this. Yes, the single-core performance isn't as good, but the multi-core performance is better, especially with a good overclock (last I checked, the FX CPUs also hold the world record for overclocking, despite being "a failure" in the eyes of many).

Dude give it up!
 