Far Cry 3 Tested, Benchmarked

By Julio Franco
Dec 5, 2012
  1. raghunathan

    raghunathan Newcomer, in training

    gamegpu tested with 310.70 beta and 12.11 beta 11, yet the HD 7970 GHz was 10% faster than the GTX 680 at 1080p. 44fps at 1080p on the HD 7970 GHz is definitely playable. At the higher 1600p resolution with 8x MSAA it's not playable. In the resolution that counted most, and at the highest IQ settings, the HD 7970 GHz was the clear winner.

    Read the HardOCP preview properly. At 1080p the HD 7970 GHz CF was faster than, and as smooth as, GTX 680 SLI. At 1440p the HD 7970 GHz CF had stuttering. It's probably a driver issue that should be resolved soon. The HD 7970 GHz was faster as a single GPU.


  2. The 7950 OC whips the *** off the 680 OC... you pay less and get more with AMD GPUs.

    Even without the 12.11 beta in the test, the 7950 OC still put up great performance.
  3. Steve

    Steve TechSpot Staff Posts: 1,286   +398 Staff Member

    Did you really just link us to AMD's blog to prove that AMD graphics cards are faster in Far Cry 3? Also, I will say it again: we used the latest drivers. It is possible the portion of the game we tested slightly favors the Nvidia cards at the moment.
  4. dividebyzero

    dividebyzero trainee n00b Posts: 4,788   +639

    And? You're obviously quoting me out of context since I was obviously referring to your PCGH assertion. I'll recapitulate:
    See what you did there? If you want to argue what I post then feel free to do so. Arguing a point I didn't make and editing my quote to present a false impression just makes you look like a trolling fool.
    Only on the less demanding settings. If you can read the graph (from gamegpu, since that's the only source you've linked that has the latest drivers for each vendor) for 19x10 w/8xMSAA+HDAO, you'll see that both cards post 30 fps. And the fact remains that as the game IQ increases, the 7970's advantage evaporates regardless of whether the resolution is 19x10 or 25x14... an odd situation for a game developed in partnership with AMD, which kind of puts you at odds with what you've said:
    BTW, I game at 2560x1440, so that tends to "count the most" with me.
    From Hilbert at G3D:
    Don't know about you but I don't toss $400+ on a card looking for the bare minimum in playability. If you do, then fine- in which case you could amend your statement to:
    "in the resolution that counted most for me in this particular instance, and at the cherry-picked IQ settings that I deem playable, the HD 7970 GHz was the clear winner by 3-4 frames per second"... and I doubt anyone would argue the point.
    Personally, I'd aim for 60 fps, and Steve's review (19x12), along with ComputerBase's and Hilbert's (2xMSAA), is likely more representative of what people will actually use. That is to say, moderate 2-4xMSAA without HBAO/HDAO.
    With 2560x1440 IPS panels rapidly becoming mainstream in pricing and availability, I'd think it likely that dropping $800+ on graphics cards won't be commensurate with gaming at 1920x1080 long term.
    No, it's a design issue with all dual graphics. Lessening microstutter via driver is game specific and invariably results in lowered framerates, the price of equality in frame rendering between two GPUs. While both AMD and Nvidia have improved microstutter performance (and Nvidia card users can ameliorate this with adaptive v-sync if so desired), it is still far from being resolved. From ComputerBase's microstutter analysis for the GTX 690/680 SLI/7970/7950 CFX:
    [image: ComputerBase frame-time chart]
    And unfortunately the review deals with the here and now... if we were dealing with what could happen, who's to say that Nvidia's drivers wouldn't also improve? After all, FC3 is an AMD-sponsored game, and it stands to reason that Nvidia would be playing catch-up in optimizing for it.
  5. afafafa

    afafafa Newcomer, in training

    This false, stupid madness about MSAA is a SCAM!!! And this bullshit about the high-end is false! Business criminals, corporate cheaters, bullshit! All the VGA makers and the game developers ARE MAFIA!!! Liars, cheaters, criminals! The stupid loser user buys, buys, buys... buys the new bullshit products, runs the new bullshit high-end software... pathetic, insane...
    Far Cry 3 looks only minimally better than the first and second parts of this series...
    Minimal visual difference, but it needs hundreds of times the hardware power... haha, and I'm the stupid loser who believes this... at least that's what the criminals at Ubisoft and Nvidia think... disgusting...
  6. afafafa

    afafafa Newcomer, in training

    Crysis 2 looks better and is better optimalised, pardon, optimized CORRECTLY, not cheating... Nvidia and Ubisoft are CHEATERS!!!
  7. dividebyzero

    dividebyzero trainee n00b Posts: 4,788   +639

    Maybe the "stupid loser" should note that it's Far Cry 3: Optimized for AMD Radeon. AMD partnered with Ubisoft to develop the game. Easy mistake to make, though; after all, it was only mentioned in the line immediately above your post, as well as prominently displayed in the conclusion of the article itself.
  8. LNCPapa

    LNCPapa TS Special Forces Posts: 4,274   +259

    ??? Stoned much ??? I am so confused by post #56. What is the point you're trying to get across?
  9. afafafa

    afafafa Newcomer, in training

    I'll translate: "optimised" = business CHEAT...
  10. PC nerd

    PC nerd TechSpot Booster Posts: 320   +36

    My reliable old 6850 is really struggling with these newer games.

    I shall have to retire her to my brother's rig soon.
  11. Steve

    Steve TechSpot Staff Posts: 1,286   +398 Staff Member

    Is that all you wish to translate for us? :)
  12. shaolin95

    shaolin95 Newcomer, in training

    I would love to see that 920 @ 3.5GHz to find out how close it gets to the top scorer. It's amazing what this old guy can still do, sitting right up there at just 2.66GHz!
  13. shaolin95

    shaolin95 Newcomer, in training

    I was an AMD guy until I got my i7 920, which is still my CPU... some AMD guys just can't face reality, it seems.
  14. PC nerd

    PC nerd TechSpot Booster Posts: 320   +36

    The last good CPUs AMD made were the Phenom IIs.

    My 955BE @ 4GHz is still going strong.
  15. Callum_J

    Callum_J Newcomer, in training

    Yeah dude, you totally missed his point. Compare the 1GHz jump in clock speed for each processor: the FX-8350 scales better with clock speed, gaining 13fps from a 1GHz jump (3.5 to 4.5GHz), whereas the i7 manages just 5fps from a 1GHz boost (which, IMO, for the price you pay for it, is lame).

    The FX CPUs are designed to run at very high clock speeds, so comparing them to an Intel chip designed to be the most powerful on the market while running at fairly low clocks seems stupid to me. Both of the top i7s you have included are priced well above the 8350, and AMD themselves have said the 8350 is meant to compete with the Core i5 anyway...

    As he said, run the FX-8350 at 5GHz+ (it'll easily do it), then compare that with Intel. You'll likely notice how the FX CPU outperforms the i7 and i5 in a gaming environment; there are countless reviews and benchmarks that prove this. Yes, the single-core performance isn't as good, but the multi-core performance is better, especially with a good overclock (last I checked, the FX CPUs also hold the world record for overclocking, despite being "a failure" in the eyes of many).
  16. Lionvibez

    Lionvibez TechSpot Enthusiast Posts: 530   +76

    Dude give it up!

