AMD, VIA, Nvidia quit benchmark group due to 'Intel bias'

By Jos
Jun 23, 2011
  1. AMD has publicly announced it is withdrawing its support from BAPCo, a non-profit consortium which develops and distributes a benchmarking program called SYSmark, refusing to endorse the latest version of…

  2. Jibberish18

    Jibberish18 TechSpot Maniac Posts: 431   +8

    Hmmm. BAPCo getting kickbacks from Intel, OR AMD CPUs are just not as powerful as Intel's? I say both can be true.

    How many people have GPUs that can add to the processing of their workloads, compared to people who just rely on their CPU? Now that AMD has their Llano system, are they hoping to gain more points with this?

    I don't know. I guess there should at least be an option to use the GPU in the tests, OR do it automatically if it senses the system has a GPU that can be used for those tasks (a sketch of that check follows below).
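    For what it's worth, the "senses the system has a GPU" part is not hard: OpenCL, which both Nvidia and AMD ship drivers for, lets a benchmark enumerate GPU devices before deciding whether to enable GPU-accelerated tests. A minimal sketch in C - clGetPlatformIDs and clGetDeviceIDs are the real OpenCL calls, the rest is just illustrative glue:

        /* Sketch: check whether the system has an OpenCL-capable GPU
           that a benchmark could offload work to. Assumes an OpenCL
           SDK is installed; link with -lOpenCL. */
        #include <stdio.h>
        #include <CL/cl.h>

        int has_usable_gpu(void)
        {
            cl_platform_id platforms[8];
            cl_uint num_platforms = 0;

            if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS)
                return 0;

            for (cl_uint i = 0; i < num_platforms; ++i) {
                cl_uint num_gpus = 0;
                /* Ask each platform (Nvidia, AMD, ...) for GPU devices. */
                if (clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_GPU,
                                   0, NULL, &num_gpus) == CL_SUCCESS
                    && num_gpus > 0)
                    return 1;
            }
            return 0;
        }

        int main(void)
        {
            printf(has_usable_gpu()
                   ? "GPU found: enable GPU-accelerated tests\n"
                   : "No usable GPU: CPU-only tests\n");
            return 0;
        }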
  3. Puiu

    Puiu TechSpot Addict Posts: 1,035   +91

    Any modern GPU can do general-purpose computing (Nvidia from the 8000 series and AMD from the 3000/4000 series - not sure there), so most GPUs can do it. And if you take into consideration that laptops now come with really good IGPs or GPUs, then you have a really large portion of users with the capability.
    Also, programmers are starting to work towards offloading parts of their programs onto GPUs (think browsers, Photoshop, etc.). In a few years this will become the norm - the sketch below shows the kind of work that gets offloaded.
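    As a toy illustration of what "offloading part of the program" looks like in practice, here is a hedged sketch in C using OpenCL: the host copies an array to the GPU, runs a trivially parallel kernel over it, and copies the result back. The kernel name and workload are made up for the example, and error checking is trimmed for brevity:

        /* Sketch: hand a data-parallel loop to the GPU as an OpenCL
           kernel. Assumes an OpenCL SDK; link with -lOpenCL. */
        #include <stdio.h>
        #include <CL/cl.h>

        /* Each work-item brightens one pixel - embarrassingly parallel,
           which is exactly the kind of work that gets offloaded. */
        static const char *src =
            "__kernel void brighten(__global float *px) {"
            "    int i = get_global_id(0);"
            "    px[i] = px[i] * 1.2f;"
            "}";

        int main(void)
        {
            enum { N = 1024 };
            float pixels[N];
            for (int i = 0; i < N; ++i) pixels[i] = (float)i / N;

            cl_platform_id plat; clGetPlatformIDs(1, &plat, NULL);
            cl_device_id dev;    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU,
                                                1, &dev, NULL);
            cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
            cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

            /* Copy data to the GPU, build and run the kernel, copy back. */
            cl_mem buf = clCreateBuffer(ctx,
                                        CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                        sizeof pixels, pixels, NULL);
            cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
            clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
            cl_kernel k = clCreateKernel(prog, "brighten", NULL);
            clSetKernelArg(k, 0, sizeof buf, &buf);

            size_t global = N;
            clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
            clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof pixels, pixels,
                                0, NULL, NULL);

            printf("last pixel after GPU pass: %f\n", pixels[N - 1]);
            return 0;
        }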
  4. mosu

    mosu TechSpot Enthusiast Posts: 297

    In my experience, the real benchmark is price, and at any price point an AMD processor does a better job than its Intel equivalent. A minus for Intel is that they put emphasis on speed rather than quality, and here I'm speaking about transcoding video content, the lack of DirectX 11 in games and so on... and about delivering Atom to the public the way they did.
  5. Jibberish, I use a lot of software in the animation and game development industry that uses the GPU(s) for doing on-the-fly renders and multi-pass renders.

    As mosu says, AMD processors have the cost advantage (particularly the 1090T/1100T matched with an HD 69xx GPU is the best bang for my dollar by a long shot).
  6. nickblame

    nickblame Newcomer, in training Posts: 41

    Well, I kind of felt that benchmarks can be biased, which is why I built one of my own when I bought my AMD 1090T, just to be sure... turns out AMD is still slower, sometimes even in the same price range, but then again AMD does a great job at the lower-end options...

    http://www.testmycpu.com/ if anyone wants to try it.
  7. Balls only get in the way of a bad dancer.
  8. Burty117

    Burty117 TechSpot Chancellor Posts: 2,489   +302

    Pretty cool.

    I clicked run and it was running. How long does it take to finish?
  9. gwailo247

    gwailo247 TechSpot Chancellor Posts: 2,105   +18

    "You guys are mean, I'm taking my CPUs and going home." - AMD
  10. zillion

    zillion Newcomer, in training Posts: 51

    Are you sure we're talking about the same Intel and AMD... because I don't see AMD having the slightest chance in any of the price brackets at the moment. Unless you take into consideration the cost of upgrading over a 3-year period, but then again I have never kept the same CPU for more than 2 years. My needs force me to have a discrete GPU, so I won't ever need the built-in one, and I doubt any gamer will need it for the next 3-4 years until they prove a lot stronger than the current generation.
  11. princeton

    princeton TechSpot Addict Posts: 1,716

    Pfft. AMD's crying about Intel bias even though everyone knows their chips are slower. Cry some more.

    As for Nvidia, well, it's quite hard to compare computing on the GPU to computing on the CPU.
     
  12. Johnny Utah

    Johnny Utah Newcomer, in training Posts: 18

    This
  13. http://www.testmycpu.com/results.php

    I made 2nd rank!

    BTW, my CPU clock is not 3.4GHz... it's 5GHz overclocked.
  14. thatguyandrew92

    thatguyandrew92 Newcomer, in training Posts: 118

    It's basically a fact that Intel is faster. :/
  15. Jibberish18

    Jibberish18 TechSpot Maniac Posts: 431   +8

    It actually is a fact. A benchmark DOES NOT care if your CPU has more "bang for your buck". All it cares about is performance. So you might say, "Well, I got my 6-core AMD for $200 and it threads like a mofo!" That's all fine and dandy, but it might still not be able to accomplish as much as a quad-core Intel Jerusalem chip or whatever.

    Although other posters here do make a point I suppose. I don't entirely agree but it is a point.
  16. PanicX

    PanicX TechSpot Ambassador Posts: 829

    Well of course it is. When benchmarking programs refuse to add metrics that can indicate otherwise, you have all the "fact" you need.

    Regardless of the actual product performance, benchmarking software should be as impartial as possible when determining results. It should also utilize all avenues of optimization for each product to ensure "best case scenario" results as well. If AMD has indeed worked to add their optimizations and been excluded, that's about as damning as it gets for benchmark software.

    In any case, I've never once had SYSmark sway my opinion on purchasing decisions. When building a machine, it makes more sense to look at actual performance in the applications you will use rather than a generalized machine score. If I want max FPS in Crysis, I don't give a damn about SYSmark.
  17. nickblame

    nickblame Newcomer, in training Posts: 41

    The clock reads 3.4GHz at idle; when busy the clock climbs, so it's natural to detect only 3.4GHz. I might need to monitor the clock throughout the benchmark and keep the peak... something like the sketch below.
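    A rough sketch of that "keep the peak" idea in C: sample the frequency between work chunks and remember the highest reading. This version is Linux-specific (it parses the "cpu MHz" field from /proc/cpuinfo); a Windows build would need a different frequency source, and the workload here is just a stand-in:

        #include <stdio.h>

        /* Return the highest "cpu MHz" value currently reported. */
        static double read_cpu_mhz(void)
        {
            FILE *f = fopen("/proc/cpuinfo", "r");
            char line[256];
            double mhz = 0.0, v;

            if (!f) return 0.0;
            while (fgets(line, sizeof line, f))
                if (sscanf(line, "cpu MHz : %lf", &v) == 1 && v > mhz)
                    mhz = v;
            fclose(f);
            return mhz;
        }

        int main(void)
        {
            double peak = 0.0;

            for (int chunk = 0; chunk < 100; ++chunk) {
                /* ...one slice of the real benchmark workload here... */
                volatile double x = 0.0;
                for (long i = 0; i < 5000000L; ++i) x += (double)i;

                double now = read_cpu_mhz();  /* sample between chunks */
                if (now > peak) peak = now;
            }
            printf("peak observed clock: %.0f MHz\n", peak);
            return 0;
        }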
  18. PinothyJ

    PinothyJ TechSpot Enthusiast Posts: 429   +15

    Well, aren't we all experts then :p
  19. dividebyzero

    dividebyzero trainee n00b Posts: 4,808   +642

    Of course the obvious problem is that the majority of people (review readers) need to be able to boil down all the numbers into a single figure (after skipping straight to the conclusions page). Best number/Highest on the chart = best CPU/GPU/HDD/ whatever.

    Offering different benchmarks will give the reader a better-rounded overview of what the component is capable of, but I would suggest that for many people, unless that component tops all the charts, they are going to be disappointed with the ambiguity of "multiple winners" and confused that there is no "outright winner". THERE CAN BE ONLY ONE!!!

    With APUs (or on-die/CPU graphics) blurring the lines between what constitutes a CPU and a GPU, using a CPU-compute-only benchmark becomes less valid (esp. one that may favour a particular arch)... as do traditional GPU benchmarks and game fps numbers when the GPU takes on traditional CPU-compute tasks (hardware acceleration, GPGPU/co-processor etc.).
  20. captaincranky

    captaincranky TechSpot Addict Posts: 10,527   +857

    I see the "Highlander" series has made the third world.

    With that out of the way, "what's a benchmark"? :confused:

    Some days, I swear my heart's not in this computer s***.....;)
  21. dividebyzero

    dividebyzero trainee n00b Posts: 4,808   +642

    Series??? We've seen just the one. That Christopher Lambert must have a fine future as an actor ahead of him.
    Uh oh, sounds like Jobs has another Apple convert.
  22. captaincranky

    captaincranky TechSpot Addict Posts: 10,527   +857

    Soon you'll have Christopher Lloyd in, "Back to the Future", and you'll forget all about him.

    "It just works, by cracky".....:
  23. Quite true.

    Most benchmark suites are full of it.

