TechSpot

Alleged Radeon HD 6800 benchmarks leaked

By Jos
Aug 30, 2010
  1. Following last week's leaked list of codenames for AMD's Radeon HD 6000-series graphics chips, a user on Chinese-speaking forum PCInLife has shared what he claims are benchmark results for one of these still-unannounced products. The posting includes a 3DMark Vantage and GPU-Z screenshot, with the first showing an overall score of 11,963. That's above what the Nvidia GeForce GTX 480 can produce, and not far from the dual-GPU Radeon HD 5970, which suggests AMD has managed a major speed improvement with the same 40nm manufacturing process it currently uses.

  2. BMfan

    BMfan TS Guru Posts: 479   +48

    Hopefully this isn't someone having fun and the figures are true, because I'm waiting to see what AMD does to compete with the GTX 460 before I upgrade.
  3. Technochicken

    Technochicken TechSpot Paladin Posts: 903

    Is there any evidence to say that these pictures were not just photoshopped? Creating these based on other cards' benchmarks and inserting your own numbers would take all of about 10 minutes.
  4. princeton

    princeton TS Addict Posts: 1,716

    There is no way ATI..I mean AMD achieved that kind of boost. They haven't had the time or money to do enough Research and Development. I just took that picture and made the clock speeds 99999 and 99999 for core and memory respectively. I also made the bus width 1 Bit. IMO it's complete photoshop.

    ALSO. If AMD had the time and money to get that kind of boost, then Bulldozer has no reason to still be unreleased. Their HD 5000 series is doing fine; anyone at AMD with their priorities straight would put money into their CPUs, which have been underwhelming recently, to say the least.
  5. EXCellR8

    EXCellR8 The Conservative Posts: 2,278

    could be fake i suppose... but there's no telling which HD68xx card it is anyways.
  6. It might be true; the new card has a lot of the "Northern Islands" features, like the memory controller, with the same die size. I guess once test samples are out we will see for sure.
  7. Rick

    Rick TechSpot Staff Posts: 6,304   +52 Staff Member

    How fortunate for TS to have an expert who not only knows what AMD's R&D budget is but also has an intimate understanding of their latest cost/performance paradigm contentions... and can even offer an educated, speculative analysis regarding the realistic limits of their GPU talent.

    Thank you for putting all of this into perspective, for the rest of us.
  8. EXCellR8

    EXCellR8 The Conservative Posts: 2,278

    while it is a pretty significant boost from the existing product lineup, the specifications aren't completely improbable. AMD has been making a lot of changes lately, so who's to say they haven't been keeping something under wraps? hardly any company reveals its actual budget, and as far as R&D is concerned i don't see them having to go to such great lengths to achieve something that appears so significant. the specifications could indeed be fake, and a red flag went up when i saw "chinese-speaking forum", but princeton does make a very good point about AMD's development path.
  9. dividebyzero

    dividebyzero trainee n00b Posts: 4,849   +679

    W1ZZARD (the author of GPU-Z) has pretty much discounted the GPU-Z screenshot as being 'shopped. If the rumoured die size increase (~380mm²+) is true, then that would probably equate to an approximate 20% increase in shader count; add in the (very much) faster vRAM and the performance gain is not out of character with the known (and suspected) numbers. The real questions here are likely whether AMD will manage 1600MHz (6400 effective), presumably 2GB, GDDR5 for the retail product, and whether the 256-bit memory bus will limit performance at higher resolutions.
    Some of this was previously discussed on this thread http://www.techspot.com/vb/topic152436.html

    Rest assured that a couple of things seem to be in general agreement. Larger die size + 35% performance increase will likely mean that the incoming series will be priced accordingly, and larger die size + increased shader count = higher power draw.
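    The memory figures in dividebyzero's post can be sanity-checked with some quick back-of-the-envelope arithmetic. The sketch below is just the standard peak-bandwidth formula (bus width in bytes times effective transfer rate); the 256-bit bus and 6400MT/s effective rate are the rumoured specs discussed above, and the Radeon HD 5870's 4800MT/s effective GDDR5 is included for comparison.

    ```python
    # Peak memory bandwidth = (bus width in bytes) x (effective transfer rate).
    # Figures below use the rumoured specs from the thread, not confirmed numbers.

    def peak_bandwidth_gbps(bus_width_bits: int, effective_mtps: int) -> float:
        """Peak memory bandwidth in GB/s (1 GB = 10^9 bytes)."""
        bytes_per_transfer = bus_width_bits / 8          # 256-bit bus -> 32 bytes
        return bytes_per_transfer * effective_mtps * 1e6 / 1e9

    # Rumoured card: 256-bit bus, 1600MHz GDDR5 (6400MT/s effective)
    print(peak_bandwidth_gbps(256, 6400))   # 204.8 GB/s

    # Radeon HD 5870 for comparison: 256-bit bus, 4800MT/s effective
    print(peak_bandwidth_gbps(256, 4800))   # 153.6 GB/s
    ```

    So the rumoured vRAM alone would be worth roughly a third more raw bandwidth than the HD 5870, which is why the 256-bit bus question matters at higher resolutions.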
  10. You're clueless.

    That was in response to this garbage:

    There is no way ATI..I mean AMD achieved that kind of boost. They haven't had the time or money to do enough Research and Development. I just took that picture and made the clock speeds 99999 and 99999 for core and memory respectively. I also made the bus width 1 Bit. IMO it's complete photoshop.

    ALSO. If AMD had the time and money to get that kind of boost then bulldozer has no reason to still be unreleased. Their HD 5000 series is doing fine, anyone who had their priorities straight over at AMD would put money into their CPUs which have been underwhelming to say the least recently.
  11. ---agissi---

    ---agissi--- TechSpot Paladin Posts: 2,383   +15

    Sounds accurate to me. It makes sense that with the same fab process, memory is what would get a boost. I wonder if the memory controller is in the GPU die or external, like the PCIe bus chip. Have those since been integrated?
     
  12. princeton

    princeton TS Addict Posts: 1,716

    It's common sense that if their R&D budget were on the same level as Intel's, they wouldn't just be sticking more cores onto the same architecture. Answer this: why would AMD put money towards GPU sales, where they are already doing well, as opposed to putting money towards CPUs, where Intel is dominating them?

    You call me clueless, yet you don't have the intelligence to find the quote button. You missed it twice. It's in the bottom right corner, in case you're wondering.
  13. "You call me clueless, yet you don't have the intelligence to find the quote button. You missed it twice. It's in the bottom right corner, in case you're wondering."

    As a guest, the quote function doesn't work, and I don't feel like signing up. That intelligent enough for ya?

    You don't seem to understand that they have their CPU engineers (the AMD guys) and their GPU engineers (the ones from ATI). Some of the GPU guys have been diverted to work on Fusion, but that has nothing to do with BD and why it's late, and it's not going to kill their GPU department. It would be stupid not to invest properly in both. The 5000 series is fine only until Nvidia gets it together again. Not to mention, it seems the main hold-up with BD is Global Foundries and their 32nm process. The screenshot may be fake, but not for the reasons you say.
  14. hellokitty[hk]

    hellokitty[hk] Hello, nice to meet you! Posts: 4,367   +125

    I'm sure everyone here is aware of the potential falsity; stop posting about it.

    Now the important speculation is how much it costs!
    I'm betting the release price will be around $450 to $500 USD.
  15. Regenweald

    Regenweald TS Rookie Posts: 143

    Thankfully, the internets never forgets, so we'll all check back here in a few weeks. I have a box of sanitary wipes for anyone with egg on their faces :) just in case........
  16. dividebyzero

    dividebyzero trainee n00b Posts: 4,849   +679

    Why does this not surprise me?

    And to the "Guest" posting just below...
    I can see why some people use the Guest account. Who the hell would call out "Intel or Nvidia fanboys" to cease and desist with their opinions... then end with a subjective "AMD offers higher performance at lower cost" opinion of their own?
  17. this is a tech site, and arguments like "because I say so" or "I don't believe that" matter only as opinions, so please, Intel or Nvidia fanboys, keep your opinions to yourselves. Some of us now work on competitive hardware thanks to AMD-ATI, benefiting from higher performance at lower cost.
  18. Actually it does work.
  19. Burty117

    Burty117 TechSpot Chancellor Posts: 2,498   +304

    Just to note, that was me saying that. I just thought I would actually use the quotes under a guest account to prove it's possible :)
  20. Well, AMD does have an excuse not to have released Bulldozer yet. Bulldozer is a complete redesign, while the 6000 series is just an update or "hybrid" series using current features and those of the future 28nm series. It's faster to upgrade features and cores than to completely redesign them.
  21. You obviously know nothing about business, you other Guest. Why would a company do a refresh right away when they have inventory to sell and hype to build? They run on product cycles, you *******. And that's just scratching the surface of why companies do this. Maybe learn a little about standard company practice, especially in streamlined markets, and then ponder why companies do things the way they do.
  22. Vrmithrax

    Vrmithrax TechSpot Paladin Posts: 1,285   +232

    Man, I'm glad we have business geniuses to give us perspective. You're absolutely right, of course. AMD stole the show with the 5xxx series, ran with it and ruled for a time, until nVidia finally got into the game with their 4xx stuff. So now is the perfect time to just sit back and coast as nVidia picks up steam, I mean why keep pushing and bringing new and better products to market to try to keep the momentum? Nah, much better to just let it slide and slip back into the #2 slot behind nVidia...

    Seriously, this is an industry built on innovation, pushing envelopes, and timely product drops. You miss an opportunity, you lose market share. Look at Intel, they released the i-series processors, then did they just stop working on them? Nope, they are releasing revision after revision, improving on things and keeping themselves on top of the hill. Why wouldn't AMD do the same with GPUs, if they have improvements on the 5xxx series that they can drop on the market and build on their existing momentum, rolling right into their Bobcat and Bulldozer product launches?
  23. Other than adding my own "

