HD 6870? Is this it?

dividebyzero

Posts: 4,840   +1,271
Yes, truth is stranger than fiction.
Shock horror, a Chinese website is publishing a Vantage score, complete with a partially filled GPU-Z screenshot purportedly showing the upcoming Cayman XT (HD 6870). The Vantage score would indicate an approximate 20% improvement over a stock GTX 480 and ~35% over the HD 5870.

HD6870.jpg


Memory at 6400MHz effective...wow.

[source]
 
Looks like the Chinese are the ATi equivalent of Russia's Microsoft leak crew! :D

Looks good, even if I don't understand half of it! haha.
 
The Chinese are everyone's equivalent of Russia's leak crew.
Traditionally, benchtests of new processors (GPU and CPU) originate from China as a precursor/tease a few months out from their official release, in order to generate interest (and rampant excitement amongst some) and start the PR campaign. A lot of these leaks are "sanctioned" by the manufacturer as a means of getting the product in the public eye while the product in question is still being evaluated/validated/debugged.

The Vantage score is a standard benchmark for comparing graphics/CPU/system performance, and a known quantity. The principal points of interest would be the high memory clock, which would denote that AMD is either using the new 7Gbps GDDR5 chips or is overclocking the vRAM and has a good handle on their GDDR5 memory controller (all good), and that the new cards are still using a 256-bit memory bus (not so good).
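For a sense of what those two numbers mean together, peak theoretical memory bandwidth is just bus width times effective data rate. A minimal sketch, assuming the leaked figures (256-bit bus, 6400MHz effective) and the HD 5870's stock 4800MHz effective for comparison:

```python
def bandwidth_gbs(bus_width_bits: int, effective_mhz: int) -> float:
    """Peak theoretical bandwidth in GB/s:
    (bus width in bytes) * (effective data rate in GT/s)."""
    return (bus_width_bits / 8) * (effective_mhz / 1000)

# Leaked Cayman XT figures: 256-bit bus at 6400MHz effective
print(bandwidth_gbs(256, 6400))  # 204.8 GB/s
# HD 5870 stock for comparison: 256-bit bus at 4800MHz effective
print(bandwidth_gbs(256, 4800))  # 153.6 GB/s
```

So even on the same 256-bit bus, the higher memory clock alone would buy roughly a third more bandwidth, which is why the bus width is a disappointment rather than a dealbreaker.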
The other thing of note is that the Vantage score is approximately 35% better than the HD 5870's at a similar core/shader frequency, which implies a substantial increase in shaders in the new design.
 
Haven't heard anything. I was under the impression that the next go-round for nvidia was based on GF104 (full 384-shader GTX 475 as soon as GTX 470 stock is sold), and that Fermi II was going to be based on TSMC's 28nm (possibly GF's as well!).
I don't think there is anything radical coming from either AMD or nvidia on 40nm. The HD 6xxx parts will be offering better DX11 features (read: tessellation), but with, say (hypothetically), an increase in shader count from 1600 to 1920 plus increased tessellation hardware, we could quite easily see a reversal of today's situation, where AMD moves to a larger die and higher power consumption while nvidia stands pat in the 150-210W TDP market (dual GTX 460 excepted).
AMD and nvidia seem to have slipped into a non-competitive stratification, with cards dove-tailed on performance/price and new model entry aligned so as not to cause an all-out price war that neither can afford.
This way I think a larger percentage of consumers are likely to upgrade more frequently, if not for a large performance gain, then from the sheer number of models available at $10-20 increments.
Might be worth noting that not long ago, nvidia and AMD were being scrutinised regarding price fixing. While nothing seems to have been "officially" proven, the present (and likely short/medium term future) situation looks remarkably comfortable for both camps.
 