Rumor: ATI 'Southern Islands' GPUs due in November

Jos


Last month, during its second quarter earnings call with investors, AMD CEO Dirk Meyer confirmed the company is on track to launch its next-generation graphics chips before the end of the year. Codenamed Southern Islands, the new parts are widely expected to be branded as the ATI Radeon HD 6000 series, and if DigiTimes' sources in the graphics market are to be believed, they should be announced in October with a hard launch following in November 2010.

Not many details have been made available thus far, but from what is being rumored, things don't look as promising as one might hope, mostly because the Radeon HD 6000 series will be based on the same 40nm manufacturing process as its predecessors. It turns out AMD originally planned to build its next-generation GPUs on a 32nm process, but after TSMC skipped 32nm R&D and advanced directly to 28nm, the company was forced to adjust its plans accordingly.

We still expect the new cards to perform faster than the current Radeon HD 5000 series; they just won't be anything to rave about, it seems. Parts based on the newer 28nm manufacturing process are coming sometime in 2011, but AMD will make sure to have something fresh on store shelves for the Q4 2010 sales period in order to defend its growing market share against Nvidia's aggressively priced GTX 460 and whatever else it has in the works.


 
I expect it will be a bigger increase than 4870-to-4890 because they're changing to 6xxx. But no die shrink? They might as well just refresh the 5000 series.
 
We've seen this before only nVidia is usually the larger culprit. I don't know how many rebranded versions of the GTX 260 are out there, but it's a bunch.
 
TomSEA said:
We've seen this before only nVidia is usually the larger culprit. I don't know how many rebranded versions of the GTX 260 are out there, but it's a bunch.

I think you mean the 8800 GTX or G92 core. Well, anyway, this is good news for me because I plan on buying a new video card around the beginning of this winter. And I might just make the switch to ATI.
 
dustin_ds3000 said:
TomSEA said:
We've seen this before only nVidia is usually the larger culprit. I don't know how many rebranded versions of the GTX 260 are out there, but it's a bunch.

I think you mean the 8800 GTX or G92 core. Well, anyway, this is good news for me because I plan on buying a new video card around the beginning of this winter. And I might just make the switch to ATI.

Yeah, I've never seen a rebranded GTX 260 in my entire life. Now, stuff like the 9800 GTX+ is exactly the same as the GTS 250, and I'm sure there are other examples too. But I suppose you could say the GTX 260 Core 216 is a rebranded GTX 275/280 with some cores removed.
 
This sucks. AMD/ATI was ready to launch a new card, but because of TSMC it won't be able to get it out on a new process. So yeah, it will be a re-brand of sorts, but not on purpose like nVidia when it was late with its Fermi design.
 
Maybe it was the 8800 GTX I was thinking of. At any rate, when ATI came out with their current line of DX11 cards, leaving nVidia in the dust and with egg on its face, it seemed like every other day we were getting news of a "new" nVidia card, all of which were rebrands.
 
My comment on this old post sums up the nVidia renaming game quite well: https://www.techspot.com/news/37129-nvidia-quietly-intros-first-geforce-300series-graphics-card.html

Anyway, I'll hold off for this. I was expecting to either buy a GeForce GTX 260 x2 (Fermi GF104) or an ATI 5870 refresh when Crysis 2 was supposed to come out this year, but since that was postponed I'm quite happy with my three-year-old 8800GTS 512MB; it runs well in StarCraft 2...
Might be due to my vcore mod and watercooling, of course: the 650MHz core running at 850MHz ;)
The memory could use a voltmod though; never did get around to that.
 
This isn't necessarily a bad thing. ATI is sitting pretty with its lineup generating revenue, and nVidia is only beginning to churn out credible competition in the midrange league. As comical as nVidia's 'rebranded' cards were, there is nothing wrong with them, as the G92 architecture still remains a fantastic buy to this day. Seeing as most of the new cards today are so advanced, and DirectX 9 still remains the bread and butter for many games, an update that makes for cooler operation and small performance gains is entirely appropriate, I think.
 
Hmmm, rebranding... I remember just not long ago I was having a fit about nVidia doing this. I sincerely hope the changes are at least sizable.

Still, it's a shame that my 5850 will already feel like last generation's technology :(
 
Now all we need is more DX11-capable games; preferably ones that don't suck *cough* Metro 2033 *cough* AvP :)
 
EXCellR8 said:
Now all we need is more DX11-capable games; preferably ones that don't suck *cough* Metro 2033 *cough* AvP :)

Metro 2033 was the worst-optimised console port I've ever seen. Worse framerates than Crysis, and it sure as hell doesn't look as good as Crysis.
 
Agreed, I ran that filth in DX10 mode (of course) on my GTX 295, and it massively lagged when walking. No detailed trees or jungle in sight.
 
ATI has left nVidia quite far behind!!
The only nVidia offering I consider good is the GTX 460.
 
GPU architecture and the manufacturing process are independent. Just because the HD 6000 uses the same 40nm manufacturing process doesn't mean it is "just" a minor tweak of the HD 5000 architecture. Until we see details on the HD 6000's architecture, you guys are speculating out your a$$ about "rebranding".
 
princeton said:
HD 6000=HD 5000 rebrands with more cores. How exciting -_-

Not exactly. One part of the GPU has an upgraded architecture while the other part is the same, and together it is still a 40nm GPU. Since TSMC's 28nm process isn't ready yet, there is no place to make a next-generation GPU right now; you can't just build the fab yourself :S. What's more, the 40nm process is well established by now, which will lead to greater YIELDS for the HD6000, so they won't be as expensive. But even HD5000 yields were fine, since ATI didn't develop as radical and big a GPU as nVidia did; it wasn't TSMC's 40nm process that was at fault, it was nVidia's chip. Journalism always mixes up the context.

But don't expect the HD6000 to be like 100 cheaper. It may be at least a bit cheaper, since development costs aren't as big this time, but because nVidia has nothing, AMD can keep prices steady and take more profit regardless of whether the card is cheaper to make.
 
...this will lead to greater YIELDS for HD6000 , they wont be as expensive , but even HD5000 yields were fine since ATI didn't develop such a radical and big GPU as nvidia did ...
The obvious flaw in the argument is that, as you've pointed out, HD5000 series yields are good, thanks to AMD's experiment with 40nm on the HD 4770.

Now, since the HD 6000 series is expected to have larger dies (due to the need for better tessellation performance and an increased shader count), that obviously means there will be fewer dies per wafer, which means increased cost per die.
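
The fewer-dies-per-wafer point can be sketched with the classic first-order estimate. The die sizes below (roughly Cypress-sized vs. a hypothetical larger HD 6000 die), the wafer cost, and the yield figure are all illustrative assumptions, not actual TSMC numbers:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """First-order estimate: gross dies on a circular wafer minus an
    edge-loss term. Assumes roughly square dies; real layouts differ."""
    r = wafer_diameter_mm / 2
    gross = math.pi * r * r / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

def cost_per_good_die(wafer_cost: float, dpw: int, yield_fraction: float) -> float:
    """Wafer cost spread over only the dies that actually work."""
    return wafer_cost / (dpw * yield_fraction)

# Illustrative comparison on a 300mm wafer; $5000/wafer is a placeholder.
small = dies_per_wafer(300, 334)   # ~Cypress-sized die
large = dies_per_wafer(300, 389)   # hypothetical bigger die
print(small, large)                              # bigger die -> fewer candidates
print(cost_per_good_die(5000, small, 0.7))
print(cost_per_good_die(5000, large, 0.7))       # -> higher cost per good die
```

Same wafer, same yield: growing the die shrinks the candidate count, so each good die carries a larger slice of the wafer cost.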

There is no reason to suppose that the transition from the HD5xxx to the HD6xxx series is going to differ from the HD4xxx-to-HD5xxx transition (or, for that matter, HD3xxx to HD4xxx): the incumbent series gets priced down while the new series is priced at a premium, at least until the full HD6xxx model range has launched and is available in retail quantity, sometime during Q1 2011. This seems all the more probable since nVidia is unlikely to offer any "new" (non-GF104) models in competition.
 
The 40nm process being used again doesn't mean that ATI can't double the overall power of this generation's GPUs once again. IIRC, a die shrink mostly brings greater energy efficiency and easier overclocking anyway. As long as the HD 6xxx series lowers the price on the HD 5xxx series so I can finally buy one, that is what I care about. :p
 
Christmas-time marketing... I thought ATI had elevated its production and marketing beyond this. Do you think an announcement in August will motivate buyers to wait until December... the real date when these chips can be in our hands?
 
AMD obviously doesn't have anything in the short term to combat the positive reception of the GTX 460 (probably to be joined by the GTS 455/450/440 and GTX 475 in the near future) except AIB custom HD5xxx boards, so it's hardly surprising that it would want to promote its brand at a time when the current series is starting to look "so last year". nVidia will in all probability likewise wheel out the spin machine when the HD6xxx launches, since its next series will likely be on TSMC's 28nm process (sometime in Q3/Q4 2011 by all accounts).
It seems that AMD and nVidia have settled into an informal agreement not to tread on each other's toes with regard to price/performance segmentation of the market, at least for the moment. The only direct competition in the marketplace is the GTX 460 768MB vs. the HD 5830, but since the 5830 has never been overly abundant, well received by press and consumers, or compelling on performance (compared with the 5770 and the 4870 1GB/4890), AMD seems content to lose that particular battle rather than upset pricing of the 5850/5870/5970.
 
My only interest is whether it will drive down prices of the GTX460 come christmas time, since I plan to hopefully purchase one. :D
 
I wouldn't hold your breath on that one.
AMD will likely release the HD 6870 and 6850 first, and they will be priced higher than the cards they are intended to replace. It's quite possible, if nVidia doesn't upset AMD's present pricing, that the HD 6850 could retail at HD 5870 1GB prices (the 1GB version should be EOL'ed in favour of the 2GB version) and the HD 6870 could end up retailing midway between the aforementioned HD 5870 2GB and the HD 5970, so the mainstream-enthusiast sector could look like this:
GTX 460 1GB -> HD 5850 -> GTX 470/475 -> HD 5870 2GB/HD 6850 -> GTX 480 -> HD 6870 -> dual GPU.
The GTX 460 will most likely fall in price if the HD 6770 (or a possible salvage part, the 6830) equals or surpasses it in performance, price, or both (quite possible), but AMD's lower mainstream cards aren't slated for introduction until sometime in the new year.
 
@DBZ
Cheers for the explanation. I didn't think it would make a difference, but I'm intent on getting a GTX460 anyway.
 