Sources say AMD didn't order enough 40nm wafers for Evergreen


Matthew DeCarlo


According to several unnamed sources cited by Fudzilla, the shortage of AMD's Radeon HD 5000 ("Evergreen") graphics cards is largely because the company didn't order enough 40nm wafers from Taiwan Semiconductor Manufacturing Company (TSMC). This runs contrary to the earlier belief that AMD was shipping plenty of cards and demand was simply overwhelming. The anonymous sources say that Nvidia had already ordered a significant portion of TSMC's 40nm wafers for its mobile chips.

Regardless of what is causing the shortage, AMD says availability will improve in the weeks and months to come, and in truth, the company is probably in a position to take its time. While I don't doubt that at least a few sales are being lost to the shortage, Nvidia has yet to launch its competing DirectX 11 line, and there are few titles on the immediate horizon that employ the technology. Fudzilla speculates that despite the Evergreen shortage, this may be AMD's best quarter in years.


 
Fudzilla does its best work with 'unnamed sources.' I trust that site about as far as I can throw my neighbor's wife.
 
The title is incorrect. It should be 300mm, not 40nm, wafers. Leave it to Fudzilla to not understand the difference between millimeters and nanometers. 40nm is smaller than a speck of dust and not visible to the naked eye.
 
Maybe this is a case of artificial shortage in order to keep prices high while nvidia does not release their new gpu generation. Then prices will certainly decrease to keep competition hardened.
 
Guest said:
The title is incorrect. It should be 300mm, not 40nm, wafers. Leave it to Fudzilla to not understand the difference between millimeters and nanometers. 40nm is smaller than a speck of dust and not visible to the naked eye.

While I don't altogether trust the source of this article, they are actually correct in at least the wafer listing. The Evergreen GPUs use 40nm silicon technology, down from the 55nm AMD was using previously. The diameter of the wafer (like 300mm) does matter to a degree, to be sure it fits the manufacturing process. But wafer diameter does not matter nearly as much as the thickness for the process; that is the vital statistic that defines wafers. If it didn't matter, all of the chip manufacturers would be listing the length and width of their chips as the primary statistic, not the silicon thickness rating.
 
It should be fine to describe the wafers as 300 mm or 40 nm. In each case you are describing a different aspect of the chips.
 
40nm refers to the process (feature) size, not the wafers themselves.

Guest is correct; the wafer size is 300mm. But most people don't really care about the semantics, since the general idea is about a silicon shortage, which everyone will get.
 
"The diameter of the wafer (like 300mm) does matter to a degree, to be sure it fits in the manufacturing proces. But wafer diameter does not matter nearly as much as the thickness for the process, that is the vital statistic that defines wafers. If it didn't matter, all of the chip manufacturers would be listing their length and width of their chips as the primary statistic, not the silicon thickness rating."

You do understand that wafer size doesn't matter when it comes to the process technology used, right? I'm not sure you understand the difference between 300mm and 55nm, 45nm, 40nm, etc. When people say 45nm or 40nm, they are describing the process technology used to "etch" features onto a silicon wafer. The reason 300mm is important to a wafer is because the larger the size the dies you can put on it. You can make 40nm processors on 200mm wafers if you want. But 300mm will allow you to have more processors on a single wafer, thus lowering your cost per processor.
 
Sorry, it should state "The reason 300mm is important to a wafer is because the larger the size the more dies you can put on it"
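
To put rough numbers on the dies-per-wafer point: here is a minimal back-of-the-envelope sketch in Python, using the common gross-die approximation. The ~18mm x 18mm die is a hypothetical stand-in of roughly Cypress size, not an official figure, and yield/defects are ignored.

    from math import pi

    def gross_dies_per_wafer(wafer_diameter_mm, die_side_mm):
        # Rough candidate-die count: wafer area / die area, minus an
        # edge-loss term for partial dies straddling the wafer's rim.
        die_area = die_side_mm ** 2
        wafer_area = pi * (wafer_diameter_mm / 2) ** 2
        edge_loss = (pi * wafer_diameter_mm) / (2 * die_area) ** 0.5
        return int(wafer_area / die_area - edge_loss)

    for diameter_mm in (200, 300):
        print(diameter_mm, "mm wafer:",
              gross_dies_per_wafer(diameter_mm, 18.0), "dies")

This prints roughly 72 dies for a 200mm wafer and 181 for a 300mm wafer, about 2.5x rather than the plain 2.25x area ratio, because edge loss costs smaller wafers proportionally more.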
 
Heh... I think my brain began the weekend a day early (Note to self: Do not comment or post prior to coffee intake). Wafer thickness is typically a set constant, determined by the diameter of the wafer... I was thinking of the lithography and layering processes, which determine the resolution of the IC components... The silicon wafer substrate really has no bearing on that resolution.

At least I waited till Friday for a bonehead post :)
 