I would like everyone to step back and remember that you can already get a 7900 XT for $500. In this special promotion, they throw in a free motherboard, CPU, RAM, storage, case, power supply, and controller.
This special deal is called an Xbox Series X.
Lol, the GPU in the Xbox Series X = a 7900 XT since when?
It delivers similar frames when comparing the same games, resolution, and quality settings. This has been known for quite some time.
The poll doesn't specify that the upgrade needs to be just one generation newer than the current card. Most gamers wait two generations.
AMD will be reluctant to do this -- the operating margin in its Gaming sector isn't great (16% in Q4 2022, 14% for the whole year). The bulk of that sector's revenue comes from PS5, Steam Deck, and Xbox Series X/S SoC sales, but given that these are already low-margin products, significantly dropping the prices on dGPUs will only hurt that sector even more.
That said, at least it actually made some operating income in Q4 -- the Client sector (i.e. Ryzen) actually made a loss.
The polls make for interesting reading and I feel that a reasonable portion of people have unrealistic expectations concerning performance improvements with new GPUs. An increase of 50 to 60% would require substantial changes in architecture design and chip size.
I had a 1080 Ti and skipped the 2xxx series. Now I have a 3080 (12GB) and I definitely got at least a 60% increase!
The 1080 Ti was six years ago and was a $600-ish card. You probably got four years of use out of that card before the 3080. Was it a whole new PC, or did you put a 3080 into an 8th-gen Intel / first-gen Ryzen system?
New PC, 9700K (OC'd to 5 GHz across all 8 cores), 2019. The 3080 is doing quite well at 3440x1440.
The 6800 XT, 7900 XT, and 7900 XTX are the only cards worth buying right now at the high end, unless you really want RT. <$600, $800, and $1,000. Thanks for the comparison, TS.
I believe the 6950 XT is also a great card that can be had today below $599 with a CPU combo.
Cost breakdown for what? Dies cost cents to make. Millions are produced, and whatever is good in the batch gets used. However, more dies are bad than good, so if you want to break down the cost of a card, it's probably around $25-50 to produce and ship a single card. But what you don't see in the making is all the dies that were bad, the employees on the line doing testing who need to be paid, or even the engineers who develop the cooling or the look of the card. So yes, card prices need to be increased to make up for the losses and still turn a profit. You can't look at just the cost of the card, but at what actually goes into making it!
You must have lost your sanity if you believe that a modern GPU die on a recent manufacturing process costs pennies. Such a high-end chip costs hundreds of dollars.
The list price for 5nm is $17k. Big players (Apple, AMD, Nvidia...) will get a big discount. For the 4080, Nvidia is getting about 180 chips off each wafer. I suspect Nvidia pays about $10k a wafer due to their relationship, volume, and future plans to produce at 3nm and beyond. Let's also say that only 90% of chips are functional for a 4080. That would put each GPU chip at around $60. The RAM, board, cooler, packaging, assembly, and shipping probably push the total cost of goods up to $300 for a 4080. That leaves $900+ in pure profit to distribute between Nvidia, the board partner, and the retailer.
A 5nm wafer should already cost close to $20k
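For what it's worth, the arithmetic behind that ~$60 figure is easy to reproduce. The sketch below simply re-runs it with the assumptions made in the comments above -- 180 candidate dies per wafer, 90% of them usable, and a wafer price somewhere between the guessed $10k and the ~$17k-20k list estimates. None of these inputs are confirmed figures.

```python
# Rough per-die cost arithmetic using the figures assumed in the comments above
# (wafer price, candidate dies per wafer, and yield are all guesses, not confirmed numbers).

def cost_per_good_die(wafer_price, candidate_dies, yield_rate):
    """Spread the full wafer price over only the dies that turn out usable."""
    good_dies = candidate_dies * yield_rate
    return wafer_price / good_dies

candidate_dies = 180   # assumed AD103 candidates per 300 mm wafer
yield_rate = 0.90      # assumed fraction usable for a 4080

for wafer_price in (10_000, 17_000, 20_000):
    cost = cost_per_good_die(wafer_price, candidate_dies, yield_rate)
    print(f"${wafer_price:,} wafer -> ${cost:,.0f} per good die")

# ~$62 per die at $10k/wafer, ~$105 at $17k, ~$123 at $20k --
# so the silicon estimate swings heavily on what Nvidia actually pays per wafer.
```

Even under these assumptions, the bare die is a small fraction of a $1,199 card; the disagreement below is over whether the other inputs (wafer price, dies per wafer, yield, packaging, and testing) are being lowballed.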
There is no 'list price' -- the $17k figure that one can find on the Internet comes primarily from one source's well-reasoned estimate, but it's an estimate nonetheless. TSMC may well offer a discount, but that's likely to be offset by the factors Nvidia itself has pointed to, such as the rising cost of leading-edge wafers.
The AD103 in the 4080 just has two TPCs disabled -- the rest of the die is fully functional. To have a 90% yield like that would be nothing short of miraculous, but if it is that high, then Nvidia wouldn't need to pre-order as many wafers as it would if the yield for a 4080 die was, say, 70%, and thus wouldn't get much, if any, of a discount.
The GPU dies are initially assembled, packaged, and tested by contracted companies, such as Amkor Technology. I mention this particular company because it's one of the few to provide a decent amount of financial data -- for FY2022, its revenue was $7 billion, with 86% of that being for packaging and the rest for testing, at an overall gross margin of 19%. The cost of the work that Amkor and other packaging & testing companies do needs to be added to TSMC's wafer costs. (I only mentioned Amkor, but the other firms Nvidia uses report similar revenue figures, though rarely anything about margins.)
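To put a rough number on those objections, here is a minimal sketch assuming AD103's published ~378.6 mm² die area, a 300 mm wafer, and the $17k wafer estimate discussed above; the yields are illustrative guesses. It checks the "about 180 chips off of each wafer" claim with the usual dies-per-wafer rule of thumb and shows how sensitive the per-die cost is to yield.

```python
# Sanity check on the dies-per-wafer and yield assumptions used above.
# Die area is AD103's published ~378.6 mm²; wafer price and yields are assumptions.
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Rule-of-thumb estimate: gross wafer area over die area, minus an edge-loss term."""
    r = wafer_diameter_mm / 2
    return math.pi * r**2 / die_area_mm2 - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)

ad103_area = 378.6
candidates = dies_per_wafer(ad103_area)
print(f"Candidate AD103 dies per 300 mm wafer: ~{candidates:.0f}")  # ~152, a bit below the quoted 180

wafer_price = 17_000  # assumed; see the 'list price' discussion above
for yld in (0.9, 0.7):
    print(f"yield {yld:.0%}: ~${wafer_price / (candidates * yld):,.0f} per usable die")

# At 70% yield the silicon alone is already around $160 per die -- before packaging,
# testing, memory, board, cooler, and everyone's margin are added on top.
```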
Nvidia's margins should be at least double AMD's at this point. Note that the 4070 has only 12 GB of VRAM and uses a small chip compared to the competitor.
The 4070's competitor is not yet released.
Sorry, I meant the 4070 Ti, which competes with the 7900 XT.
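For context on the die-size point, here is a back-of-the-envelope area comparison. The figures are publicly listed die sizes treated as approximate (AD104 ~295 mm²; Navi 31 GCD ~304 mm² and MCDs ~37 mm² each), and it ignores that the MCDs sit on the cheaper N6 node, so area alone does not translate directly into cost.

```python
# Rough silicon-area comparison behind the "small chip" remark.
ad104 = 295           # RTX 4070 Ti: monolithic AD104 on TSMC 4N, approx. area in mm²
navi31_gcd = 304      # RX 7900 XT: Graphics Compute Die on N5
navi31_mcds = 5 * 37  # five active Memory Controller Dies on N6

print(f"RTX 4070 Ti silicon: ~{ad104} mm²")
print(f"RX 7900 XT silicon:  ~{navi31_gcd + navi31_mcds} mm² ({navi31_gcd} GCD + {navi31_mcds} MCDs)")
# ~295 mm² vs ~489 mm², though the per-mm² cost differs between the N5/4N and N6 dies.
```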
$750, really? More like $600. Seriously. I wish there was a cost breakdown per card of what it costs AMD to produce one. You would be utterly shocked.
IMO it's a $500 card at best. The prices are so damn inflated that we are normalizing these crazy prices.
I know there is a lot of overhead involved with that -- staff, R&D, everything. But the more cards they sell, the less it costs per card.
Except the 7900 XTs are the defective dies from producing Big Navi 31 for the XTX line - so in many ways they are free. One of the side benefits of chipletting GPUs.
Until my paycheck gets adjusted for inflation, those are still $800 real dollars.
Amen, brother.
Nothing to do with chiplets -- that's simply chip binning, and it's been done with GPUs since the mid-90s, though defective dies were just run at lower clock speeds at that time (disabling defective sections for other products didn't really start until the early 2000s). Given that the manufacturing, packaging, testing, and distribution costs for every die from a wafer are exactly the same, it's somewhat of a flawed argument to suggest that they're free in some ways.
The use of chiplets breaks up the GPU into a bunch of tiny chiplets. One tiny flaw on the wafer only impacts a tiny chiplet, so only 20-60 mm² of wafer are lost, instead of a whole 300 or 600 when the die is a single rectangle on the wafer.
The Navi 31 comprises two types of chiplets, only one of which is tiny: the Graphics Compute Die, which is 304 mm² in area, and multiple Memory Controller Dies, which are 37 mm² each. The 7900 XT has a partially disabled GCD (12 CUs are non-functional), and out of the six available placements for MCDs, five are filled and the sixth is fitted with a blank die for heatsink load distribution. The GCD is reasonably large -- roughly the same size as the Navi 22, but a lot smaller than the Navi 21, due to the L3 cache and memory controllers being in separate chiplets.
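If it helps to quantify the trade-off being argued here, below is a minimal sketch using the simple Poisson defect-yield model applied to the die areas mentioned above. The defect density used (0.09 defects/cm²) is an illustrative assumption for a mature node, not a disclosed TSMC figure.

```python
# Minimal sketch of why small dies yield better: Poisson model Y = exp(-area * defect_density).
# Defect density is an assumed, illustrative value; die areas are the ones discussed above.
import math

def poisson_yield(die_area_mm2, defects_per_cm2=0.09):
    """Probability a die has zero random defects under a Poisson model."""
    return math.exp(-(die_area_mm2 / 100) * defects_per_cm2)

for name, area in [("Monolithic ~520 mm² die (Navi 21-sized)", 520),
                   ("Navi 31 GCD, 304 mm²", 304),
                   ("Navi 31 MCD, 37 mm²", 37)]:
    print(f"{name}: ~{poisson_yield(area):.0%} of dies defect-free")

# Roughly 63% vs 76% vs 97% under these assumptions. The catch, as noted above, is that
# binning already recovers many defective monolithic dies as cut-down SKUs, and every die
# still incurs the same packaging, testing, and distribution costs.
```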