AMD Radeon RX 6900 XT Review: Can AMD Take the Performance Crown?

Intel has no 5nm fab. It's already been reported that Intel will be using some of TSMC's 5nm capacity, starting in 2022.

I don't know Intel's plans, but they can certainly fab all their 5nm chips at TSMC, now and forever, without ever shuttering any of their current facilities. The gig very well may last forever.
The gig *may* last forever, but how realistic is this unless Intel gives up their own fabs? Which I bet they will not do.
Intel has historically tied their chip design and nodes together. This has given them a big advantage. Right now, they are having problems but eventually they could solve them.

Yes, Intel does not have a 5nm node, but since I kept hearing ad nauseam that their nodes are better than competing nodes with the same nm designation, a working Intel 7nm EUV node should be just as good as TSMC 5nm.

Lastly, AMD is a safe customer. Their sales are limited by how much TSMC can produce for them. Would it be wise to hurt that customer by reducing their volume for a short-term gig with their competitor? What does TSMC have to gain, seeing that they are capacity constrained? Is Intel going to pay more for an i3 wafer than AMD for an Epyc wafer?

That's the thing: every CPU, NIC... that Intel does not sell is one more that a TSMC customer sells.
 
Intel has historically tied their chip design and nodes together.
If you mean that literally, it's false: Intel's tick-tock model was predicated on decoupling design from the process node. And if you mean it more generally, as in Intel prefers to manufacture its own chips, well, then so did AMD for the first 40 years of the company's existence. Times change. Historically, PC processors used to rate the latest and greatest fab nodes. Today, it's cell phones first, then GPUs, CPUs, and memory in a three-way tie for second place. Intel can no longer maintain process primacy off its CPUs alone.

A working Intel 7nm EUV node should be just as good as TSMC 5nm.
Possibly. However, that Intel node isn't coming online until early 2023, by which time TSMC will already be churning out 3nm chips, more than twice as dense as even the best estimates of Intel's 7nm process.

Is Intel going to pay more for an i3 wafer than AMD for an Epyc wafer?
The answer to that is: yes, apparently so, given that Intel will indeed be producing the i3 on TSMC's 5nm node.

TSMC to Reportedly Fab Intel's Core i3 CPUs in 2022 on 5nm EUV Process
 
I think there are two main reasons for the 6900XT to exist:
- show that AMD can make a high end card
- to keep nVidia from charging too much in the high end while lowering prices in the low and mid range to combat AMD's cards.

You could see this last gen, where prices were dropped across the board and Super models were introduced to combat the 5000 series, but in the 2080+ territory not much changed.

Bingo!

People were asking for an AMD halo card that needed to be super expensive and good, just for them to end up buying a $200 card, because that's what they can afford.

Talking about price, I think AMD made a mistake in pricing their new cards so high.

Each one of these cards should be around $150 to $200 cheaper, and let Nvidia keep fleecing its rabid followers who no longer care about their money.

I simply refuse to pay $1K for a GPU or CPU. Same applies to the stupid "flagship" cell phones.
 
Hey Steve and Tim! At the beginning of the article, it appears that you incorrectly stated that both the RTX 3080 and the RTX 3090 have a TDP of 320W. For anyone who does not know, the RTX 3080 does indeed have a TDP of 320W, but the RTX 3090's TDP is 350W.
 
"That’s going to do it for our Radeon RX 6900 XT review. In short, don’t buy it, doing so will simply ensure that the next GPU generation is even more expensive. Then again, we said that about the RTX 3090 and they’ve been selling like hotcakes, so we guess budget gaming is doomed."
TechSpot wrote that in the last few lines, then they list those two cards a few lines later with a link to buy from the big A.
 
"We're sure you've noticed over the years how mainstream video gaming has become, if you're under the age of 30 and don't play video games, you're now considered an outlier."
Oh, I've noticed. When I was under the age of 30 and playing video games, I was considered the outlier, often by the very parents (from my generation) of the kids who have now turned "gaming" into a snobbish social status and an obsession. I come from the generation that basically made and programmed all the software and video games that these noobs and computer illiterates, and way too often functional illiterates, are now excitedly consuming without thinking.
 