AMD argues Radeon graphics cards offer better performance & power consumption per dollar...

I've been an Nvidia customer since the mid-'90s and dropped them because they treat me like ****. None of my PCs have their cards anymore. Keep simping.
You’ve assumed I’m simping for Nvidia because I called out Radeon for the festering heap of garbage that it is?

All AMD have done is give Nvidia another chance. That’s not “simping”.
 
"You might also notice the RX 6800 XT is down as having a 280W TBP, yet the official specs say it is 300 Watts."

This is correct. I generally see my 6800 XT using 225-280 watts at 3440x1440 @ 144 Hz at reference clocks. Only the AIB models that come factory overclocked would hit 300 watts.
 
RTX 40 and RDNA 3 will be interesting. While shortages plagued the 30 series and RDNA 2, supply was only part of the reason most people couldn't find them; the other part was mining. In fact, I'd wager that if it weren't for Ethereum mining, the GPU shortages would have been short-lived and pricing would have stayed close to MSRP for most of the past two years. When the next generation of cards hits the market, unless something changes, mining will be much less of a concern. That means a likely return to form, with high-end cards way out of most gamers' budgets and gamers really looking to those mid-range cards. My guess is that production of high-end cards will be cut back considerably and AMD/Nvidia will both need to focus on mid-range cards. I also think that while MSRPs will initially be the same or greater than the previous gen, you will start to see below-MSRP sales again 3-6 months after a GPU launches. All of this assumes, of course, that mining remains a non-factor.
 
Isn't that how AMD marketed Bulldozer and its derivatives?
Better value from a core-count and clock-speed perspective? Didn't change the fact that they were really poor-performing CPUs, did it? I remember hearing a rumor that AMD marketing was considering counting the iGPU cores in the total core count for those processors.

It's pretty obvious that the success of the Zen generation of CPUs hasn't taught them anything. Results talk and BS walks. Want to take on Nvidia like they did Intel? Do the same thing: bring a compelling product to market. A Yugo might have been a really good value, but it was still a cheap POS car that wasn't worth spending money on...
 
Whoa, AMD is beginning to sound like nVidia and Intel....and stealing a page from their PR BS too!!
 
"$400 RX 6650 vs $430 RTX 3060". Me looks up real-world prices = £440 RX 6650 vs £380 RTX 3060. Closes dumb clickbait infomercial and moves on...
 
This is the same as the “horsepower per liter” you see thrown around by car people when they’re fanboying something and it comes up short. If this is super important to you, I guess it might help sway your purchasing decision, but I can’t possibly care any less about this, and I honestly don’t know of anyone who does.
 
Inflation is real; you can't expect the midrange to stay at $250 forever. I bought my GTX 1060 around that price point almost 6 years ago.
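Just to put a rough number on that, here's a minimal Python sketch of the compounding; the 3%/year average rate is my assumption for illustration, not actual CPI data:

```python
# Back-of-envelope: compound a 2016-era price forward at an assumed
# average inflation rate. 3%/year is a hypothetical placeholder.
def inflate(price: float, annual_rate: float, years: int) -> float:
    """Compound `price` forward by `annual_rate` for `years` years."""
    return price * (1 + annual_rate) ** years

# A ~$250 GTX 1060 bought 6 years ago, at an assumed 3%/year:
print(f"${inflate(250, 0.03, 6):.0f}")  # -> $299
```

So even at a modest assumed rate, that $250 midrange slot drifts toward $300 on inflation alone.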

RTX 2080 Ti die size: 754 mm² + GDDR6
RTX 3080 Ti (and even the 3090 Ti) die size: 628.4 mm² + GDDR6X

Thanks to the die shrink and optimization, as well as GDDR6X now costing even less than GDDR6 did back then, the hardware on the newer boards is actually cheaper to produce. Not only that, but the 3080/3090 die allows more models and binning options in case of bad chips. Even so, the cards are much more expensive.
 
RTX 2080 Ti die size: 754 mm² + GDDR6
RTX 3080 Ti (and even the 3090 Ti) die size: 628.4 mm² + GDDR6X

Thanks to the die shrink and optimization, as well as GDDR6X now costing even less than GDDR6 did back then, the hardware on the newer boards is actually cheaper to produce. Not only that, but the 3080/3090 die allows more models and binning options in case of bad chips. Even so, the cards are much more expensive.
You realize that the die shrink is expensive, right? Foundries charge more for 5nm than 7nm...
 
Ok, so... I don't really see what this article is about. Yes, AMD claims more performance-per-dollar because that's what the market has shown. Granted, prices are all over the place, but they're all over the place for nVidia as well, so it's still a fair playing field. The truth is that the best values on video cards have been AMD cards for quite a while. Don't forget that the RX 5700 XT was Steve Walton's favourite card of that generation. It sure wasn't the top performer of its generation, so its value proposition was what made it special.
"If you're looking for maximum value it's the Radeon RX 5700 series that delivers."
- Steve Walton, October 22, 2019

$400 GPU King: Radeon RX 5700 XT vs. GeForce RTX 2060 Super

Now, granted, prices have been insane, and AMD doesn't offer better performance-per-dollar in all cases if the cards are at MSRP, but MSRP has been completely irrelevant for both Radeon AND GeForce cards. When I was looking at the RX 6800 XT and the RTX 3080, the RX 6800 XT was by far the better value. At 1080p, the cost per frame (CPF) for the 6800 XT was (at the time of the test) $3.29 while the 3080's was $3.76. At 1440p, the CPF for the 6800 XT was $4.14 while the 3080's was $4.57. Finally, at 2160p, the CPF for the 6800 XT was $6.98 while the 3080's was $7.14. I haven't done the analysis for the other cards, but at least in this case, AMD ain't lying. This was one of the major factors in my choosing the RX 6800 XT.
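For clarity, CPF here is just street price divided by average FPS. A minimal sketch; the $650 and 157 fps inputs are made-up placeholders, not the actual data behind the figures above:

```python
# Minimal cost-per-frame (CPF) sketch. Price and FPS below are
# hypothetical placeholders for illustration only.
def cost_per_frame(price_usd: float, avg_fps: float) -> float:
    """Dollars paid per frame of average performance."""
    return price_usd / avg_fps

# Hypothetical example: a $650 card averaging 157 fps at 1440p
print(f"${cost_per_frame(650, 157):.2f}/frame")  # -> $4.14/frame
```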

It's a known fact that the RX 6800 XT and RTX 3080 are pretty much neck-and-neck when it comes to performance:
"As it's often the case, depending on the game and even the quality settings used, the RX 6800 XT and RTX 3080 trade blows, so it's impossible to pick an absolute winner, they're both so evenly matched."
- Steve Walton, November 15, 2020
AMD Radeon RX 6800 XT Review: RX 6800 XT vs. RTX 3080: Fight!


From everything I've read, the RX 6800 XT always uses less power than the RTX 3080 so that would mean that, at least in the case of the high-end cards, AMD does deliver more performance-per-watt.
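Performance-per-watt can be sanity-checked the same way as CPF; a quick sketch, again with made-up placeholder numbers:

```python
# Companion to the cost-per-frame sketch above: efficiency as frames
# per watt. Both inputs are hypothetical placeholders.
def frames_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Average FPS delivered per watt of board power."""
    return avg_fps / board_power_w

# Hypothetical example: 157 fps at 300 W of board power
print(f"{frames_per_watt(157, 300):.2f} fps/W")  # -> 0.52 fps/W
```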

So, sure, I wouldn't believe AMD's chart because it surely is cherry-picked but name me one company that DOESN'T do that (I won't hold my breath). However, what they claim does seem to be overall true.
 
You realize that the die shrink is expensive, right? Foundries charge more for 5nm than 7nm...
Foundries charge almost as much for a new 5 nm node as they did for 7 nm when it was new! That's the story with every new manufacturing process: "the new one is more expensive." And on 5 nm you get more chips per wafer and more binning possibilities than on 7 nm. So there's less waste, and each die is *at least* no more expensive than the older one (despite what companies say to justify price increases).
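To illustrate the chips-per-wafer point, here's a minimal Python sketch using the standard dies-per-wafer approximation. The formula is textbook; applying it to the die sizes quoted earlier in the thread is my own illustration, and it ignores yield, scribe lines, edge exclusion, and per-node wafer pricing:

```python
import math

# Rough dies-per-wafer estimate for a 300 mm wafer: gross wafer area
# over die area, minus an edge-loss term for partial dies at the rim.
def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    radius = wafer_diameter_mm / 2
    wafer_area = math.pi * radius ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

print(dies_per_wafer(754.0))   # 2080 Ti-class die  -> ~69 candidates
print(dies_per_wafer(628.4))   # 3080 Ti-class die  -> ~85 candidates
```

Smaller dies mean noticeably more candidates per wafer, which is why a shrink can offset a higher per-wafer price.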
 
Foundries charge almost as much for a new 5 nm node as they did for 7 nm when it was new! That's the story with every new manufacturing process: "the new one is more expensive." And on 5 nm you get more chips per wafer and more binning possibilities than on 7 nm. So there's less waste, and each die is *at least* no more expensive than the older one (despite what companies say to justify price increases).
Cost is still more... on every die shrink... you also have to redesign around it... you think R&D is free?
 
Cost is still more... on every die shrink... you also have to redesign around it... you think R&D is free?
If the norm is 10 nm, you pay extra for 7 nm; once 7 nm is the norm, you pay extra for 5 nm. But those costs don't add up! Especially now that most designs are very similar and most improvements come from the node plus small optimizations. Only Apple must have giant R&D costs, since with each generation the speed increase is big and the energy consumption goes down.

Just take a look at Qualcomm/Samsung: the newer chips are almost the same as the old ones. And see the Qualcomm "for Windows" chips: light years behind any x86 or Apple ARM chip.
 
Especially when Nvidia has a huge budget allocated to YouTubers and tech sites, who use all kinds of tricks to keep the Nvidia name present.

For example, notice how YouTubers always have an RTX 30 GPU visible in their videos, even when the video has nothing to do with it. The best example I saw was a cell phone review where the 'Tuber mentioned the "power" of the 3090 Ti for absolutely no reason whatsoever.
Just like on CNN or other news channels when they interview their various "experts" remotely: they all show a bookshelf in the background filled with what look to be brand-new, unopened books related to the subject being discussed, with one or two clearly visible titles specifically about the topic at hand, such as Quotations from Chairman Mao Tse-tung, China's Economy, The Communist Manifesto, etc.
 
You misunderstand me. Your lobotomized toy computers for children and de facto children are holding back whatever potential in terms of performance and complexity games could have. I wasn't recommending consoles in the slightest; I'm just a realist about the shitty situation their existence has imprinted on the industry.
He was trolling or mocking. Like "If you can't afford it, get a job" or "If you're poor, get rich!" and stuff like that, usually written by murikan kids.
 