Guys like Avro are very disingenuous with their analysis because they are fanboys, just look at his comment history, he argues with anyone who says anything slightly negative about an AMD product but is more than happy to **** on Nvidia/Intel.
What analysis? All I did was post what YOU wrote. I didn't analyze ANYTHING.
Now, I can understand why you think that I'm a fanboy, Steve, but the truth is this:
1) I do hate Intel and nVidia for their actions in the past and the present. I've never tried to hide that fact because, no matter what, I always try to be 100% honest. I have a good reputation for that and it was well-earned. I won't say anything that I can't back up with data. When I say something, it's not because of what I want to be true, it's always because of something that I see to be true.
If you look at my posts that are about you specifically, you'll find that they're overwhelmingly positive, but I call a spade a spade and when I see something wrong, I say so. You and I have been around PC tech for decades, and with you being far more involved than me, I have a hard time believing that you didn't see the 8GB as a serious red flag; we both remember when AGP cards had no more than 256MB on them. I don't think that making an 8GB card today is necessarily bad in concept, but if the GPU on a card is potent enough that the 8GB will be a hindrance, that makes the card an objectively bad product unless it's sold dirt-cheap (and those cards were the opposite of dirt-cheap).
I decried nVidia's use of 8GB because it was put on expensive cards with objectively potent nVidia GPUs. I never said anything bad about the 8GB on the RTX 3050 or the RX 6600/50/XT because more than 8GB would have been a waste on those cards and would only have made them more expensive; their GPUs aren't potent enough for 8GB to be a limiting factor. I also panned the RX 6500 XT and trolled it by always following its name with "(I still don't understand why AMD gave that card the 'XT' suffix.)". Does that sound like something a fanboy would do?
2) I'm not a fanboy of AMD because I don't "love" AMD, in fact I don't really even "like" AMD. All that AMD is to me is a method of being involved in PC tech without having to use Intel or nVidia parts where I can avoid it. Fortunately, for my uses, I am always able to avoid it. If VIA released a new x86 CPU or a new S3 GPU, I would be totally interested in them. I'm not a fanboy, I'm a hater and my hate has been well-earned by Intel and nVidia. I am not delusional however and I don't see AMD as a saviour. I don't want AMD to win, I want AMD to achieve parity, nothing more. Does that sound like something a fanboy would say?
3) If AMD managed to reach parity in the markets with Intel and nVidia, I would stop exclusively using their parts because the main reason that I use them would be achieved. I want parity in the markets because that's what would benefit all of us consumers. Having said that, buying nVidia or Intel won't help anyone in that quest one bit. To me, Intel and nVidia have wounded the PC market, and until they have serious competition, that wound won't heal. I don't think that AMD makes better products than Intel or nVidia, but I do think that spending money on AMD parts is usually (not always) more beneficial to the consumer doing the spending because they get more for their dollar. For people who are rich and therefore don't care about that, sure, it makes no difference what they spend on. I'm focused on the average Joe, and the average Joe doesn't have a lot of disposable income like we did 20 years ago. The CPU market is lopsided and the GPU market is horribly lopsided.
Do you remember back in the 90s, when the video card market actually had some semblance of parity and new cards were getting released almost monthly by companies like ATi, nVidia, 3dfx, Matrox, S3, Diamond and Orchid? THAT is what I want to see again and the only way to get it is for the duopoly that we're stuck with to be no more than 60% of the market on either side. Just imagine what it would be like if that were to occur. Then imagine what it would be like if AMD decided to just pack it up because some executive decided that the consumer market was a waste of time at that point. It would make the prices we see now seem like bargains in comparison to that reality.
The truth is that I would love to see Intel gain traction in the GPU market (really, I totally would!) but only at the expense of nVidia, because replacing Radeon with Arc won't improve anything. If Intel is to advance, it has to be at nVidia's expense; nVidia can afford to lose that market share while AMD can't. I want to see a three-way free-for-all between GeForce, Radeon and Arc. Wouldn't that be just incredible, Steve? Isn't that something worth wanting, something worth having?
Hell, I honestly wish that VIA would re-enter the CPU market, but I know that they can't afford the R&D to compete with AMD and Intel. If they DID enter the market, I would happily abandon AMD and only buy VIA CPUs because I want the maximum number of players to succeed. It helps all of us. Does that sound like a fanboy to you?
I can prove to you that I'm not a fanboy, because I was absolutely LIVID at AMD for what I considered to be two extremely cynical moves: the creation of the R9 X3D CPUs and the refusal to create an R5 X3D CPU. I'm sure that some of you remember my (seemingly) unhinged rant about that, a rant that was completely on your side:
"This review vindicates everything that I've been saying about AMD's decision to produce 3D versions of R9 APUs instead of R5 APUs. It is literally the stupidest decision that I've ever seen AMD make and it's going to hurt them. I get no joy from this because their choice to make these 3D R9 APUs instead of a 3D R5 APU doesn't only hurt them, it hurts gamers and I am a gamer.
https://www.techspot.com/review/263...:~:text=This review vindicates,plate of crow.
Prosumers won't pay more for an APU that is beaten by the R9-7950X in productivity, even if they also want to game with it, because the R9-7950X already matches the i9-12900K in gaming, which makes it an already-great gaming APU for far less money than the R9-7950X3D.
OTOH, gamers won't buy it because there's no point in paying more for an APU that games worse than one costing significantly less (as the simulated R7-7800X3D showed us). This is especially true when you're paying more money for a bunch of extra cores that will just sit idle and eat power for no reason which is EXACTLY what the R9-7950X3D will do.
The R9-7900X3D is in an even worse position because it has no hope of out-performing either the R9-7950X3D or the R7-7800X3D, since it has fewer cores with 3D cache. It will perform no better in games than the APU that should have been, the R5-7600X3D, but, again, its high price will make it one of the worst processors ever launched by AMD.
I said from the beginning that AMD was utterly insane to create X3D versions of the R9 APUs instead of the R5. When Steve tests the R9-7900X3D, he'll be able to simulate what the R5-7600X3D would have been, the APU that AMD should have made. I said that 3D versions of the R9 APUs would be DOA, and sure enough, here we are.
The R5-7600X3D would've been an APU with no chance of failure. Instead, AMD decided to produce TWO APUs that have no chance of success. Even worse, these two APUs cost them far more money and resources (like TSMC allocation) than the R5-7600X3D would have, making the consequences of this assured failure that much worse. Steve will be able to simulate an R5-7600X3D when he gets his hands on an R9-7900X3D and we'll see what could have been, the APU that would have made AMD the undisputed kings of gaming.
Instead, here we are, EXACTLY where I knew that we'd be. To everyone who gave me flak for saying this, enjoy your plate of crow."
I ask again, Steve: does that sound like a fanboy to you, or an idealist who gets really pissed off at cynical actions?
Don't get the idea that I don't like or admire you, Steve, because I always have and I'm pretty sure that I always will. Your dismissal of the RTX 3060 8GB was pure genius, as was your dismissal of the RX 6500 XT (I still don't know what the "XT" is for on that card). Your comparisons of the RX 5700 XT with the RTX 2060, 2060 Super and 2070 were as fair and objective as could be. Your review of the HD 4870 is what convinced me to buy one as my first Radeon card. Seeing an article from you that was so non-objective that it seemed, from the beginning, like you had an axe to grind was just shocking to me because I know that you're so much better than that.
Not everything I say about Radeon is positive either. I was all over them because they took what should have been the RX 7800 XT, renamed it the RX 7900 XT and jacked up the price. They tried to obfuscate it with the XTX suffix on the RX 7900 XTX, and I said that while I believe nVidia deserved the crap they received for trying to pass off the RTX 4070 Ti as the "RTX 4080 12GB", I didn't think it was fair that AMD got a pass for their own cynical nomenclature. They should have been shamed in the exact same way, but I understand why that didn't really happen. No matter what crap AMD pulls, Intel and/or nVidia will never fail to do something far more egregious, which makes everyone forget about AMD's (relatively) minor shenanigans and transgressions.
I'm fully aware that AMD is not anyone's friend, but for whatever reason, it cannot be denied that they have been, by far, the least evil of the three. I'm ethically driven to use their products, but not because I'm a fan of them; it's because I hate them the least. I'm also driven to get the most for my dollars while still getting a performance level that's good, or at least good enough, and historically, AMD has been the one to provide that the most.
Hell, I got my FX-8350 for only $170 CAD and it lasted me five years. I know, based on your pretty cool (and funny) dartboard, that you hate FX for some reason (probably because AMD over-hyped it), but it served me flawlessly between 2012 and 2017. Was it good? Maybe not, but it was "good enough" and it was far cheaper than Sandy Bridge or its descendants. I was also on AMD's side in the "How many cores in an FX-8350?" question. It wasn't because I liked AMD; as much as I DID want to stick it to Intel, being dishonest would only make Intel look good and me look bad.
Steve, I'm sure that you remember this. Before the advent of the Intel 80486DX, CPUs had no built-in FPU; if you wanted one, you had to buy a "Math Co-Processor" based on Intel's x87 FPU architecture. That means the CPU core had already been defined, many years prior, as an ALU. It also means that, objectively, AMD didn't lie, and so I had to take their side. If they HAD lied, I would've been pissing all over them like I did with the R9 X3D CPUs. I had no emotional investment in it whatsoever; it was simply what I saw as being correct, nothing more. I was defending the established definition of the CPU core as an ALU, I wasn't defending AMD.
Tell me Steve, does that sound like a fanboy to you?