So doing the math, the AMD card is quite a bit faster than the 4090, at least going by the Overwatch 2 benchmark we can see for both. It's also much cheaper. It probably does worse in ray tracing, but who cares?
4090 = 500 max FPS
AMD = 600 max FPS (in this post)
So we're looking at a 100+ FPS difference in one game?
It would be nice but I seriously doubt that. If it were faster than the RTX 4090, I'm sure that the people on stage wouldn't have been able to shut up about it. Instead, they were extremely coy and only showed graphs with FSR enabled. Quite frankly, I thought that this was the
absolute worst AMD product reveal that I've ever seen. It reminded me of just how slimy Intel product releases are. It wasn't nearly as bad, but it was bad nonetheless.
Where do I start?
Tim, or whoever writes for him, is a master at using words and sentences to nudge people down his desired path, which of course is to push consumers towards Nvidia.
Yep, I definitely picked up on his serious lack of enthusiasm.
Every possible plus from AMD is dismissed, especially if it paints Nvidia in a bad light.
Every claim made by AMD was doubted, even though AMD has proven over and over that they don't lie or over-promise in their presentations, unlike Intel and Nvidia.
You're not wrong.
He only referred to the AMD GPUs by model, yet used every possible descriptor (when convenient, of course) when mentioning Nvidia (GPU chip name, GPU codename, GPU brand and model, etc.).
He threw a couple of bones here and there, but in the end, Nvidia is king, it's perfect, and this announcement is another failure on AMD's part.
Yeah, I was rather astonished by Tim's whole attitude. It was like he didn't even want to be doing this. On one hand, I can't blame him for being tired considering the time of night it was in Australia, but on the other hand, I've seen many of his reactions to reveals that occurred at exactly the same time of night where he was actually enthused. It was definitely a bad look for him.
Anyways, my take.
RT is simply stupid, period.
It requires too many resources for a damned shadow or a reflection in a puddle.
Agreed. I couldn't tell you what the shadows are like in ANY game that I've ever played because I don't look at them. There's always something way more important on screen to keep my focus.
I personally believe that we are at least two more generations away from decent hardware, and the same goes for games, of which fewer than 50 currently exist.
I'm honestly not sure if it will ever make that much difference to the enjoyment of games. It's basically a case of "Let's improve everything that gamers don't look at or care about!"
Speaking of games, why is so much effort placed on hyping the very few existing RT games when there are thousands of other PC games that many of us haven't played yet?
Because nVidia has convinced the noobs that this is what they want and once convinced, no amount of intelligent discourse can change their feeble minds. Have you ever tried to use logic to show a religious person just how insane their beliefs are? It's the same thing.
I know; it paints Nvidia as superior and this as yet another failure for AMD.
Only to clueless people and people
pretending to have a clue. Those who have a clue know better.
The estimated power consumption is great relative to the performance increase, as is the reduced heat it will produce.
They only cared about heat and power consumption when AMD products were less efficient.
DisplayPort 2.1 support is good to have (but of course dismissed), and that USB-C port is intriguing.
Well, I'm kind of dismissive of that DisplayPort as well because if these cards are unable to out-perform the RTX 4090, then the DisplayPort version is completely irrelevant. This would mean that the RTX 4090 can out-perform the RX 7900 XTX
despite having the inferior DisplayPort.
The naming is stupid; these are simply 7800 and 7900 GPUs, with no need for the moronic use of XT and XTX.
Nope. The RX 7800 XT is a different card entirely and will be A LOT less expensive. Trust me, I know ATi nomenclature.
Pricing would've been great if they were 150 bucks cheaper, but since the media, writers, and Tubers are OK with $1,700 and $2,000 GPUs, I guess we shouldn't complain.
I think that you should read my first post in this thread. Those are halo products and both are priced $100 less than the last generation. If this is their method, the RX 7800 XT will only cost $550.
Well, hopefully Steve's review will be fair and unbiased as always, and then we can see how they really perform.
Steve's usually good that way but after Tim's reaction, I'm starting to wonder what's going on.
Those interested in AMD's GPU chiplet strategy should check out AdoredTV on YouTube. Jim's latest take there is (as always) detailed and insightful.
AFAIK, he's the one who first leaked and explained AMD's chiplet strategy for CPUs.
Jim's the greatest investigative tech journalist since Charlie Demerjian and is hands-down the greatest investigative TechTuber that I've ever seen. I actually posted his video in a few places already but here's the vid again:
The "I'm a Mac" channel is actually pretty good too.
But, but, but... Ray Tracing!
Another detailed and informative article, but you don't need to be so neutral. Nvidia must drop prices - that's the big takeaway from this.
If you saw his video, he was far from neutral. He was definitely annoyed and had a strong anti-AMD attitude the whole time. I'd never seen Tim be negative like this before and it made me wonder what else is going on. Funnily enough, Hardware Canucks (who usually have a strong Intel and nVidia bias) may have been AMD's biggest cheerleader this time around:
$500+ cards are out of my league, so I'm waiting for the rest of the SKUs. I'd give this MCM design a few months after launch to see the issues before jumping in. I'm not a beta tester.
You'll be looking at an RX 7700 XT or RX 7800 if I've figured out AMD's pricing structure correctly. It appears they're trying to bring prices back down to Earth.