https://www.techspot.com/review/2099-geforce-rtx-3080/
"Doesn't necessarily future proof the GPU however, as Nvidia paired it with 10GB of VRAM which might prove insufficient in a year or two..."
I didn't mean you specifically, Steve. I know that you're a big fan of the RX 6800 cards. I meant "tech press" as a more general term. I don't want to name names but I think you know which members of the tech press I was referring to. HINT: It wasn't you.
Nevertheless, I think you were too soft on nVidia over the 10GB VRAM buffer. I realise that you're a polite gentleman, and I do like that about you, but people were spending thousands on these cards because they didn't know any better. RT or not, DLSS or not, those cards will be relegated to 1080p gaming sooner rather than later. For that reason alone, I would've advised people to avoid them.
Now, I realise that the performance is good and the quality is definitely good. I've always said that nVidia doesn't make bad products; it's the company itself that I can't stand. This is a perfect example of why I won't use their cards and rarely advise others to. These cards have a built-in Achilles' heel that was glaringly obvious to people who know PC tech. Jensen called this card "The Flagship", but since when does a flagship have less VRAM than the flagships of previous generations?
A card with a built-in Achilles' heel like this shouldn't cost that much money. If they had done this to, say, the RTX 3050 or RTX 3060, it wouldn't have been so bad because people aren't cashing in their life's savings for those cards. I may snicker at people who spent all that money and are now encountering this problem, but it's not because I bear them any malice, quite the contrary. It's because I've become somewhat hopeful that nVidia will suffer the wrath of all the people they screwed over.
I realise that it's probably not going to happen (nVidia has always gotten away with it before), but they've never done something this egregious. Sure, there was the GTX 970 VRAM debacle, but that was just the last 512MB of a 4GB buffer being partitioned onto a much slower path, and it didn't have a huge impact on the card's longevity. In this case, people are going to be stuck at 1080p while watching Radeon owners not have to break the bank a second time because, let's be honest here, the difference between 10GB and 16GB on a video card is MASSIVE.
When buying a video card, you're buying hardware. Sure, there are software packages offered by both companies that can enhance your experience, but software behaviour can always be replicated, improved and updated. Hardware can't. The amount of VRAM that a card is born with is the same amount that it dies with. This is why I believe that the hardware features of a video card far outweigh any software features.
Now, I will concede that not all software features are created equal. I wouldn't be so foolish as to say that ATi's implementation of OpenCL is as good as CUDA, but it still worked and, for most people (who don't use it professionally), it was fine for the odd times they would ever use it. OTOH, PhysX was ultimately defeated by Havok, which proves that Havok was the better solution. I realise that neither OpenCL nor Havok was an AMD creation, but their creators are irrelevant as long as the tech can be used.
DLSS and FSR comparisons have become comedic because, as Tim so rightly pointed out, you have to slow the game down and literally search for differences to make these comparisons. Any quality difference that still exists (if any) is so slight as to be irrelevant. Thus, the DLSS selling point has been successfully nullified by FSR, but the extra 6GB of VRAM can never be nullified.
Did it begin that way? No, initially DLSS was clearly superior, but it is that truth that proves my point most effectively. DLSS was nullified as a feature because FSR improved to the point where DLSS no longer offers any real advantage. In fact, I would say that FSR is superior overall because it doesn't block anyone from using it regardless of what GPU they have, something that I consider significant and worth supporting.
If I'm buying a high-end video card, I don't even need upscaling tech (at least, not at first), and if I ever do, I can be sure that the maker of my card will have a polished solution by the time I need to use it. OTOH, if I'm buying a low-end card, upscaling tech will be more relevant, but my priority there would be the best native performance I can get for my money.
Sure, for a low-end card, DLSS was an advantage, but only in the short term. Where nVidia shot itself in the foot was in the belief that they were somehow justified in trying to charge more for an RTX 3050 than AMD did for an RX 6600 XT. I'm not talking about MSRP here because MSRPs became irrelevant in that generation very quickly. The prices on Radeon cards did fall, but they fell because AMD allowed them to while nVidia didn't.
I honestly don't understand how these two are sometimes considered equivalents when it's so patently obvious that GeForce cards are a terrible buy. Right now, the RTX 3050 costs $280 at Newegg, while the competing RX 6600 XT costs $5 less at $275 despite being a staggering 54% faster. The hardware is what creates that performance delta and no amount of software can mitigate it. Note that these are valid "Sold and Shipped by Newegg" prices, not prices that have been manipulated by scalpers. DLSS cannot even hope to offset that large a performance difference, and that's before taking into account that FSR completely mitigates any advantage that DLSS offers. Thus, the hardware matters.
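To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python. The prices and the 54% figure are the ones quoted above; treating the RTX 3050 as the 1.00 performance baseline is my own framing, and actual FPS will obviously vary by game and settings:

```python
# Value comparison using the Newegg prices quoted above.
# Performance is relative: RTX 3050 = 1.00, RX 6600 XT = 1.54
# (the 54% delta cited). Absolute FPS varies by game and settings.
cards = {
    "RTX 3050":   {"price_usd": 280, "rel_perf": 1.00},
    "RX 6600 XT": {"price_usd": 275, "rel_perf": 1.54},
}

for name, c in cards.items():
    per_dollar = c["rel_perf"] / c["price_usd"]
    print(f"{name}: {per_dollar * 100:.2f} relative performance per $100")

# Output:
# RTX 3050: 0.36 relative performance per $100
# RX 6600 XT: 0.56 relative performance per $100
# That works out to roughly 57% more performance per dollar
# for the RX 6600 XT (1.54/275 vs 1.00/280).
```

However you slice the arithmetic, the Radeon comes out ahead, which is exactly the point about hardware.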
I'm not an AMD fanboy, I just buy Ryzen and Radeon because, in both the CPU and GPU space, AMD is definitely the lesser of two (or now three) evils. For gamers, no software-based advantage that a card has ever lasts longer than a few months. First nVidia had G-Sync, so AMD released FreeSync. Some said that G-Sync was superior, but FreeSync didn't cost anything and was good enough to eliminate screen tearing. Now they're indistinguishable. AMD released Smart Access Memory, so nVidia countered by enabling Resizable BAR. Some would say that SAM is superior to Resizable BAR but, again, the difference isn't enough to matter, so who cares? We have DLSS, which was countered with FSR; again, some say DLSS is better, but the difference is too slight to matter. Now we have this "frame generation" technology (which I find dubious at best) that only works on RTX 40-series cards (how nice of them) and AMD will probably have a response for it in April or May.
As we can see, software advantages are short-term only, while hardware advantages are both short- and long-term. More attention should therefore be paid strictly to the cards themselves, the hardware. Sure, for professionals, the software support matters more, but how many video cards are purchased with professional applications in mind compared to the number purchased for gaming? I'd be surprised if it were more than 0.1% of total sales.
This is where nVidia shifted the narrative to their own advantage. With the exception of the halo level, they compete very badly against AMD, so they come up with these gimmicks to try to justify the extra cost, as if it's some kind of magic that only they can deliver. I gotta hand it to them though, the tactic works on the masses, and the tech press should be telling people about shenanigans from any company that's trying to pull them. Your coverage of the RTX 3060 8GB was nothing short of phenomenal. More articles like that are what this industry needs, not more articles comparing DLSS with FSR.
