AMD Radeon RX 6000 series graphics cards revealed, feature double the performance of the...

You may see a 10 to 20% improvement in Nvidia RTX 3000 series cards in the next few months, once they update drivers and start using an AMD Ryzen 9 5900X or similar CPU (or next-gen Intel when they get around to it). Biggest improvements for the RTX 3090. The figures from both GPU families will be superseded as time goes on. Performance at launch is not the whole picture.
Nvidia surely has something in mind; I don't believe they don't have a plan B. But at the same time I think AMD has not spilled all the beans yet, either. Some rumors (bear in mind: rumors or leaks) point to good OC potential. I hope this is true. And I think AMD may have some other surprises, too. An unlocked 6900 XT with GDDR6X memory and a wider bus? Maybe. Because as we all saw, AMD kept its highest-end card within a 300W TDP limit even though the competition has 50W more headroom. We will wait and see.
 
You may see a 10 to 20% improvement in Nvidia RTX 3000 series cards in the next few months, once they update drivers and start using an AMD Ryzen 9 5900X or similar CPU (or next-gen Intel when they get around to it). Biggest improvements for the RTX 3090. The figures from both GPU families will be superseded as time goes on. Performance at launch is not the whole picture.

That kind of performance jump will only be possible through individual game optimizations, not across the board as you suggest. All the 3000 series cards on the market, and any upcoming cards, are designed around a specific power budget. Any increase in performance would likely come with a corresponding increase in power consumption. You can't expect a 20% increase to the card's base performance without causing issues with many of the cheaper models. Not to mention the danger of arbitrarily increasing power consumption on end users' systems, which can render them inoperable. This is especially important given Ampere's already high power consumption. A 3090 would consume 463W, which is insane.
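Just to put rough numbers on that, here's a quick back-of-the-envelope sketch (my own assumptions, nothing from Nvidia) of how chasing a ~20% clock bump blows up board power once voltage has to rise along with frequency. The 350W figure is the 3090's stock board power; the scaling exponents are assumptions, and the 463W above presumably comes from a similar guesstimate with its own assumptions.

```python
# Back-of-the-envelope sketch only: dynamic power roughly follows P ~ f * V^2,
# so if voltage has to climb along with frequency near the top of the V/f curve,
# power grows much faster than performance. Numbers below are assumptions.

STOCK_POWER_W = 350   # RTX 3090 reference board power
CLOCK_BUMP = 0.20     # hypothetical +20% clock to chase ~+20% performance

def scaled_power(stock_w: float, bump: float, voltage_vs_clock: float) -> float:
    """P ~ f * V^2; if V scales as f^k (k = voltage_vs_clock), P scales as f^(1 + 2k)."""
    return stock_w * (1 + bump) ** (1 + 2 * voltage_vs_clock)

print(f"no extra voltage needed:    {scaled_power(STOCK_POWER_W, CLOCK_BUMP, 0.0):.0f} W")  # ~420 W
print(f"voltage rises half as fast: {scaled_power(STOCK_POWER_W, CLOCK_BUMP, 0.5):.0f} W")  # ~504 W
print(f"voltage tracks clock 1:1:   {scaled_power(STOCK_POWER_W, CLOCK_BUMP, 1.0):.0f} W")  # ~605 W
```

Whatever exponent you pick, the point stands: a free 20% isn't sitting in the drivers; it would have to come out of the power budget.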

Just for comparison, the 7970 only gained 11% over 7 years (most of which came from games utilizing more compute), and that was an extremely forward-looking architecture. In fact, it's the card that spawned the whole "fine wine" meme.

I'm not holding my breath for Nvidia to pull out some magic key that unlocks the true potential of Ampere; at this stage, Nvidia's hands are tied with regard to cards that are already released. Single-game optimizations are about the best you can hope for.
 
You may see a 10 to 20% improvement in Nvidia RTX 3000 series cards in the next few months, once they update drivers and start using an AMD Ryzen 9 5900X or similar CPU (or next-gen Intel when they get around to it). Biggest improvements for the RTX 3090. The figures from both GPU families will be superseded as time goes on. Performance at launch is not the whole picture.
Improvements that big don't happen with driver updates across the board. They might release updates for some individual titles (usually new titles). But the same can be said about AMD too (actually it's AMD that has the opportunity to improve the drivers the most :D)
 
That was a long time ago so I can't imagine what he thinks about them now.
Well, $20 says his impression of them isn't any better today. :D
You may see a 10 to 20% improvement in Nvidia RTX 3000 series cards in the next few months, once they update drivers
You could say the same thing about the RX 6000 series cards. Driver updates aren't just an nVidia thing. Besides, I think that in the next few months, they'll be completely focused on actually having cards to sell. Drivers without cards to drive are just useless bunches of code. LOL
 
Nvidia surely has something in mind; I don't believe they don't have a plan B. But at the same time I think AMD has not spilled all the beans yet, either. Some rumors (bear in mind: rumors or leaks) point to good OC potential. I hope this is true. And I think AMD may have some other surprises, too. An unlocked 6900 XT with GDDR6X memory and a wider bus? Maybe. Because as we all saw, AMD kept its highest-end card within a 300W TDP limit even though the competition has 50W more headroom. We will wait and see.
Maybe they have a liquid-cooled and superclocked model called the RX 6900 XTX. Adding an "X" to signify liquid cooling is something they've done before, as with the Fury and Fury X.
 
True, but it should be noted that the 6800 comparisons had Rage mode enabled (not sure why they did that, though; it was already quite well ahead of the 2080 Ti / 3070), but they stated it on their slide. Transparent and honest is always nice.
That wasn't Rage mode, that was "Smart Access Memory" mode. Rage mode was only turned on for the RX 6900 XT.
 
Seems like AMD went stealthy and took Nvidia by surprise.

How come the 3090 has such high power consumption? It seemed like Nvidia always opted for low consumption. It does mean they knew that Navi would scale pretty well. It's a bit uncanny how each company seems to know what the other is doing. It also seems like a vicious maneuver on AMD's part to put out such a good offer on price, compared to Nvidia, in such a short period. But since Nvidia has done this to AMD many times... it seems like fair play.

Let's see if devs make good use of those 26 billion transistors. Games seem to do well even at 4K on these.
 
My perception is the opposite. The 3080 and 3090 have GDDR6X, roughly doubling the actual bandwidth to the video RAM. A larger cache can certainly help improve memory access, but it won't always take the place of actual bandwidth to memory.
So the 6800, up against the 3070, doesn't have this question mark, because they both have GDDR6, and the 6800 has a lot more memory.
So while the 6800 has a slightly higher price, I'm more willing to accept that it's genuinely better than the 3070 than I am with AMD's other two cards compared to their NVIDIA counterparts. The 3090 may cost more, but it has more memory than the 6900 XT, and AMD didn't even try to claim the 6900 XT could game in 8K.
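For what it's worth, here's a rough sketch of the effective-bandwidth argument behind Infinity Cache. The cache bandwidth and hit rate below are my own placeholder assumptions, not AMD figures; only the raw GDDR numbers are spec-sheet values.

```python
# Rough illustration only: a big on-die cache raises *effective* bandwidth in
# proportion to how often it's hit. Cache bandwidth and hit rate below are
# placeholder assumptions; the raw DRAM figures are from the spec sheets.

RTX_3080_GDDR6X_GBPS = 760    # 320-bit bus @ 19 Gbps
RX_6800XT_GDDR6_GBPS = 512    # 256-bit bus @ 16 Gbps
INFINITY_CACHE_GBPS = 1600    # assumed on-die cache bandwidth (illustrative)
HIT_RATE = 0.55               # assumed; real hit rates vary by game/resolution

def effective_bandwidth(dram_gbps: float, cache_gbps: float, hit_rate: float) -> float:
    """Average bandwidth seen by the GPU if `hit_rate` of traffic is served from cache."""
    return hit_rate * cache_gbps + (1 - hit_rate) * dram_gbps

print(f"3080 raw GDDR6X:            {RTX_3080_GDDR6X_GBPS} GB/s")
print(f"6800 XT raw GDDR6:          {RX_6800XT_GDDR6_GBPS} GB/s")
print(f"6800 XT effective w/ cache: {effective_bandwidth(RX_6800XT_GDDR6_GBPS, INFINITY_CACHE_GBPS, HIT_RATE):.0f} GB/s")
```

The whole scheme lives or dies on the hit rate, which is exactly the question mark: when the working set spills out of the cache (say, at very high resolutions), you're back to the raw GDDR6 numbers.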
Not that I think the 3090 is that good a buy, or that 8K gaming is for anyone in his right mind.
I agree, and I think that's why 8K gaming wasn't mentioned during the event. When Jensen mentioned 8K gaming, I remember thinking to myself, "What kind of meaningless BS are you slinging this time?" because I only know of TWO people in the universe who have an 8K TV, and they're both TechTubers. I don't even know of any games that support 8K, and using DLSS like Linus did is NOT real 8K. I thought Steve Burke did a great video on the RTX 3090. I was laughing so damn hard.
 
Seems like AMD went stealthy and took Nvidia by surprise.
Do you remember the "Red October" meme that I made? That's why I made it. Well, that and the fact that Team Red won in both CPUs and GPUs for the first time ever during the month of October. "Red October" just popped into my head and I thought "and details are as hard to find as a nuclear submarine" sooo...
[Red October meme image]

Of course, then Sean Connery passed away and I was like "Damn, bad timing!".

RIP Captain Ramius
 
Rage mode?

Is that when your card's fans are so loud you can't take it and you freak TF out?

Vega 64 blower had rage mode for sure

Actually no, Rage mode just allows the boost clocks to be retained for longer (which is likely achieved via undervolting - that used to do the trick).
 