It really doesn't matter if anyone cares about playing MSFS. It's an interesting game to benchmark because it's so taxing on the system.
Remember when the rumours were that Big Navi would be on par with the 2080 Super? Lol.
> On an AMD X370 chipset motherboard with PCIe 3.0 slots, would it be pointless to install a PCIe 4.0 generation video card?

Nope. PCIe 3.0 is just fine.
I seem to recall the rumours were that it was going to be better than a 2080 Ti, and so it appears. I expected their best to slot in between a 3070 and a 3080; perhaps with Rage Mode and Smart Access Memory both off it might still be that way.
Still, I wouldn't be so fast with the victory lap until a broad range of reviewers get the cards onto test benches, toggle the AMD-only Smart Access Memory (SAM) feature and the "one-click overclocking" Rage Mode on and off, and run them on Intel and older AMD CPUs.
The selection of games pretty much matched what I thought they would use, with the exceptions of Doom Eternal and Wolfenstein: Youngblood. Kudos for picking games that the 5700 XT didn't perform as well in. But you can go look at recent reviews, see how the 5700 XT stacked up against the 2070 Super and 2080, and see why they picked games like Battlefield V, Gears 5, Call of Duty: MW and the like. It seems like those game engines like the RDNA architecture. So it will be interesting to see what games the reviewers pick, as well as how the different CPU platforms run these cards.
> Use the Pro driver instead of Adrenalin.

It will be interesting to see how AMD drivers hold up with this new release. At this stage in GPU performance I would consider paying the premium simply to be on Nvidia's driver platform and software suite, though admittedly it's been a few years since I threw in the towel with AMD.
Considering the 6800 XT was supposedly benched with neither Rage Mode nor SAM enabled and was still well matched to a 3080, I would expect the 6900 XT to be slightly slower than a 3090 with no features enabled.
And frankly, "Rage Mode" is just allowing the 6900 XT to try to consume as much power as a stock 3090.
> Wait, what?!

Hope so. I want AMD to knock it out of the park so I can cancel my 3090 preorder.
> Remember when the rumours were that Big Navi would be on par with the 2080 Super? Lol.

Lmao.
> Nope. PCIe 3.0 is just fine.

Something did come up a while ago, though: we were warned about an AMD Radeon card that was locked to x8 speed, which would be slower on a board with a PCI Express 3.0 slot than on a board with a PCI Express 4.0 slot. Does anyone remember that?
PCIe 4.0 vs. PCIe 3.0 GPU Benchmark (www.techspot.com)
> Even though AMD calls the 6900 XT a "300W" card, they call for a 750 W PSU for the 6800 XT and an 850 W PSU for the 6900 XT? I loved the "it's too hot" and "it consumes too much power" comments from the AMD crew before the new cards were revealed; that was fun. Now we need some independent tests to see what is going on under the hood.

I imagine the PSU recommendation difference between the RX 6800 XT and RX 6900 XT is to make sure the latter can be run with "Rage Mode" enabled without issue. At stock, I imagine power draw isn't going to be THAT different between the two cards. You'll probably still be able to get away with a good 650 W PSU running an RX 6900 XT at stock settings.
I'll wait and see the down-under Steve and the right-side-up Steve reviews, and throw in Guru3D, TPU, Hardware Canucks, and some others, before I hand anyone the crown.
This all should be fun. I don't envy the reviewers having to set up three test benches: a 5000-series AMD CPU to test SAM (and PCIe 4.0), an Intel platform (PCIe 3.0), and a 3000-series AMD platform (PCIe 4.0, no SAM possible), just to give people a full dive into the various configurations at play now. Maybe they just won't test the old AMD CPUs? But I would be interested to see, because there are a lot of folks out there with those installed.
> Nope. PCIe 3.0 is just fine.

This is only for sure true in a pre-Smart Access Memory world. With SAM, PCIe bandwidth could have a very notable impact on the performance gain it can provide, because it provides a faster connection between the CPU and the GPU's VRAM. Though this is more likely to start making a real difference down the line, when developers actually start optimizing for the feature (in addition to things like DirectStorage).
> Something did come up a while ago, though: we were warned about an AMD Radeon card that was locked to x8 speed, which would be slower on a board with a PCI Express 3.0 slot than on a board with a PCI Express 4.0 slot. Does anyone remember that?

That was the 4GB RX 5500 XT. The reason for the performance gain is that it had only 8 PCIe lanes and was regularly running out of VRAM, so PCIe 4.0 helped it shuffle data back and forth into that limited VRAM pool faster. The 8GB model didn't see the same benefits, as games could actually fit all their assets in VRAM at once.
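For anyone curious why the x8 limitation mattered, a quick back-of-envelope calculation shows the link bandwidth at stake. This is just a sketch from the published PCIe line rates (8 GT/s for Gen3, 16 GT/s for Gen4, both using 128b/130b encoding), not a measured benchmark:

```python
# Rough PCIe link bandwidth estimate: transfers/s per lane, times
# 128b/130b encoding efficiency, converted to bytes, times lane count.
def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    rates_gts = {3: 8.0, 4: 16.0}  # giga-transfers per second, per lane
    encoding = 128 / 130           # 128b/130b line-code overhead
    return rates_gts[gen] * encoding / 8 * lanes  # GB/s

for gen in (3, 4):
    print(f"PCIe {gen}.0 x8: {pcie_bandwidth_gbps(gen, 8):.2f} GB/s")
```

So an x8 card gets roughly 7.9 GB/s on a Gen3 slot versus roughly 15.8 GB/s on Gen4; that doubled link speed is exactly what helped the 4GB card when it spilled out of VRAM.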
> Only a handful of games use Ray tracing. https://www.digitaltrends.com/computing/games-support-nvidia-ray-tracing/

Yeah, 34 games is "a handful", and more are coming, among them the game most people are going to buy cards for: Cyberpunk 2077. Also, the best shooters support RT: CoD: MW and Cold War, BFV, Metro: Exodus, Fortnite.
> RT drags even Nvidia's own cards down so hard that the tech remains a question mark itself on how to progress. Even Nvidia might need a few more generations of polish on its own introduction to play at a respectable framerate without lowering the resolution or details.
> I wouldn't call RT the thing to look for until quite some time to come, IF it ever takes off.
> Like VR, it remains a niche area.

RT is a niche like VR, really? I strongly disagree. It's a step closer to what we wanted for so long: photorealism. Once you make cards fast enough to drive 80-100 fps in 4K, there's really little incentive to go 8K or beyond. It's like a camera with 1000 megapixels: the image is bigger, but still sucks. Improving graphics quality is the future. And I'm not saying it like they said "VR is the future"; I'm saying it because this has happened since the beginning of games.
> On an AMD X370 chipset motherboard with PCIe 3.0 slots, would it be pointless to install a PCIe 4.0 generation video card?

No, the card would just run in 3.0 mode.
> RT is a niche like VR, really? I strongly disagree. It's a step closer to what we wanted for so long: photorealism. Once you make cards fast enough to drive 80-100 fps in 4K, there's really little incentive to go 8K or beyond. It's like a camera with 1000 megapixels: the image is bigger, but still sucks. Improving graphics quality is the future. And I'm not saying it like they said "VR is the future"; I'm saying it because this has happened since the beginning of games.

I also agree that RT tech is still kind of immature due to the big hit on performance (compared to what it offers). It's getting better, though. And calm down, man, nobody calls it an obsolete tech; it just isn't yet practical in many of today's conditions. No one will want to activate RT in their competitive shooter, for example. You'd activate it only if your card can produce high enough FPS, during single player. Even then, most people would prefer higher FPS for fluid gameplay.
> Yeah, 34 games is "a handful", and more are coming, among them the game most people are going to buy cards for: Cyberpunk 2077. Also, the best shooters support RT: CoD: MW and Cold War, BFV, Metro: Exodus, Fortnite.

PS: the most anticipated game is Cyberpunk 2077. Do you want to buy a top card that will take advantage of all it has to offer, or just more FPS? Because then I can argue you should play it on medium settings and buy a regular card...
> Yeah, 34 games is "a handful", and more are coming, among them the game most people are going to buy cards for: Cyberpunk 2077. Also, the best shooters support RT: CoD: MW and Cold War, BFV, Metro: Exodus, Fortnite.

You're counting games that are not even out yet. Read the article again.
> I also agree that RT tech is still kind of immature due to the big hit on performance (compared to what it offers). It's getting better, though. And calm down, man, nobody calls it an obsolete tech; it just isn't yet practical in many of today's conditions. No one will want to activate RT in their competitive shooter, for example. You'd activate it only if your card can produce high enough FPS, during single player. Even then, most people would prefer higher FPS for fluid gameplay.
> And no, gaming will advance further beyond 4K and even 8K.

I never said gaming will be stuck on 4K; it's just not that practical beyond. I almost can't tell the difference between 2K and 4K on a 32-inch monitor. To really see 8K you'll probably need a 50+ inch monitor, which is lame because it's as big as or bigger than your average TV. Anything less and you won't be able to play without squinting.
> Remember when HairWorks was the next big thing? Lol

Comparing apples to oranges much?