Latest Steam survey shows AMD rebounding, an excellent month for Ampere, and a Windows...

It has been a long time since there were positive reasons to choose an AMD GPU.
Picking out of ideology or being cheap only goes so far. There's always these drawbacks that you have to justify to yourself - and to others, loudly. The AMD fanbase remains pretty toxic.
The truth is that most people are simply plain *****s. Buying decisions are not based on any facts, just brand loyalty. They gladly pay more for a worse product as long as the brand is right. And that's just plain stupid.

A good example is GPU power consumption. Previously, OEM machines contained Nvidia cards because, according to fanboys, Nvidia cards had lower power consumption and there was no need for beefier power supplies and cooling. Now that AMD cards have much lower power consumption, OEM machines still contain Nvidia cards. Why? Because OEMs are just stupid and don't realize that things DO change. The decision to use Nvidia cards is not based on facts but on brand only.
 
Now that AMD cards have much lower power consumption, OEM machines still contain Nvidia cards.

See, you do yourself no favors with this random talking point.

Power consumption matters in portable devices, where the 6800M is no more efficient than Ampere. And then there are the missing features.
 
It has been a long time since there were positive reasons to choose an AMD GPU.
Picking out of ideology or being cheap only goes so far. There's always these drawbacks that you have to justify to yourself - and to others, loudly. The AMD fanbase remains pretty toxic.
It depends on the prices in your country and the games you play. I don't remember there always being drawbacks. Maybe now with DLSS, but in general AMD had very competitive GPUs when it came to perf/$.

For example, higher power draw was mostly the only drawback when picking the 580 over the 1060 (6GB version). Price and average performance were better than the 1060's.

It was the high end where AMD had a lot of problems.

AMD fans can be "toxic", but in general you just have very passionate fans who are tired of the shenanigans Intel and Nvidia put them through when AMD wasn't as competitive.
 
See, you do yourself no favors with this random talking point.

Power consumption matters in portable devices, where the 6800M is no more efficient than Ampere. And then there are the missing features.
🤦‍♂️

OEM desktops are made as cheaply as possible. Higher power consumption on the graphics card means:

- More expensive motherboard (PCIe slot power delivery)
- More expensive case and cooling solution (more heat)
- More expensive power supply (more power consumption and more connectors)

Yes, power consumption matters on desktops too.

When it comes to laptops, the large majority come with an Intel GPU that has far fewer features than AMD GPUs.

As usual, Nvidia fans move goalposts when old arguments no longer apply.

It depends on the prices in your country and the games you play. I don't remember there always being drawbacks. Maybe now with DLSS, but in general AMD had very competitive GPUs when it came to perf/$.
Tbh this DLSS vs FSR fight reminds me of G-Sync vs FreeSync. Nvidia fans touted G-Sync as superior, but eventually Nvidia decided to support Adaptive-Sync/FreeSync and nobody cares about G-Sync anymore.

The same will happen with DLSS. FSR will get wide support and soon nobody will care about DLSS.
 
🤦‍♂️

OEM desktops are made as cheaply as possible. Higher power consumption on the graphics card means:

- More expensive motherboard (PCIe slot power delivery)
- More expensive case and cooling solution (more heat)
- More expensive power supply (more power consumption and more connectors)

Yes, power consumption matters on desktops too.

When it comes to laptops, the large majority come with an Intel GPU that has far fewer features than AMD GPUs.

As usual, Nvidia fans move goalposts when old arguments no longer apply.


Tbh this DLSS vs XFR reminds me of G-sync vs Freesync. Nvidia fans touted G-Sync as superior but eventually Nvidia decided to support Adaptive sync/Freesync and nobody cares about G-Sync anymore.

Same will happen with DLSS. XFR will get wide support and soon nobody cares about DLSS.
I think you meant to say FSR.
 
🤦‍♂️

Tbh this DLSS vs FSR fight reminds me of G-Sync vs FreeSync. Nvidia fans touted G-Sync as superior, but eventually Nvidia decided to support Adaptive-Sync/FreeSync and nobody cares about G-Sync anymore.

The same will happen with DLSS. FSR will get wide support and soon nobody will care about DLSS.

Yeah, FSR reminds me of G-Sync and FreeSync too.
And AMD's Ray Tracing support. And their video encoding. And their compute stack.

Knock-offs that are nowhere near as good as the market leading solutions.
But hey, at least they are cheaper. That inspires great passion in some types of customers.
 
Yeah, FSR reminds me of G-Sync and Freesync too.
And AMD's Ray Tracing support. And their video encoding. And their compute stack.

Knock-offs that are nowhere near as good as the market leading solutions.
But hey, at least they are cheaper. That inspires great passion in some types of customers.
The video encoder is useless for 95% of customers. Ray tracing on current cards is way too slow, so it really doesn't matter whether cards have RT at all. It's only for development purposes.

Those "knock-offs" happen to be much simpler to use and cheaper than the "market leading" solutions. Meaning they will also gain market share much faster.

Later this year the number of FSR-supported games is likely to surpass the number of DLSS-supported games. FSR released two months ago, DLSS 2.5 years ago. The FSR adoption rate is around 7 games per month; DLSS is around 1 game per month. Who cares about a "market leading solution" that is pretty much useless because of no support?
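The "later this year" claim is just arithmetic on the rates quoted above. A quick sketch, using the post's own ~7 and ~1 games/month estimates (which are the poster's figures, not verified data):

```python
# Projected supported-game counts, using the rates claimed above:
# FSR: ~7 games/month, launched ~2 months ago; DLSS: ~1 game/month over ~30 months.
fsr_games, fsr_rate = 2 * 7, 7      # ~14 FSR titles so far
dlss_games, dlss_rate = 30 * 1, 1   # ~30 DLSS titles so far

months = 0
while fsr_games <= dlss_games:
    months += 1
    fsr_games += fsr_rate
    dlss_games += dlss_rate

print(months)  # → 3 months until FSR overtakes DLSS at these rates
```

At the stated rates the counts cross in about three months, which is where the "later this year" prediction comes from.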
 
Yeah, FSR reminds me of G-Sync and FreeSync too.
And AMD's Ray Tracing support. And their video encoding. And their compute stack.

Knock-offs that are nowhere near as good as the market leading solutions.
But hey, at least they are cheaper. That inspires great passion in some types of customers.
The compute stack is fine. I'm assuming you are referring to the fact that more applications have good CUDA support. Even so, the Radeon Pro W6800, for example, is no slouch in many applications (it gained a lot compared to the previous gen).

I also wish the encoder on AMD was better. But there are some third party solutions for that.
 
The compute stack is fine. I'm assuming you are referring to the fact that more applications have good CUDA support. Even so, the Radeon Pro W6800, for example, is no slouch in many applications (it gained a lot compared to the previous gen).

I also wish the encoder on AMD was better. But there are some third party solutions for that.
Also, AMD has no interest in areas where brainwashed people buy the brand instead of speed/features. Like laptops, servers and, in this case, the "professional" GPU segment. No matter if AMD is better at everything, many still buy "the other brand". We have seen that before.
 
Also, AMD has no interest in areas where brainwashed people buy the brand instead of speed/features. Like laptops, servers and, in this case, the "professional" GPU segment. No matter if AMD is better at everything, many still buy "the other brand". We have seen that before.
Well, to be fair, reputation is important. It gives people peace of mind that what they bought is good, even if there might be better options for their specific needs.
 
The video encoder is useless for 95% of customers. Ray tracing on current cards is way too slow, so it really doesn't matter whether cards have RT at all. It's only for development purposes.

Those "knock-offs" happen to be much simpler to use and cheaper than the "market leading" solutions. Meaning they will also gain market share much faster.

Later this year the number of FSR-supported games is likely to surpass the number of DLSS-supported games. FSR released two months ago, DLSS 2.5 years ago. The FSR adoption rate is around 7 games per month; DLSS is around 1 game per month. Who cares about a "market leading solution" that is pretty much useless because of no support?

Quality > quantity. Basically, games that come with DLSS definitely require it to achieve the highest visuals with ray tracing, for example Control, Metro Exodus Enhanced, Cyberpunk 2077, Doom Eternal, WD Legion, The Medium and The Ascent.

There is literally no RT game that is unplayable on RTX 3000, provided the screen resolution is suited to the GPU's performance (3060 - 1080p, 3070 - 1440p, 3080 - 4K).

So yeah, there is no game that cripples Ampere, but there are plenty for RDNA2, so paying premium prices for the RX 6000 is just dumb; RDNA2 makes sense for a $500-or-less GPU.
 
Knock-offs that are nowhere near as good as the market leading solutions.
But hey, at least they are cheaper. That inspires great passion in some types of customers.
When a consumer posts something like that, it tells you that there is no hope for us.
 
Quality > quantity. Basically, games that come with DLSS definitely require it to achieve the highest visuals with ray tracing, for example Control, Metro Exodus Enhanced, Cyberpunk 2077, Doom Eternal, WD Legion, The Medium and The Ascent.

There is literally no RT game that is unplayable on RTX 3000, provided the screen resolution is suited to the GPU's performance (3060 - 1080p, 3070 - 1440p, 3080 - 4K).

So yeah, there is no game that cripples Ampere, but there are plenty for RDNA2, so paying premium prices for the RX 6000 is just dumb; RDNA2 makes sense for a $500-or-less GPU.

What's stupid is folks buying into the RTing aspect of games and the technology behind it in its current form. Neither company (Nvidia or AMD) is good at it. Give RT technology at least 2 more generations before you decide which company does it better.

If RTing requires the use of DLSS or AMD's equivalent, then something is still seriously lacking with today's gen of GPUs.
 
Quality > quantity. Basically, games that come with DLSS definitely require it to achieve the highest visuals with ray tracing, for example Control, Metro Exodus Enhanced, Cyberpunk 2077, Doom Eternal, WD Legion, The Medium and The Ascent.

There is literally no RT game that is unplayable on RTX 3000, provided the screen resolution is suited to the GPU's performance (3060 - 1080p, 3070 - 1440p, 3080 - 4K).

So yeah, there is no game that cripples Ampere, but there are plenty for RDNA2, so paying premium prices for the RX 6000 is just dumb; RDNA2 makes sense for a $500-or-less GPU.
Yet the 3070/3070 Ti can be crippled by a lack of VRAM. So much for "no compromises" :/

You are also ignoring the normal rendering performance.
 
Quality > quantity. Basically, games that come with DLSS definitely require it to achieve the highest visuals with ray tracing, for example Control, Metro Exodus Enhanced, Cyberpunk 2077, Doom Eternal, WD Legion, The Medium and The Ascent.
Games that have ray tracing support also require DLSS to be playable. Hmm. Sounds like Nvidia marketing to me ;)
There is literally no RT game that is unplayable on RTX 3000, provided the screen resolution is suited to the GPU's performance (3060 - 1080p, 3070 - 1440p, 3080 - 4K).

So yeah, there is no game that cripples Ampere, but there are plenty for RDNA2, so paying premium prices for the RX 6000 is just dumb; RDNA2 makes sense for a $500-or-less GPU.
Cyberpunk 2077 is quite unplayable at 1440p with RT even on an RTX 3090. 60 FPS is just not enough when 144 Hz displays are common. And 4K?
Clearly, 4K ray tracing with the Ultra preset is a seriously performance intensive mode with not even the top of the line RTX 3090 delivering a 40 FPS experience. Even with a variable refresh rate display, this is a sluggish experience and honestly you should turn down the settings to get 60 FPS. Like we said earlier, this is really meant for future GPUs with improved levels of performance.
Exactly. Even current games are way too demanding for current cards. RT on current cards is basically useless and only for development purposes.

I would also say that Cyberpunk cripples Ampere very badly. And it's a September 2020 title :D
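The 60 FPS vs 144 Hz disagreement here is really a frame-time budget question. A minimal sketch of the arithmetic (not from the thread itself):

```python
# Frame-time budget: a GPU gets 1000/fps milliseconds to render each frame.
budget_60 = 1000 / 60      # ~16.7 ms per frame at 60 FPS
budget_144 = 1000 / 144    # ~6.9 ms per frame to saturate a 144 Hz panel

# Driving 144 Hz therefore needs 144/60 = 2.4x the performance
# of a card that only just manages 60 FPS in the same scene.
headroom = 144 / 60

print(round(budget_60, 1), round(budget_144, 1), headroom)  # 16.7 6.9 2.4
```

So a title that barely holds 60 FPS with RT is indeed a long way from feeding a 144 Hz display, which is the gap both sides are arguing over.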
 
Yet the 3070/3070 Ti can be crippled by a lack of VRAM. So much for "no compromises" :/

You are also ignoring the normal rendering performance.

Doom Eternal with RT can still run at 100+ FPS on a 3070/3070 Ti at 1440p, what more do you want? 200 FPS at 4K with a $500 GPU :joy:?

Check out the TPU review: the 3070 never dips below 60 FPS at 1440p in any game; maybe CP2077 does, but it has DLSS, so all bases are covered.
[Chart: average per-game FPS, 2560x1440]


What's stupid is folks buying into the RTing aspect of games and the technology behind it in its current form. Neither company (Nvidia or AMD) is good at it. Give RT technology at least 2 more generations before you decide which company does it better.

If RTing requires the use of DLSS or AMD's equivalent, then something is still seriously lacking with today's gen of GPUs.

Sure, let's just ignore every beautifully rendered RT game out right now and over the next 4 years that is perfectly playable on RTX 3000. I guess every gamer should be very pragmatic with how they spend their money :p, by that I mean just stop playing games and start working.

Games that have ray tracing support also require DLSS to be playable. Hmm. Sounds like Nvidia marketing to me ;)

It just means Nvidia got all their bases covered LOL


With 4K DLSS Balanced, which looks very close to 4K native, CP2077 is very playable with RT on the 3080 and above. CP2077 is truly next gen in terms of graphics, so you will not find a better-looking game in the near future. Perhaps RX 8000 owners can enjoy CP2077 in all its glory and change their minds about ray tracing :p .
 
Stop beating around the bush with AMD. AMD doesn't catch up because they are always one step behind Nvidia: the drivers are not as good, they usually lack features that Nvidia has, and the game support isn't as good.
If AMD wants to regain market share they need to do in the graphics space what they did in the CPU space, that is, actually bring products that are better than the competition's, not just similar ones...
I have both a 1060 and a 580, and the ownership ratio between these two clearly shows four out of five people either do not bother about what they are being sold or don't actually know what they think they know.
 
What's stupid is folks buying into the RTing aspect of games and the technology behind it in its current form. Neither company (Nvidia or AMD) are good for it.
Man, I don't get why such a simple concept is so hard to understand.

Give RT technology at least 2 more generations before you decide which company does it better.
Tired of repeating that and getting downvoted in other places for it.
I have both a 1060 and a 580, and the ownership ratio between these two clearly shows four out of five people either do not bother about what they are being sold or don't actually know what they think they know.
If you remember the movie F&F: Tokyo Drift, when the guy in the Viper lists all the cr@p his car does, the answer was "nice, someone read the brochure".

After the RTX 30 series came out, it's the same thing all over again: the drones only know how to repeat the same things their YouTuber overlords keep spewing.

Which of course comes directly from Nvidia's marketing dept.
 
Doom Eternal with RT can still run at 100+ FPS on a 3070/3070 Ti at 1440p, what more do you want? 200 FPS at 4K with a $500 GPU :joy:?

Check out the TPU review: the 3070 never dips below 60 FPS at 1440p in any game; maybe CP2077 does, but it has DLSS, so all bases are covered.
[Chart: average per-game FPS, 2560x1440]
Just like Doom 2016, Doom Eternal runs poorly even when the FPS looks good. There is just something badly wrong with the engine. Similarly, the Call of Duty series has some games that run like a charm and then some that are unplayable. Even when the FPS looks good in a game like Black Ops 3, it runs poorly. The same applies to some poor console ports like Far Cry 3/4/5.


With 4K DLSS Balanced, which looks very close to 4K native, CP2077 is very playable with RT on the 3080 and above. CP2077 is truly next gen in terms of graphics, so you will not find a better-looking game in the near future. Perhaps RX 8000 owners can enjoy CP2077 in all its glory and change their minds about ray tracing :p .
Does not look "very playable" to me. I have a 144 Hz monitor and expect at least 100 FPS. Again, that is an Nvidia-sponsored title and it still runs poorly, even on Nvidia hardware. Perhaps that is the reason.
 
Doom Eternal with RT can still run at 100+ FPS on a 3070/3070 Ti at 1440p, what more do you want? 200 FPS at 4K with a $500 GPU :joy:?

Check out the TPU review: the 3070 never dips below 60 FPS at 1440p in any game; maybe CP2077 does, but it has DLSS, so all bases are covered.
[Chart: average per-game FPS, 2560x1440]




Sure, let's just ignore every beautifully rendered RT game out right now and over the next 4 years that is perfectly playable on RTX 3000. I guess every gamer should be very pragmatic with how they spend their money :p, by that I mean just stop playing games and start working.



It just means Nvidia got all their bases covered LOL


With 4K DLSS Balanced, which looks very close to 4K native, CP2077 is very playable with RT on the 3080 and above. CP2077 is truly next gen in terms of graphics, so you will not find a better-looking game in the near future. Perhaps RX 8000 owners can enjoy CP2077 in all its glory and change their minds about ray tracing :p .
I know all of the results and it's pretty funny that you think the 3070 is just for 1440p. Do you know why you think that? Because it has only 8GB. It's insulting to even think it's normal for such expensive GPUs to have so little VRAM.

FYI I would have snapped up a 3070 very early on, when it was "just" 1.5x the normal price, if it had 12 or 16GB, because it would allow for more complex scenes in Blender and I really wanted such a GPU. But as it stands I'll wait for the next-gen cards. It's such a shame :(

Nvidia's marketing will read like this: "Upgrade to the 4070, it has more VRAM!" - I doubt Nvidia will pull such a stunt a second time, but knowing them there is a chance we might see another reduction in VRAM like with this generation :D ($1000 6GB GPUs, anyone?)
 
I know all of the results and it's pretty funny that you think the 3070 is just for 1440p. Do you know why you think that? Because it has only 8GB. It's insulting to even think it's normal for such expensive GPUs to have so little VRAM.

FYI I would have snapped up a 3070 very early on, when it was "just" 1.5x the normal price, if it had 12 or 16GB, because it would allow for more complex scenes in Blender and I really wanted such a GPU. But as it stands I'll wait for the next-gen cards. It's such a shame :(

Nvidia's marketing will read like this: "Upgrade to the 4070, it has more VRAM!" - I doubt Nvidia will pull such a stunt a second time, but knowing them there is a chance we might see another reduction in VRAM like with this generation :D ($1000 6GB GPUs, anyone?)

The RTX 3070 and RX 6800 are not 4K GPUs; the GPU cores don't have enough horsepower for that.
[Chart: average per-game FPS, 3840x2160]


You are looking at 6 games that run sub-60 FPS on the 16GB RX 6800 at 4K, and they are not even 2020-2021 games.
So yeah, just keep your expectations a bit more realistic; ~$500 GPUs are not meant for 4K (though the 3080/6800 are ~$1000 now :dizzy:).
The DirectStorage API is coming out very soon and would definitely alleviate VRAM requirements, so hang on to those 6/8GB VRAM GPUs for now :laughing:
 
The RTX 3070 and RX 6800 are not 4K GPUs; the GPU cores don't have enough horsepower for that.
[Chart: average per-game FPS, 3840x2160]


You are looking at 6 games that run sub-60 FPS on the 16GB RX 6800 at 4K, and they are not even 2020-2021 games.
So yeah, just keep your expectations a bit more realistic; ~$500 GPUs are not meant for 4K (though the 3080/6800 are ~$1000 now :dizzy:).
The DirectStorage API is coming out very soon and would definitely alleviate VRAM requirements, so hang on to those 6/8GB VRAM GPUs for now :laughing:
I wasn't talking about the 6800, but about the 3070 and its paltry amount of VRAM. And the 6800 is above the 3070 in terms of performance in almost all games; I don't get your point. This is a known fact.


FYI, DirectStorage won't alleviate VRAM requirements, as it doesn't change the amount of data a game has to keep in VRAM; it will just make swapping that data faster (it will make loading different worlds/zones faster). And don't expect the same kind of tight integration as on consoles.

If you are waiting for DirectStorage to become a magical fix for Nvidia lowering the VRAM, then I'm sorry, but it won't be.
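The distinction being drawn here can be modeled trivially: faster I/O shrinks load time, not the resident working set. A toy sketch (all numbers are made up for illustration, not real throughput figures):

```python
# Toy model: loading a 6 GB scene into VRAM over two storage paths.
# A faster path (DirectStorage-style streaming) cuts LOAD TIME,
# but the scene still occupies the same 6 GB of VRAM once resident.
scene_gb = 6.0
legacy_gbps = 1.0   # hypothetical effective throughput, traditional path
fast_gbps = 5.0     # hypothetical effective throughput, direct path

legacy_load_s = scene_gb / legacy_gbps   # 6.0 s to load
fast_load_s = scene_gb / fast_gbps       # 1.2 s to load

print(legacy_load_s, fast_load_s, scene_gb)  # VRAM needed is unchanged
```

Whatever the real throughput numbers turn out to be, the VRAM footprint term never appears in the speed-up, which is the poster's point.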

I consider both cards to be 4K-capable in terms of raw GPU power. But gaming is not my biggest gripe; it's that Nvidia crippled the 3070 with 8GB and made it a non-buy for me and what I wanted it for. Games are not all it can do, and Blender generally runs better on Nvidia cards. I don't want to pay for a 3090 or a Quadro when looking at new GPUs for a hobby.
 
Hey I'm not knocking your dollar store hardware choices.

You do you.
If that is all you can say or gather from me, then I will definitely keep doing me, because I don't have to be as limited as you are.
 
If that is all you can say or gather from me, then I will definitely keep doing me, because I don't have to be as limited as you are.

No, I agree; there's definitely a lot to unpack about the way you seem to be almost solely driven by negative choices, whether it's about Intel, Apple or Nvidia.
 