Latest Steam survey: RTX 3070 is the month's top performer, AMD edges closer to 30% CPU...

Imagine buying a 3090 for $1,500 (assuming you even can) for ONE game...

Lol - $1,500 is nothing. Go look at what people spend on sim rigs dedicated to one, or at best a small handful of, sim titles. $1,500 doesn’t even cover my controllers for DCS World/IL-2/Elite Dangerous.
 
Nobody will sway you from your deep delusion that RT and DLSS mean nothing. Those are the very reasons why GeForce is absolutely destroying Radeon right now. You are so deeply incorrect about these technologies that there is simply no way you have ever actually used them. It’s quite pathetic, really. However, I have absolutely no doubt in my mind that when Radeon eventually catches up on RT and AI upscaling, you will like it and use it.

But you can hardly disagree with hard numbers, and as you can see from the results in this very survey, AMD sells about one Radeon for every ten GeForce parts that Nvidia sells, at least. The market isn’t stupid. If it were, it would buy Radeon...

I find it hilarious that Nvidia has dedicated - D E D I C A T E D - RT cores to handle RT, and yet without the help of DLSS downgrading the image to a lower resolution (and leaving a motion-blur-like after-effect once the image is upscaled), Nvidia cards take a huge hit to their ray tracing performance. Perhaps DLSS 3.0 will eliminate that blurry after-effect that 2.0 and 1.0 have; albeit 2.0 is certainly much improved over 1.0.

Without help from DLSS, Nvidia suffers a lot of performance loss - not as bad as AMD, but it is very noticeable.

As for AMD, they don't have DLSS, and nothing has been released yet to counter it; their RT performance on the 6800 XT tends to fall between what a 2080 Ti can do and what a 3070 can do.

As it stands, neither company has any kind of bragging rights over a technology as new as RT in games. Give both sides two more generations to hash things out. Until then, anyone touting that Nvidia is better (given that they have dedicated RT cores and still can't deliver high RT performance without the help of DLSS) is jumping the gun.

Two generations out, then you can say company X has better RT performance than company Y... and don't forget that Intel might be part of the equation by then; they could very well be a contender a couple of generations down the road.
 
Yup, agreed: Control is a really good implementation of both RT and DLSS. So much better than earlier games like Battlefield V and Shadow of the Tomb Raider.

I think RT started out as all hype and no substance, but it’s been 2.5 years since we got the first underwhelming RT/DLSS implementations in games, and some of the later implementations have really impressed me. Metro Exodus, Minecraft, Death Stranding and Cyberpunk are all very good implementations of RTX tech in my opinion.
The only problem is the massive performance hit the card has to deal with. With the resolution trick DLSS plays, the performance hit seems to disappear, but it is still there. I prefer to play at native resolution nonetheless, without any technological trick affecting image quality. All in all, RT is not for now but for the future, when more powerful cards arrive.
 
The most remarkable stat on the charts was the ATI Radeon X550: 16.67% / 10.26% / 10.81% / 9.38% / 37.93% (+28.55%). It climbed 28.55 percentage points from February to March. People (must be the Chinese) are scraping the bottom.
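For what it's worth, that +28.55% is a change in percentage points, not a relative increase; a minimal sketch of the arithmetic, using the February and March shares from the row quoted above:

```python
# Feb and March survey shares (in %) for the ATI Radeon X550,
# taken from the row quoted above.
feb_share = 9.38
mar_share = 37.93

# The "+28.55%" column is the difference in percentage points.
change_points = mar_share - feb_share              # 28.55 points
relative_change = change_points / feb_share * 100  # ~304% relative jump

print(f"+{change_points:.2f} points ({relative_change:.0f}% relative)")
```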
 
I find it depends on the game: if it’s a fast-paced game then I’d typically prefer RT off. But then again, Control is fast paced and it’s better with RT on. The real magic in that game is DLSS 2.0, though; Death Stranding has it too, and it looks better than native. I’m not making that up - several tech outlets have reported the same thing. Alex Battaglia from Digital Foundry described it as objectively better with DLSS on.

You are, however, incorrect: RT is for now. I’ve played thousands of hours with RT on, mostly in Minecraft, and it’s awesome and it works, and I don’t understand why people online (usually people who don’t own an RTX card) go around saying it hurts performance too much and that it’s for the future. I’m getting about 80 fps in Minecraft; that is more than fine!

I do find it amusing reading all these comments from people saying things like “RTX hurts performance too much to be playable” or “you need a 3090 for RT to be worth it”. I’m sorry, but these people are wrong. Good RT experiences are available right now. People who claim they can’t tell the difference need an eye test. The difference between RT on and off is bigger than the difference between low and ultra settings in most games that have it.

RT is absolutely a “now” tech. Even the consoles have it.
 
The RT performance on modern parts is actually quite acceptable. You clearly don’t own, or have never used, an RTX part. There are plenty of games where RT is a bit pointless and rubbish, but there are also plenty where it’s awesome and definitely worth turning on.

There are lots of RTX naysayers. 99% of the time these people have never actually seen RT in real life. It works just fine; those graphs you see with frame rates going above 60 in games like Cyberpunk are real. And yes, you do need DLSS, and that is a slight hit, but it’s tiny compared to the visual boost RT gives in that game.
 
Are mobile GPUs accounted for in those numbers? Because the market has been flooded with "gaming notebooks" carrying the 3070 and 3060. Those laptops are widely available.
 
Did you actually try RT + DLSS? Because there is no blurry effect at all.
You can see blurry images only if you use DLSS on an FHD display: in that case the rendering resolution is so low (usually 720p or below) that even AI upscaling can't do miracles.
But at WQHD or 4K, DLSS isn't blurry at all.
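To make the resolution math concrete, here is a minimal sketch using the per-axis scale factors commonly reported for DLSS 2.0's modes (Quality ≈ 2/3, Balanced ≈ 0.58, Performance = 1/2, Ultra Performance ≈ 1/3); treat them as approximations rather than official figures:

```python
# Approximate internal render resolution for DLSS 2.0 output modes.
# Per-axis scale factors as commonly reported; not an official API.
DLSS_MODES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

DISPLAYS = {"FHD": (1920, 1080), "WQHD": (2560, 1440), "4K": (3840, 2160)}

for disp, (w, h) in DISPLAYS.items():
    modes = ", ".join(
        f"{mode}: {round(w * s)}x{round(h * s)}" for mode, s in DLSS_MODES.items()
    )
    print(f"{disp} -> {modes}")
```

At FHD even Quality mode drops the internal render to 1280x720, which lines up with the "720p or below" figure above, while 4K Quality still renders at 2560x1440.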
 
I'm surprised we're still on 16GB of RAM. I bought mine in 2014 for $90. Seven years later it's in the same price range, and no gaming PC needs more than 16GB. That has been the standard for what, 10 years now? The moves from 2GB to 4GB and from 4GB to 8GB were so much quicker.
Did you check actual RAM utilization in modern games? Even the heaviest games don't go higher than 8-9 GB of allocated RAM. I've seen 12-14 GB only in flight simulators with huge scenery.
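If you want to check this on your own machine, here is a minimal sketch assuming the third-party psutil package; note it reports resident set size (RSS), which is close to, but not exactly, the "allocated RAM" figure a game overlay might show:

```python
# List the ten processes using the most resident memory (RSS),
# a quick way to sanity-check how much RAM a game actually uses.
# Requires the third-party psutil package: pip install psutil
import psutil

procs = []
for p in psutil.process_iter(["name", "memory_info"]):
    mem = p.info["memory_info"]
    if mem is not None:  # None when access to the process is denied
        procs.append((mem.rss, p.info["name"]))

for rss, name in sorted(procs, key=lambda t: t[0], reverse=True)[:10]:
    print(f"{rss / 2**30:5.1f} GiB  {name}")
```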
 
Control and Cyberpunk 2077 are quite different games in how they use RT. The experience is definitely better.
But to be honest, even in Shadow of the Tomb Raider I enjoyed it very much, especially the shadows.
 
These people have never tried it. There’s no way. When I got an RTX card I expected not to use RT and DLSS, given all the bullsh1t I’d read online from the AMD fandom. But the reality is that it is viable, you can use it, and in many cases you really appreciate the uplift in fps. In some cases DLSS looks crisper than native.
 
I played through SOTTR with RT on, but I can’t say it gave me that much of a benefit. At least with my RTX 2080 I didn’t need DLSS to hold 60, which was nice, as that game’s DLSS is among the worst I’ve seen. Really, though, it was a bit gimmicky - certainly not as comprehensive as other RT experiences.

Metro Exodus was the first game that made me think: OK, yeah, this is awesome. That game is a perfect fit for RT, with its radiation, light sources and stealth-based gameplay.
 
My first experience with DLSS wasn't great: a 2070 Super on a 1080p monitor (it was BF V). Performance was great, but the image was a little bit blurry.
Playing on a 1440p display is a totally different experience.
 
I’ve not used anything below 1440p for some years at this point, so I wouldn’t know. However, I did see a YouTuber comparing what I think was 240p native to 720p DLSS Ultra Performance, which is also 240p internally, and the difference was absolutely staggering.

 
This is skewed due to people not being able to get their precious hardware. I expect GPU pricing would be different if stock weren't an issue, the market would look different, and CPU-wise AMD would eat into Intel's market share a lot more.
 
How exactly?

Because besides the RT hype (ooh, look at that puddle in a paused game!) and the small number of games using DLSS, the current AMD offerings are indeed very competitive.

The only thing left, which AMD clearly can't do anything about, is the customers who blindly follow the hype created by the marketing team instead of following the results of proper reviews, like the ones by HU.

Kind of ironic that I have to say that, given that you are posting on their site....
There is nothing "competitive" from either side. Products from both companies are hard to find and sell at outrageous prices. The impression I get from checking the product listings at local system builders (the only ones that have GPUs in stock, at about 200% of MSRP rather than 300%) is that Nvidia is producing/supplying far more cards than AMD.
It makes sense for AMD to focus its 7nm capacity on other products (CPUs), but not for its fans to call their vaporware GPUs competitive.
Production from AMD seems to be so low that it makes me wonder why they actually bothered developing these GPUs.
 
Looking at my local supplier, both Nvidia and AMD only have their top models (6900 XT/3090) available; everything else is out of stock. It pretty much depends on where you look.

AMD delivered over a million 6000-series GPUs last year (remember: the launch was 18 November). Your notion of "vaporware" is simply stupid. By that logic, anything unavailable because of high demand could be considered "vaporware", regardless of the number of units shipped. AMD has no problem shipping GPUs, and neither does Nvidia. Enough demand will cause a shortage, no matter how much is made.
 
Seriously?? So the Chinese are dirt-poor peasants who can't afford anything but dirt-cheap cards...

Unlike us filthy-rich Americans, who are buying any expensive card we want... while watching other Americans stand in huge lines at food banks because many can't afford food these days!!

Very keen observation!!
PS: I hope the moderators come to your aid and remove this post, as usual.
 
I have seen Nvidia's Ampere GPUs listed far more often and, until a couple of months back, at "reasonable" prices (competitive with Turing on perf/dollar, at least).
I can't say the same for AMD's 6000-series GPUs.
 
Ampere was released earlier, before the crypto boom/COVID really crashed the GPU market, so that's very predictable. The 6000 series was released just as the Bitcoin price (which affects other cryptocoins too) started to rise. Basically, the 6000 series never had any time on a "normal" market, whereas the RTX 3000 series did.
 
The AMD 6800 and 6800 XT really are vaporware.
It is not high demand: they are nowhere to be seen, even in the Steam hardware survey. But as usual, you apologize, apologize, apologize for everything AMD does.
 
Believe me: you are wasting your time.
He will deny everything that doesn't support his pro-AMD narrative.

You are right: Nvidia GPUs are hard to find, but you can find them. AMD GPUs are basically vaporware.
 
Give me a break.
The RTX 3070 was released a little more than two weeks before the Radeon 6800.
But a lot of people could buy an overpriced 3070, while the 6800 never reached shops in many countries.
 
The Steam hardware survey is very inaccurate when it comes to GPUs. Some examples from the GPU section:

AMD Radeon RX Vega 11 Graphics
AMD Radeon Vega 8 Graphics


"OK". Then we also have:

Intel(R) UHD Graphics

And how about this one:

Intel Haswell

Haswell is a CPU, not a GPU🤦‍♂️

The whole Steam Hardware Survey is just inaccurate BS. Proven.

Yeah, right. You couldn't get even mid-range cards anywhere, and that should create demand for the 6800 and 6800 XT. Also, because of that demand, their prices are very high.

That proves demand for the 6800 and 6800 XT is ultra high.

How about bringing some facts? Like I said, in my country only AMD's and Nvidia's top models are available, in limited supply. Everything else from the 3000 and 6000 series is totally out of stock.

Zen 3 CPUs are out of stock everywhere, despite AMD shipping a million units in Q4 2020 alone. "Vaporware" 🤦‍♂️
 
Learn the definition of vaporware.

By your logic, Zen 3 CPUs, the PS5 and whateverthatmicrosoftconsoleis are also vaporware, despite millions of units shipped 🤦‍♂️ 🤦‍♂️ 🤦‍♂️
Two weeks is a lot. And the 3080 was released nearly two months before. Also, a better comparison point is the Radeon 6700 XT, because the 3070 uses a smaller die than the 3080.
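For reference, a minimal sketch of the launch-gap math, using the widely reported retail launch dates (added here for illustration, not taken from the posts above):

```python
# Days between GPU retail launches (widely reported dates).
from datetime import date

rtx_3080 = date(2020, 9, 17)
rtx_3070 = date(2020, 10, 29)
rx_6800 = date(2020, 11, 18)   # also the 6800 XT launch

print((rx_6800 - rtx_3070).days)  # 20 days: "a little more than two weeks"
print((rx_6800 - rtx_3080).days)  # 62 days: "nearly two months"
```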
 