The GPU Market Finally Gives In, Nvidia Prices Drop

Perhaps you misunderstood what I wrote?

Because your comment is exactly what I meant in the part that you quoted.
Perhaps, that's not the way it read for me. Apologies.

Outside of a couple of outlets that are adamantly anti-AMD, most of the reviewers I follow are recommending AMD alternatives to every 40 series card and I still see too many people praising Nvidia on name alone.
 
I just went and took a look at eBay. There are still tons of GPUs on the site well over MSRP. I hope ALL of the shysters go bankrupt. Chuds.
 
Ha! I love how AMD has to take the heat for Nvidia's price gouging. Over the last decade, including the pandemic inflation and mining craze, AMD's flagship MSRP increased by 35% while actual price increased by 28%. On the other side, Nvidia has increased their flagship MSRP by 68% and actual retail has gone up 70%. Nvidia jumped MSRP by $500 going from the 3090 to the 3090 Ti for a 5% performance boost because of scalpers. This ain't a both-sides issue. For every 40 series card, you can find a comparable RX 6000 or 7000 card for at least $150 less. These two companies are not the same.
They are not the "same", but they have behaved similarly with pricing, and the particularities arise from their market positions. While Nvidia is the uncontested leader, at least in sales, even if technical superiority is argued by some ardent fanboys, AMD, the beloved underdog, followed the pricing trend. Neither of these companies had a special program to sell cards at or near MSRP to gamers when they needed them; neither cared, and both were ecstatic to create scarcity, thus inflating prices, which is what I said earlier. If AMD's increase was not as big, it's because of their market position as the forever second place. The latest proof of that is that when Nvidia released the RTX 4080 at $1,200, a card that should be priced at $700-800, AMD released the 7900 XTX at $1,000 instead of $700. And let's not talk about that ripoff 7900 XT at $900.
 
Yea, but no. Prices aren't "dropping". Old inventory is being moved out. Prices are still high by most people's standards. When I see a 4080 for $1K or a 4070Ti for $600, then I'll know prices have dropped.
IMO, given that xx80 cards used to come in at far less than $1K, even $1K for an xx80 series card is far too much, and that is exactly the type of price increase that people have been swallowing from Nvidia.
 
To be fair, CUDA has more or less morphed into OpenCL, which AMD also supports.
If I remember correctly, OpenCL runs like crap on nvidia hardware, thanks to nvidia's actions, but also thanks to OpenCL itself.

That said, the correct answer is ROCm, from AMD, which literally helps you take CUDA code and modify it to run on AMD and other vendors' GPUs.

I must admit, I don't know much about this, so that could be either wrong or just a very, very basic explanation.
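From what I understand (and this could be off), the flow being described is ROCm's HIP layer: the hipify tools rewrite cuda* calls into hip* equivalents so the same CUDA-style source builds for AMD GPUs. A minimal sketch of what the ported code roughly looks like, purely illustrative:

```cpp
// Hedged sketch of a HIP port of a trivial CUDA kernel; builds with hipcc on ROCm.
// The kernel body and launch syntax are unchanged from CUDA, only the runtime calls differ.
#include <hip/hip_runtime.h>
#include <vector>
#include <cstdio>

__global__ void scale(float* data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // same indexing as CUDA
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1 << 20;
    std::vector<float> host(n, 1.0f);
    float* dev = nullptr;

    hipMalloc((void**)&dev, n * sizeof(float));                              // was cudaMalloc
    hipMemcpy(dev, host.data(), n * sizeof(float), hipMemcpyHostToDevice);   // was cudaMemcpy
    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);                           // same launch syntax
    hipMemcpy(host.data(), dev, n * sizeof(float), hipMemcpyDeviceToHost);
    hipFree(dev);

    printf("host[0] = %f\n", host[0]);  // expect 2.0
    return 0;
}
```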
 
Perhaps, that's not the way it read for me. Apologies.

Outside of a couple of outlets that are adamantly anti-AMD, most of the reviewers I follow are recommending AMD alternatives to every 40 series card and I still see too many people praising Nvidia on name alone.
Then you need to share those real reviewers, because the ones that I follow are so pro-nvidia that the only thing they are missing is to openly admit they are getting paid directly by Dear Leader Jensen himself. :)
 
If I remember correctly, OpenCL runs like crap on nvidia hardware, thanks to nvidia's actions, but also thanks to OpenCL itself.
OpenCL doesn’t necessarily run like crap on Nvidia hardware, or at the very least, no worse/better than on AMD’s. A key aspect of the API is platform portability, so if one doesn’t adapt the code to take advantage of any specific hardware feature, to maintain that portability, the performance will be somewhat underwhelming. Once you start adding in relevant optimizations, then you pull back the performance, but at the cost of portability.
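To make that concrete, here's a rough illustration of the trade-off (kernel names made up). The first OpenCL C kernel makes no assumptions about the device; the second vectorizes with float4 and pins the work-group size, which can help a lot on hardware that likes wide loads, but it assumes the buffer length is a multiple of 4 and may do nothing, or even hurt, elsewhere.

```cpp
// Hedged sketch: two OpenCL C kernels embedded as C++ strings, showing how
// portable code leaves performance on the table and how tuning costs portability.
const char* kPortableScale = R"CLC(
__kernel void scale(__global const float* in, __global float* out, float f, int n) {
    int gid = get_global_id(0);
    if (gid < n) out[gid] = in[gid] * f;   // no device-specific assumptions
}
)CLC";

// Tuned variant: assumes n is a multiple of 4 and that the device benefits
// from float4 loads and a fixed 256-wide work-group -- true on some GPUs, not others.
const char* kTunedScale = R"CLC(
__kernel __attribute__((reqd_work_group_size(256, 1, 1)))
void scale4(__global const float4* in, __global float4* out, float f, int n4) {
    int gid = get_global_id(0);
    if (gid < n4) out[gid] = in[gid] * f;
}
)CLC";
```

Both would be compiled at runtime with clBuildProgram; the point is only where the extra performance comes from, not that either is the one right way to write it.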
 
OpenCL doesn’t necessarily run like crap on Nvidia hardware, or at the very least, no worse/better than on AMD’s. A key aspect of the API is platform portability, so if one doesn’t adapt the code to take advantage of any specific hardware feature, to maintain that portability, the performance will be somewhat underwhelming. Once you start adding in relevant optimizations, then you pull back the performance, but at the cost of portability.
I am not an expert on this by any means, but as I mentioned, I recall reading about it where people who know way more than me about this said that about the performance issues.

That said, given how nvidia has always behaved and pushed for proprietary software, I simply didn't doubt it nor question it.
 
I am interested in a console (first time in my pcmasterrace life) for my son. Can you share some info about these refreshes?
Just the rumour mill - from a big game developer. They always get a mid-cycle refresh - slimmer, more efficient - but I'm hoping for the Pro versions.
Sony has no reason not to release one, as the PS5 is now easy to buy.
Even a Pro PS5 - with 2 TB, a next-gen AMD GPU, or the same series done on a smaller node - would be good.
As for Xmas - that's just my guess; the GPU came out last year, so there has been plenty of time to adapt it, and that's the best time to hit the market.
 
IMO, given that xx80 cards used to come in at far less than $1K, even $1K for an xx80 series card is far too much, and that is exactly the type of price increase that people have been swallowing from Nvidia.
All I'm saying is that at $1000, I would rather buy a 4080 than a 7900XTX. Now, I'd rather buy either card closer to $600/700 but I don't think that's going to happen.

PS - I did see 4080s for sale today for exactly $1000. Free shipping too. It's somewhat tempting, but still more than I want to spend.
 
That said, given how nvidia has always behaved and pushed for proprietary software, I simply didn't doubt it nor question it.
In this case, it was AMD's version of "CUDA" that was proprietary, in the sense that if you wanted to develop for it, you had to pay for the toolkit. Nvidia followed M$'s example of giving away M$ Office and made the toolkit necessary to develop for CUDA available for free. It was only with OpenCL that AMD's "compute" toolkit became available to the average developer.

And now that Intel has made their "oneAPI" toolkits available for free, developing for GPUs is far more accessible than it was when "compute" on GPUs first became a thing.
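As a rough illustration of how low the barrier has gotten (a generic SYCL example, not tied to any one vendor's toolchain): with the free oneAPI compilers, a minimal GPU program can look like the following, and it runs on whatever backend happens to be installed.

```cpp
// Hedged sketch: a minimal SYCL vector add, the sort of thing the free oneAPI
// toolkits let you build and run on whatever GPU (or CPU) backend is present.
#include <sycl/sycl.hpp>
#include <vector>
#include <iostream>

int main() {
    const size_t n = 1024;
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);

    sycl::queue q;  // picks a default device: GPU if available, else CPU
    {
        sycl::buffer bufA(a), bufB(b), bufC(c);
        q.submit([&](sycl::handler& h) {
            sycl::accessor A(bufA, h, sycl::read_only);
            sycl::accessor B(bufB, h, sycl::read_only);
            sycl::accessor C(bufC, h, sycl::write_only);
            h.parallel_for(sycl::range<1>(n), [=](sycl::id<1> i) {
                C[i] = A[i] + B[i];
            });
        });
    }  // buffers go out of scope here and sync results back to the host vectors

    std::cout << "c[0] = " << c[0] << "\n";  // expect 3
    return 0;
}
```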
 
I game with a large cross-section of gamers/players, and in chat/comms/discord you can always hear someone talking hardware woes & stuff. It is universally understood/known, or taken as given, that RT is just a gimmick to liven up single player games... & has nothing at all to do with faster gaming, or getting faster frames and higher minimums.

To them... it's a superficial gimmick that has nothing to do with the reason you buy a GPU: faster frames.



Frugal people see through the marketing. Nvidia has to lower their prices and compete on performance, not gimmicks.

 
I'm not so sure this trend towards higher-resolution graphics and, therefore, the need for more VRAM will grow exponentially. There was a good article on CNET about developers concerned about global warming and the impact of gaming on it. Some developers are looking at ways to make their games more climate friendly, and that includes not having high-definition graphics. It all adds up: more VRAM, more cores, faster bandwidth and so on. It all takes power.
They sound more like leftist trolls than developers. If developers wanted to make gaming more green, they should focus more on optimization to run more with less. Currently the opposite is happening. Graphics quality is progressing in a plateau-like fashion: soon we will have ultra-realistic graphics with little to no improvement in visual quality. Eventually, when the industry reaches that point in graphics fidelity, it will turn to running those graphics at an efficiency plateau. This was both Nvidia's and Epic's outlook before crypto mining happened, which I believe delayed this by at least a decade.
 
This table here shows me all I need to know.

[attached image: capture.png]
 
When people want to drown, they can do it in a small cup... That will be, or maybe already was, fixed with a driver update. That said, 50 watts could translate to around US$2-3 a month.
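Quick back-of-the-envelope check on that, assuming the extra ~50 W is only drawn while the PC is actually on (say 10 hours a day) and electricity costs around US$0.15/kWh: 50 W × 10 h × 30 days = 15 kWh, which is roughly $2.25 a month, so the $2-3 ballpark holds under those assumptions. Run it 24/7 and it's closer to $5.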

Which features?
Do you mean DLSS, CUDA and other tools that are made by nvidia precisely to kill competition and keep you prisoner to their hardware?

I don't know about you, but anti-consumer practices like those are the reason why I buy AMD.

Man, the RT nonsense is really tiresome.

First, how many games are currently listed in Steam in total?

How many of those have RT in a useful, well-implemented way that will make every single gamer want to excuse the performance hit that comes with enabling RT?

Does that performance hit really improve gameplay that much? No, it doesn't.

We are easily at least 4 more gens away from having capable hardware and games that are worth the RT performance hit.

In my book, that time will be when I can buy a GPU that will run all the RT nonsense at 4K@120 fps and costs around US$400 to 500.

But hey, no need to continue hammering this nail in, since it's clear that many, many people will find whichever excuse they can to continue giving nvidia money.
I bought Nvidia again, after I sold my 3080 for 400 dollars after almost 3 years, because of features like DLDSR, DLSS, DLAA, Reflex, NvEnc/Shadowplay, RTX Remix, RTX VSR and I could go on.

...and RT does matter for me, as I will be playing Diablo 4 and RT is going to look amazing on my OLED monitor here. Dark environment + Spells = Awesome for RT. Looks unreal. Already seen it in other games.

I paid 530 dollars for my RTX 4080. How? Sold my 3080 for 400 dollars and got Diablo 4 for free, which I would have bought for 70 dollars anyway :laughing:
Paid $699 for my 3080 at release, and got it the day after launch.

Tons of games can be played with RT and high fps when DLSS Quality is used. Most AMD users are in denial because RT ON equals unplayable in most cases. I played tons of games with DLSS2 and RT so far. RT is not worth it in some games, at least not on high settings, but RT can transform some games and especially older games. RTX Remix is awesome. DLDSR is awesome. Playing old games with RT mods, awesome.

Completely revamps games. Especially older ones.

You don't know, because you can't use it. That's human nature; Denying. I know how it works.

AMD's multi-monitor watt usage is not fixed at all. All the new custom 7900 XTXs suffer from it. Just read the latest 7900 XTX review on TPU; it was tested recently using the newest drivers. AMD >said< they would look into a fix but never fixed it. AMD loves to talk. Talk is cheap.


I buy what I want, you buy what you want. A GPU for me is peanuts, I change every 2-3 years. Could not care less about longevity. Nvidia sits at 85% market share for a reason. Consumers vote with their wallets. As long as AMD doesn't offer way superior value and only slightly matches Nvidia's features, I won't be considering an AMD GPU.

I have an AMD platform tho. Their CPUs are fine - GPUs, not yet for me. Most people I know with an AMD CPU use an Nvidia GPU :dizzy: :laughing:

My last AMD GPU was a 5700 XT, a temp solution for a 2nd PC, and it had BSOD and black screen issues for months and months after release, plus VRM temps were insane at times. AMD fixed it in drivers by gimping the performance slightly. Tons of users had issues with the 5700 XT for 6-9 months post release, simply google it. This was my most recent AMD GPU experience and it was awful.


My last great AMD GPU was the HD 7970, but only after Catalyst 12.11, which fixed a lot of issues and upped performance (took them ~9 months as well).

So yeah, AMD is better for longevity at times - depends on the games played etc. However, they most often have a very rough launch with new hardware and lack proper support when you look at less popular and early access games.

AMD's software team is just smaller than Nvidia's. Way smaller. This is why they can't match Nvidia's features. That and lack of money. This is why game performance varies A LOT from title to title on AMD cards. Nvidia offers good performance across all games. AMD mostly offers good performance in >popular titles and benchmarks< because they know this is what reviewers will be using.

I play a lot of early access games, and they tend to run waaaay better on Nvidia as well. Nvidia has launch drivers ready for ALL GAMES on release day, even early access games and betas - often several days before actual release - AMD does not.

Also the resell value of Nvidia cards is much higher. It's like selling an iPhone compared to an Android phone on the used market. Tons of buyers for the iPhone, little to none for the Android phone, which lost 80-90% of its value. This is something you forget to add in :cool:
 
Their RT performance is just fine. Unlike Nvidia, AMD understands that not everyone has $500 to spend on a 4K high refresh rate monitor just to use their $2000 GPU after spending $1500 on the rest of their system. 90% of PC gamers are fine with 1080p. And building a GPU so powerful that any CPU in existence will bottleneck it isn't really a feature I'd want. I'd take the higher dual-monitor wattage over an Nvidia card that runs 200 W over TGP and spikes another 200 W higher than that, for more than twice the price but only 30% better performance.
Funny you mention watt spikes, because this was a huge problem on AMD's 6000 series


6900XT spikes to 636 watts

6800XT spikes to 579 watts

3090 Ti was not really good either, yet beat them soundly and "only" spikes to 570 watts

Watt spikes are what make PSUs go *poof* or reboot the system.


4000 series perform very well in this regard. So did most of 3000 series. 3090 Ti was just fully peaked and came out like 1½ years after 3090, making it pointless anyway. Just like 6950XT was.

Good to see AMD has fewer spikes on the 7000 series as well; now they just need to lower prices, improve features and release the 7800, 7700 and 7600 series... It's been 6 months since the 7900 series came out and AMD has been silent since. They'd rather sell last-gen cards with big watt spikes than newer, more efficient cards.
 
This table here shows me all I need to know.

[attached image: capture.png]
Citizens Bank has a credit card deal now that gets you 10% cash back up to $1k, then it goes down to whatever the default rate is. That 4090 can be yours at $1,439, fyi.
Funny you mention watt spikes, because this was a huge problem on AMD's 6000 series


6900XT spikes to 636 watts

6800XT spikes to 579 watts

3090 Ti was not really good either, yet beat them soundly and "only" spikes to 570 watts

Watt spikes are what make PSUs go *poof* or reboot the system.

4000 series perform very well in this regard. So did most of 3000 series. 3090 Ti was just fully peaked and came out like 1½ years after 3090, making it pointless anyway. Just like 6950XT was.
I can attest to the 4090 not spiking out of control in terms of power, because I have been running my 4090 Suprim Liquid with a 750 W Platinum SFX PSU since launch.
 
... I bought Nvidia again, after I sold my 3080 for 400 dollars....
... Talk is cheap.
... I buy what I want, you buy what you want. A GPU for me is peanuts, I change every 2-3 year. Could not care less about longevity.

In all of that explaining to us, you never once told us what resolution you play at, or your top 5 games. Only that you are some type of wheeler & dealer who gets stuff cheaper than most of us, because you like to sell used parts and stuff. You told us all about nVidia gimmickry and how you love their specific tech... but did not explain (at all) how that tech makes YOUR game better for you or your specific situation. Instead, you are telling us how it could be better for us... if we are willing to have slower frames and a blurry mess in fast-paced games. The logic is backwards for why most people upgrade their GPU... to get more performance.

Lastly, what didn't make sense, and the basis of your argument: if your 3080 and its upscaling tech were so great, why would you EVER have to upgrade..? What exactly are you gaining by moving to an RTX 40 card..?

It's great that you are an nVidia marketing fan, but competitive gamers fuel GPU sales, and most if not all people upgrade their system or GPU based on needing more competitive frames & to be less reliant on G-Sync, to be able to enjoy smooth targeting and level competition. Same reason they buy expensive high-refresh monitors with a GPU that can push them. (Pro players do not use G-Sync.)

I have NEVER met anyone, or heard of anyone in the 20-100+ Gamers I speak with daily... who even mentions the things you do.



Reality speaks....
[attached image]
 
Citizens Bank has a credit card deal now that gets you 10% cash back up to $1k, then it goes down to whatever the default rate is. That 4090 can be yours at $1,439, fyi.
That table shows me only the 4090 and 4060 are the real value this gen.
For me, the 4060 is worse value than the 3070 I'm using now.
The 4090 on my 5600X would be a waste of a good GPU on an entry-level CPU.

BTW, I'm not going to take out credit for a new PC/PC parts, not my thing. For a new car/house/wife, maybe.
 
I bought Nvidia again, after I sold my 3080 for 400 dollars after almost 3 years, because of features like DLDSR, DLSS, DLAA, Reflex, NvEnc/Shadowplay, RTX Remix, RTX VSR and I could go on.

...and RT does matter for me, as I will be playing Diablo 4 and RT is going to look amazing on my OLED monitor here. Dark environment + Spells = Awesome for RT. Looks unreal. Already seen it in other games.

I paid 530 dollars for my RTX 4080. How? Sold my 3080 for 400 dollars and got Diablo 4 for free, which I would have bought for 70 dollars anyway :laughing:
Paid $699 for my 3080 at release, and got it the day after launch.

Tons of games can be played with RT and high fps when DLSS Quality is used. Most AMD users are in denial because RT ON equals unplayable in most cases. I played tons of games with DLSS2 and RT so far. RT is not worth it in some games, at least not on high settings, but RT can transform some games and especially older games. RTX Remix is awesome. DLDSR is awesome. Playing old games with RT mods, awesome.

Completely revamps games. Especially older ones.

You don't know, because you can't use it. That's human nature; Denying. I know how it works.

AMD's multi-monitor watt usage is not fixed at all. All the new custom 7900 XTXs suffer from it. Just read the latest 7900 XTX review on TPU; it was tested recently using the newest drivers. AMD >said< they would look into a fix but never fixed it. AMD loves to talk. Talk is cheap.


I buy what I want, you buy what you want. A GPU for me is peanuts, I change every 2-3 years. Could not care less about longevity. Nvidia sits at 85% market share for a reason. Consumers vote with their wallets. As long as AMD doesn't offer way superior value and only slightly matches Nvidia's features, I won't be considering an AMD GPU.

I have an AMD platform tho. Their CPUs are fine - GPUs, not yet for me. Most people I know with an AMD CPU use an Nvidia GPU :dizzy: :laughing:

My last AMD GPU was a 5700 XT, a temp solution for a 2nd PC, and it had BSOD and black screen issues for months and months after release, plus VRM temps were insane at times. AMD fixed it in drivers by gimping the performance slightly. Tons of users had issues with the 5700 XT for 6-9 months post release, simply google it. This was my most recent AMD GPU experience and it was awful.


My last great AMD GPU was the HD 7970, but only after Catalyst 12.11, which fixed a lot of issues and upped performance (took them ~9 months as well).

So yeah, AMD is better for longevity at times - depends on the games played etc. However, they most often have a very rough launch with new hardware and lack proper support when you look at less popular and early access games.

AMD's software team is just smaller than Nvidia's. Way smaller. This is why they can't match Nvidia's features. That and lack of money. This is why game performance varies A LOT from title to title on AMD cards. Nvidia offers good performance across all games. AMD mostly offers good performance in >popular titles and benchmarks< because they know this is what reviewers will be using.

I play a lot of early access games, and they tend to run waaaay better on Nvidia as well. Nvidia has launch drivers ready for ALL GAMES on release day, even early access games and betas - often several days before actual release - AMD does not.

Also the resell value of Nvidia cards is much higher. It's like selling an iPhone compared to an Android phone on the used market. Tons of buyers for the iPhone, little to none for the Android phone, which lost 80-90% of its value. This is something you forget to add in :cool:
Man, it would take me months to debunk every one of the many incorrect points you provided, so I will reply with just one: everything you claimed was already addressed by me in the first post.

Anyways, this is gold, and if I didn't know better, I would say it came directly from the Nvidia marketing team.
 
In this case, it was AMD's version of "CUDA" that was proprietary, in the sense that if you wanted to develop for it, you had to pay for the toolkit. Nvidia followed M$'s example of giving away M$ Office and made the toolkit necessary to develop for CUDA available for free. It was only with OpenCL that AMD's "compute" toolkit became available to the average developer.

And now that Intel has made their "oneAPI" toolkits available for free, developing for GPUs is far more accessible than it was when "compute" on GPUs first became a thing.
I am confused by your many statements there.

I don't know which "AMD's version of CUDA" was proprietary. Also, as far as I know, OpenCL has always been open, even though it was created by Apple.
About CUDA being free, nothing from Nvidia is truly free, since it will always have their hardware tax attached.
In the end, as I stated, I'm not an expert on the matter, so I'm more than willing to learn more.
 
I am confused by your many statements there.

I don't know which "AMD's version of CUDA" was proprietary. Also, as far as I know, OpenCL has always been open, even though it was created by Apple.
About CUDA being free, nothing from Nvidia is truly free, since it will always have their hardware tax attached.
In the end, as I stated, I'm not an expert on the matter, so I'm more than willing to learn more.
Games don't run on CUDA.... they run on RDNA.
  • Ask the Xbox Series X & S....
  • Ask the Playstation 5...
  • Ask the Steamdeck & new ROG Ally...

Game devs only care about the hardware tech THEY are dealing with... (what's RTX?)



But AMD's GPUOpen is what game developers care about....
[attached image]
 
They sound more like leftist trolls than developers. If developers wanted to make gaming more green, they should focus more on optimization to run more with less. Currently the opposite is happening. Graphics quality is progressing in a plateau-like fashion: soon we will have ultra-realistic graphics with little to no improvement in visual quality. Eventually, when the industry reaches that point in graphics fidelity, it will turn to running those graphics at an efficiency plateau. This was both Nvidia's and Epic's outlook before crypto mining happened, which I believe delayed this by at least a decade.
I think it was a small group, but, nonetheless, it's a start. Whether or not it will gain support remains to be seen. However, to your point, the entire industry has to take some ownership of the fact that people are using computers everywhere and there is a social cost to that. I know companies like Microsoft, Google, Amazon, etc. are working to be carbon neutral/carbon negative sooner rather than later.
 