Nvidia RTX 4080 prices at Micro Center show custom cards reaching $1,599

There's a good reason why all the units dedicated to ray tracing algorithms (ray-triangle intersection calculations, BVH traversal acceleration) are located within the primary 'shader' sections of the GPU -- i.e. AMD's Compute Units, Intel's Render Slices, Nvidia's Streaming Multiprocessors. They need direct access to the fastest cache within the GPU, and that cache sits alongside the shader units. Putting all that on a separate die would significantly lower RT performance; putting it on a separate card would kill it altogether.
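For anyone curious what those RT units are actually crunching, here's a minimal software sketch (plain Python, purely illustrative, not any vendor's actual hardware logic) of the textbook Möller-Trumbore ray/triangle intersection test, the kind of calculation the fixed-function RT hardware evaluates millions of times per frame against geometry pulled from those caches:

```python
# Textbook Moller-Trumbore ray/triangle intersection: a software sketch of the
# math that dedicated RT units evaluate in fixed-function hardware.
# Vectors are plain (x, y, z) tuples to keep the example self-contained.
def sub(a, b):   return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def cross(a, b): return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
def dot(a, b):   return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def ray_triangle_hit(origin, direction, v0, v1, v2, eps=1e-7):
    """Return the hit distance t along the ray, or None if the triangle is missed."""
    edge1, edge2 = sub(v1, v0), sub(v2, v0)
    h = cross(direction, edge2)
    det = dot(edge1, h)
    if abs(det) < eps:               # ray parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    s = sub(origin, v0)
    u = inv_det * dot(s, h)          # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, edge1)
    v = inv_det * dot(direction, q)  # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = inv_det * dot(edge2, q)      # distance along the ray to the hit point
    return t if t > eps else None

# Example: a ray shot straight down the -z axis at a triangle in the z = 0 plane.
print(ray_triangle_hit((0.25, 0.25, 1.0), (0.0, 0.0, -1.0),
                       (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)))  # -> 1.0
```

Every one of those tests needs triangle vertices and BVH nodes fetched from memory, which is why the hardware that runs them sits right next to the SM/CU caches rather than on a separate die or card.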
Your knowledge is so massive that it's just infuriating 😝😔
 
I think RT is the future of gaming and will help achieve perfect photorealism at some point... hold it, AMD fanboiz, this is not a technology exclusive to Nvidia, they're just better at it atm. Everybody is working on RT development, and AMD could also be pulling off decent RT with the 7000 series and beyond. This is not a gimmick. Light is everything, so understanding and reproducing the behavior of light will be essential, and RT is paramount in this area.
 
I think RT is the future of gaming and will help achieve perfect photorealism at some point... hold it, AMD fanboiz, this is not a technology exclusive to Nvidia, they're just better at it atm. Everybody is working on RT development, and AMD could also be pulling off decent RT with the 7000 series and beyond. This is not a gimmick. Light is everything, so understanding and reproducing the behavior of light will be essential, and RT is paramount in this area.
It is the future, but as it is right now I would choose 4K or 120 fps over RT.
It's not about being a fanboy but more about being practical.
Also, the price of the new Nvidia cards is simply not worth it just for RT.
 
That's like saying a dozen or so more FPS won't make a difference once you go over 100-120 fps, so why pay even $1000 for a 7900?
I don't think that's such an unreasonable question either...

The 4080 faces another challenge too: whether there's a meaningful need for its incremental power over the previous generation.

When the 3080 came out, it benefited from many gamers who had skipped the prior generation; an evolving TV/monitor market where resolutions beyond 1080p and refresh rates beyond 60 Hz were going mainstream; and a new console generation, meaning game developers would be increasing the core graphics capabilities used/usable by their titles.

The 4080 has less of this. There's no jump from, say, 4K to 6K displays. People who were often under 60 fps before are probably often over it now. Mainstream AAA art budgets aren't jumping again until the next console generation.

The need or opportunity for a visual bump from this generation just feels a lot less vital than it did last generation, even if the raw horsepower is there. Combined with prices that may feel unreasonable, plus the remaining stock of the prior generation, I think it's adding up to this being a skip generation for many gamers.
 
It is the future, but as it is right now I would choose 4K or 120 fps over RT.
It's not about being a fanboy but more about being practical.
Also, the price of the new Nvidia cards is simply not worth it just for RT.
I concur. I was just pointing out how important RT is, but mostly, how important it is going to be. Nvidia's pricing is shameless, no doubt, and they are sold out only because of low supply, which they keep low to try and sell off the huge stock of old-gen cards they piled up at pumped prices. I said it before: AMD can move in for the kill if they don't get greedy. And AMD RT is most likely better this gen.
 
Who remembers the PhysX accelerator? It was the same BS in its day as RT is now, and it was bought by Nvidia after a while.

If they made an RT card separate from the GPU, I wonder how many would buy it?
I was one of them at that time, trying to have a dedicated card for PhysX. In the end I assembled an SLI setup, and later Nvidia ditched SLI... my bad luck.
 
Who cares what prices are currently. I'm not buying any new video cards until prices are reasonable. Period. If everyone else did that we wouldn't be in this mess. People come on here to whine and moan about prices and then grudgingly go and buy the overpriced garbage anyway. You need to vote with your wallets and stop buying until prices make sense. As long as they can find enough stupid people to pay $1500-2000 for a video card they're NEVER GOING TO STOP. Why would they? If you could sell your pubes for $100 a hair you wouldn't stop either until your crotch was as smooth as an egg.
 
If I know gamers, and I think I do, they'll show exceptional self-control and band together to boycott GPUs at these prices for the greater good of us all! Hurrah!
 
I was one of them at that time, trying to have a dedicated card for PhysX. In the end I assembled an SLI setup, and later Nvidia ditched SLI... my bad luck.
They've promoted so much "new tech" over the years that ended up as little more than a checkbox in the GPU control software. People need to realize buyers can't keep up with vendors without going broke. They will always find another "new tech" to sell you.
Remember "3D Vision"? :)
 
It will be very interesting to see if the 7900 XTX is faster than this 4080. Of course, we have to exclude ray tracing; in that area AMD is still behind by at least one generation.
Ray tracing is more gimmick than an actual gameplay enhancement. For one, most people don't have time to stop and look at the light reflections. Secondly, it pulls down the graphics performance of a card, though the 4090 is the first one able to maintain more than 60 fps at 4K with RT on. Thirdly, you have to look closely to see any difference. At a glance, non-RT games actually look and play better.

But of course, RT crazed fans stand by it and die by it. It's BS, but hey.. if they want to buy it for RT nonsense, go ahead.
 
They've promoted so much "new tech" over the years that ended up as little more than a checkbox in the GPU control software. People need to realize buyers can't keep up with vendors without going broke. They will always find another "new tech" to sell you.
Remember "3D Vision"? :)
Lol, I remember. I had the Asus 3D Vision 120 Hz 27-inch 1080p TN monitor. It was my first 120 Hz display. Let's just say the 120 Hz was more useful than the 3D Vision part. It was awesome for one or two games (I only recall Trine 3 in 3D), but other than that I barely used it. The 3D fad was being pushed by everyone: TVs, AMD, even phones had 3D BS marketing. From my own experience the best implementation was Nvidia's, but was it worth the extra few hundred dollars in premium? For me, yes and no. I did get to experience the best implementation of 3D at the time, but I concluded that the best gaming experience is just a flat ol' monitor in front of you.

This goes for G-Sync too. Nvidia was the first to implement it, but an eventually open standard made it affordable to the masses and is used even now by G-Sync Compatible displays. While the G-Sync premium was higher, it allowed me to play games with frame-rate variance on older hardware, which made the gameplay smoother versus a traditional non-VRR display that would stutter at lower frame rates. The glass is half empty from one perception and half full from another. While closed proprietary standards like 3D Vision, G-Sync and now RTX (RT and DLSS) do come with a premium, eventually they lead to improvements in open standards like FreeSync and FSR that try to leapfrog each other. Without Nvidia's research and development, IMO, there would be no pressure for the open standards to stay competitive, and innovation would stagnate.
 
Who cares what prices are currently. I'm not buying any new video cards until prices are reasonable. Period. If everyone else did that we wouldn't be in this mess. People come on here to whine and moan about prices and then grudgingly go and buy the overpriced garbage anyway. You need to vote with your wallets and stop buying until prices make sense. As long as they can find enough stupid people to pay $1500-2000 for a video card they're NEVER GOING TO STOP. Why would they? If you could sell your pubes for $100 a hair you wouldn't stop either until your crotch was as smooth as an egg.
I've read some comments on videocardz.com; plenty of people seem to be happy to part with $1,600.
The argument is that, for the top of the top, a GPU that will last you a long, long time, it is not that much.
But to me, the most depressing thing is that I still remember the good old prices. I used to be able to purchase almost the best for $600 or so.
And if we are talking about inflation, why didn't CPUs jump in price as much? Do CPUs live in a different, less inflated universe?
 
It would be interesting to know how many new RTX 3000 cards are left: are they in the millions, or hundreds of thousands?
My interest is mostly practical because if they dropped in price further they would be an attractive upgrade.
It seems there are a lot of them. They are in stock at all the big sellers.
At the same time, I see a lot of people finally upgrading being sick of waiting for prices to go to normal.
 
And if we are talking about inflation, why didn't CPUs jump in price as much? Do CPUs live in a different, less inflated universe?
Graphics cards have a lot more components than CPUs do - both comprise a processor and primary packaging (the stuff that houses the chip and connects it to the pins/solder ball points), but graphics cards then have DRAM modules, VRMs, video output interfaces, secondary packaging (the ‘motherboard’ that houses all of that), and the cooling system (CPUs can be sold without one).

Just as CPU motherboards have become more expensive, in part because they carry a lot more current and have more electrical signals running through them, requiring tighter tolerances, so have the boards for graphics cards.

DRAM prices have risen, and fallen, repeatedly over the years due to supply and demand (the manufacturing nodes used haven't changed as frequently as they have for CPUs/GPUs), and cards are sporting more and/or larger modules. They're cheaper in comparison to GPU dies, but it's still a cost to add in.

Also, GPUs have become increasingly complex and sophisticated internally. Not as much as a high-end CPU, but the costs to design them, and then fabricate them on what's usually a brand-new process node, have risen notably these past few generations.

[Chart: fab_chip_costs.png]
[Source]

High-end graphics cards are never going to return to, say, 2016 prices. Not just because of inflation, but because the associated design and manufacturing costs have also risen (and by factors beyond inflation). But there is definitely scope for them to come back down to something more sensible. It just depends on what one considers to be sensible.
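To put a very rough number on the fabrication side, here's a back-of-envelope sketch in Python. The wafer prices and defect density are made-up illustrative assumptions, not real foundry figures; the point is only how the yielded cost of one large die scales as wafer prices climb on newer nodes:

```python
# Back-of-envelope cost-per-die sketch, using purely illustrative numbers
# (the wafer prices and defect density below are assumptions, not real figures).
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> float:
    """Standard approximation: usable wafer area minus edge losses."""
    r = wafer_diameter_mm / 2
    return (math.pi * r**2) / die_area_mm2 - (math.pi * wafer_diameter_mm) / math.sqrt(2 * die_area_mm2)

def yielded_cost_per_die(wafer_cost_usd: float, die_area_mm2: float,
                         defects_per_cm2: float) -> float:
    """Simple Poisson yield model: bigger dies and dirtier processes waste more wafer."""
    yield_rate = math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)  # mm^2 -> cm^2
    good_dies = dies_per_wafer(die_area_mm2) * yield_rate
    return wafer_cost_usd / good_dies

# Hypothetical comparison: the same ~380 mm^2 die on an older vs. a newer, pricier node.
print(yielded_cost_per_die(wafer_cost_usd=6000,  die_area_mm2=380, defects_per_cm2=0.10))
print(yielded_cost_per_die(wafer_cost_usd=16000, die_area_mm2=380, defects_per_cm2=0.10))
```

With nothing changing except the assumed wafer price going from $6,000 to $16,000, the yielded cost of the same ~380 mm² die nearly triples, and that's before the memory, VRMs, board, cooler, and R&D costs mentioned above are added on top.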
 
Graphics cards have a lot more components than CPUs do - both comprise a processor and primary packaging (the stuff that houses the chip and connects it to the pins/solder ball points), but graphics cards then have DRAM modules, VRMs, video output interfaces, secondary packaging (the ‘motherboard’ that houses all of that), and the cooling system (CPUs can be sold without one).

Just as CPU motherboards have become more expensive, in part because they carry a lot more current and have more electrical signals running through them, requiring tighter tolerances, so have the boards for graphics cards.

DRAM prices have risen, and fallen, repeatedly over the years due to supply and demand (the manufacturing nodes used haven't changed as frequently as they have for CPUs/GPUs), and cards are sporting more and/or larger modules. They're cheaper in comparison to GPU dies, but it's still a cost to add in.

Also, GPUs have become increasingly complex and sophisticated internally. Not as much as a high-end CPU, but the costs to design them, and then fabricate them on what's usually a brand-new process node, have risen notably these past few generations.

[Chart: fab_chip_costs.png]
[Source]

High-end graphics cards are never going to return to, say, 2016 prices. Not just because of inflation, but because the associated design and manufacturing costs have also risen (and by factors beyond inflation). But there is definitely scope for them to come back down to something more sensible. It just depends on what one considers to be sensible.
Prices, according to capitalism, are established by the preferences of the market and by scarcity. If the market doesn't want to pay your price for a product, you will go bankrupt even if your price is completely fair relative to the cost. Most people don't want to pay Huang's prices for his GPUs. So Huang risks having a serious problem selling his new generation, even though his prices are fair relative to the cost of manufacturing GPUs. This is Economics 101.
 
We know it won't happen; most consumers are stupid. They'll continue to complain about the pricing, yet they'll still continue to buy the overpriced stuff.
I think it's pretty much a given that consumers are that stupid. Compared to what customers willingly paid for cards during "the great mining/scalping era", $1,600 is a comparative bargain for a top-end card. Hell, people were trying to get $400+ for a GTX 1050 Ti.

The whimpering about card prices back then was at least double the volume of this relatively laid-back complaint session.
 
I don't know how many people are willing to pay "new" prices. Some will pay, some won't.
I sure won't, and not because I can't afford it. I've got an RTX 2060 and it can play the games I like for now. I'd rather buy used after winter ends, most probably a 6700 or a 6800. That way I'm not giving any money directly to either brand.
 
Most people don't want to pay Huang's prices for his GPUs.
We don't know that for certain. All we have are comments from people who are either happy to buy (and have thus displayed their purchases on the likes of Reddit) or those who aren't and who then go on to discussion platforms to air their views. Until we see some actual GPU shipping figures from the usual analysts, there's no way of being absolutely certain that the market isn't willing to accept Nvidia's pricing.
 
It is the future, but as it is right now I would choose 4K or 120 fps over RT.
It's not about being a fanboy but more about being practical.
Also, the price of the new Nvidia cards is simply not worth it just for RT.
The question is: are the AMD 79xx cards worth it? I'm hearing that third-party versions will be a couple hundred more than the reference design. At $900-1,000 I think they are already too expensive for most gamers. Sure, someone will pay the price, and even I might pay that for a 7900 XTX, but I really don't want to.
 
We don't know that for certain. All we have are comments from people who are either happy to buy (and have thus displayed their purchases on the likes of Reddit) or those who aren't and who then go on to discussion platforms to air their views. Until we see some actual GPU shipping figures from the usual analysts, there's no way of being absolutely certain that the market isn't willing to accept Nvidia's pricing.
Exactly. It appears there were enough people willing to pay the price for a 4090 that supply is now limited or sold out. I expect the 79xx cards to also sell out, even though the prices are high. I guess if you look at it from a business standpoint: would you rather sell a million items at $10 or sell 10 items at $1,000,000? Maybe Nvidia (and AMD) realize that the volume of GPU sales is not going to be what it was, and so they want to maximize revenue and profit.
 