Nvidia apologizes for RTX 3080 launch, promises more cards are coming

On second thought, my 1080 Ti is holding up just fine for what I do. I kind of got caught up in the new-card buzz. Frustrating, but it actually saved me $700 by forcing me to wait.

I’m sure they’ll have this all figured out for the 4000 series. Right....?

No, we are just way too far away. They can barely ray trace a simple game like Minecraft, and even then only at 1/16th the default render distance and 1/35th the frame rate.

Unless a major breakthrough is made in a single generation that gives something like 4 times the current performance, Ampere's 30% RT improvement over Turing just isn't nearly enough.

I personally don't see a point in even enabling ray tracing until they get the performance hit down to under 20%.
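
A rough back-of-the-envelope (my own sketch, assuming RT gains simply compound at ~30% per generation, which is obviously an oversimplification) shows how far away that is:

```python
import math

# Back-of-the-envelope, my numbers: if each generation brings ~30% more
# RT performance, how many generations until a given overall speedup?
def generations_needed(target_speedup, per_gen_gain=1.30):
    return math.log(target_speedup) / math.log(per_gen_gain)

print(f"4x speedup:  ~{generations_needed(4):.1f} generations")   # ~5.3
print(f"35x speedup: ~{generations_needed(35):.1f} generations")  # ~13.5
```

Even the 4x breakthrough I mentioned would take five-plus generations at the current pace, and closing the 35x Minecraft frame-rate gap would take over a dozen.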
 
Not a problem, we can live without a 3080 for a few more weeks, or even months. I'm still on my 1080; I'll upgrade when there is stock. Let those *****s who buy from scalpers overpay.
 
And the race to create bots that can deal with CAPTCHAs is on...

News alert: I have been using a CAPTCHA bot for over a year now as a simple Chrome plugin! And it can even deal with voice CAPTCHAs, too! So yeah, get your seats ready for another two days of entertainment coming live to your computer screens soon! ;-P
 
Let's not forget that NVIDIA has a monopoly on cards with performance greater than or equal to the 3070's.

AMD is out of the game and has lost the arms race.

So yeah, what's happening is unsurprising, because NVIDIA (and associates), as a monopoly, is the only firm in the world that can sell these cards. This is what happens in a monopoly situation.

Imagine if there were a competitor that sold cards of similar performance to the 3080. Would we see the same show? I think not.
 
I am fortunate I managed to get my EVGA 3080 XC3.
Microcenter came through, but even they only had enough for 30 of the 100 people in line.

Best Buy had absolutely nothing, but if you'd visited their website prior to launch you'd have assumed they would be stocked. A whole lot of people camped out and waited, only to go home empty-handed.

I never could have imagined that the Founders Edition cards from Nvidia could be hit by what I would label a "cyberattack" in this manner, because I'd have expected them to have CAPTCHAs, phone verification, and other methods to ensure that real people were buying.

Fortunately this did happen, and other companies will learn from it for future launches.
Apple, for example, stopped allowing crowds in response to fears of terrorism.

I'll try for the 3090 on Thursday, just to see if I can get that as well.
Just curious, why buy the 3080 if you plan on getting the 3090 too? Different PCs?
 
In some ways this is a blessing in disguise, as it will force me to wait and see what AMD has, and then how Nvidia responds (e.g., a 20GB 3080 or even a Ti). Could be a long wait to get one card or another in my machine, though.
 
So nVidia tries to insult our intelligence yet again. They're not the least bit sorry, they're tickled pink. They've sold out everything that they have and, judging by Zotac's 20,000 back-ordered cards, things are happening exactly as nVidia's marketing team had hoped. Heartfelt apologies do not come from those who can't stop giggling.

Jensen is sitting in his chair doing his best Ian McDiarmid impression:
"Everything is proceeding as I have foreseen!"
 
I thought the global economy was in a pandemic-driven tailspin and that all the people I see on TV don't even have enough money to buy a loaf of bread or feed their babies? Where is the money coming from to buy a $1000 video card? Maybe the media is lying.

You know there is a middle ground between everyone having a job and no one having a job?

A lot of people, myself included, are still employed and have spent practically nothing on entertainment in the last 6 months, so there's a lot of pent-up demand to splurge on something that will make the next 6 months somewhat more bearable.
 
No, we are just way too far away. They can barely ray trace a simple game like Minecraft, and even then only at 1/16th the default render distance and 1/35th the frame rate.

Unless a major breakthrough is made in a single generation that gives something like 4 times the current performance, Ampere's 30% RT improvement over Turing just isn't nearly enough.

I personally don't see a point in even enabling ray tracing until they get the performance hit down to under 20%.

Interested to know the logic behind this, since from the reviews I've read, most games that support ray tracing can hit 60 FPS (or very near) at 4K with a 3080. Are you saying this because you want 4K 60+ FPS?
 
That's the nice thing about being on a budget; I NEVER get burned by first-day offerings going bad, because I can't afford them until they are 3-5 years old ...... ahhhhhh, peace of mind ....... LOL
You and me both. The extra benefit is you let everyone else chase out all the bugs first, too; let someone else get the cards with bad voltage regulators or defective data buses. I'll pick one up in a year or two, after the MSRP has been slashed once or twice and infant mortality rates have plummeted.
 
Interested to know the logic behind this, since from the reviews I've read, most games that support ray tracing can hit 60 FPS (or very near) at 4K with a 3080. Are you saying this because you want 4K 60+ FPS?

1) "Can hit" is with an asterisk. Many games are not hitting 60 FPS with RTX.
2) That's still only with an extremely limited set of RTX effects (1 or 2) at 2 samples per pixel.
3) The performance hit is still massive. sub 60 FPS in many games is not impressive when you were previously getting more than 144 FPS. I would simply not play any shooter under 144 FPS.
4) You are stilling talking about the flagship $700 card here (don't give me that 3080 Ti look, Nvidia called it a flagship itself). This is a best case scenario. RTX won't become mainstream until more people actually get the tech.

As I pointed out earlier, simple games like minecraft with full ray tracing (the goal) come with 1/16th the render distance of default and 1/35th the frame rate. They at least need to be able to run Minecraft at default render distance with minimal FPS loss and I'd call that just a baseline. More complicated games would require significantly more power.
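
For a sense of why 1-2 samples per pixel is such a constraint, here's a toy Monte Carlo sketch (a hypothetical scene, nothing to do with any real RTX pipeline): per-pixel noise shrinks only with the square root of the sample count, which is why low-spp ray tracing leans so hard on denoisers.

```python
import random

# Toy sketch, hypothetical scene: each "ray" returns a light value in
# [0, 1] with a true mean of 0.5. Error shrinks roughly as 1/sqrt(spp),
# so going from 2 spp to 128 spp only cuts the noise ~8x.
def estimate_pixel(spp):
    return sum(random.random() for _ in range(spp)) / spp

for spp in (2, 8, 32, 128):
    errors = [abs(estimate_pixel(spp) - 0.5) for _ in range(10_000)]
    print(f"{spp:3d} spp: mean abs error {sum(errors) / len(errors):.3f}")
```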
 
1) "Can hit" is with an asterisk. Many games are not hitting 60 FPS with RTX.
2) That's still only with an extremely limited set of RTX effects (1 or 2) at 2 samples per pixel.
3) The performance hit is still massive. sub 60 FPS in many games is not impressive when you were previously getting more than 144 FPS. I would simply not play any shooter under 144 FPS.
4) You are stilling talking about the flagship $700 card here (don't give me that 3080 Ti look, Nvidia called it a flagship itself). This is a best case scenario. RTX won't become mainstream until more people actually get the tech.

As I pointed out earlier, simple games like minecraft with full ray tracing (the goal) come with 1/16th the render distance of default and 1/35th the frame rate. They at least need to be able to run Minecraft at default render distance with minimal FPS loss and I'd call that just a baseline. More complicated games would require significantly more power.
I see you're talking about ray tracing in Minecraft, so it raises the question: is ray tracing even worth having? "Fake" shadows and lighting are barely distinguishable from ray-traced ones, if at all, especially when you're moving and focused on actually playing the game. Games like Minecraft and Quake II don't magically become realistic with Nvidia's ray tracing; it's just a gimmick, and they could pull all of that off, the shadows, lighting, and everything else, without ray tracing at all. IMO it is safe to say ray tracing doesn't noticeably improve graphics, because titles with realistic graphics are already good enough, which makes ray tracing pointless; it's just reinventing the wheel, and in the end you get the same result but with less FPS. I'm not just talking about Nvidia, but ray tracing in general. If or when it gets implemented in every game, will we even see the difference between that and whatever we have now? I think sacrificing performance for nothing is ridiculous; some may disagree.
 
I see you're talking about ray tracing in Minecraft, so it raises the question: is ray tracing even worth having? "Fake" shadows and lighting are barely distinguishable from ray-traced ones, if at all, especially when you're moving and focused on actually playing the game. Games like Minecraft and Quake II don't magically become realistic with Nvidia's ray tracing; it's just a gimmick, and they could pull all of that off, the shadows, lighting, and everything else, without ray tracing at all. IMO it is safe to say ray tracing doesn't noticeably improve graphics, because titles with realistic graphics are already good enough, which makes ray tracing pointless; it's just reinventing the wheel, and in the end you get the same result but with less FPS. I'm not just talking about Nvidia, but ray tracing in general. If or when it gets implemented in every game, will we even see the difference between that and whatever we have now? I think sacrificing performance for nothing is ridiculous; some may disagree.

That's a tough question to answer. You can definitely do many RT-like effects with regular rasterization, as Gamers Nexus has demonstrated in the past. The question then becomes: which has better quality, and which is easier for devs to implement? I'm not a dev, so I can't really answer those questions. What I can say is that in its current state it certainly isn't satisfactory performance-wise, and for me personally the only effect I'm interested in is global illumination, specifically how light bounces off surfaces. RT shadows don't really provide a visual improvement IMO, and RT-like reflections are already being added to game engines without the performance impact of ray tracing. I had originally hoped that we'd see very large RT performance gains so that we could get something like subsurface scattering, but I'm not sure we'll get there anytime remotely soon with only small performance gains each generation.
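
To put the cost of that bounced light in perspective, here's a purely illustrative ray-count estimate (my hypothetical numbers, not any particular engine or game): ray counts scale with resolution, samples per pixel, and bounce depth, which is why GI is where the performance goes.

```python
# Purely illustrative numbers, not any particular engine: how quickly
# ray counts add up once you trace bounced light for global illumination.
width, height = 3840, 2160   # 4K frame
spp = 2                      # samples per pixel
target_fps = 60

primary_rays = width * height * spp
for bounces in range(4):
    # one secondary ray per bounce per path (no branching)
    rays_per_frame = primary_rays * (1 + bounces)
    print(f"{bounces} bounce(s): {rays_per_frame / 1e6:6.1f}M rays/frame, "
          f"{rays_per_frame * target_fps / 1e9:4.1f}G rays/s at {target_fps} fps")
```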
 
You and me both. The extra benefit is you let everyone else chase out all the bugs first, too; let someone else get the cards with bad voltage regulators or defective data buses. I'll pick one up in a year or two, after the MSRP has been slashed once or twice and infant mortality rates have plummeted.
It's funny, you know, because I actually saw Linus say something genuinely wise on one of his WAN shows (I know, wisdom from Linus, eh? LOL). He said he was always of the mind that a high-end card of the previous generation was generally the best way to go when it came to value, and I'm of the same mind. I just got my RX 5700 XT about 2 months ago, and before that I was using an R9 Fury, because despite being five years old it still delivered objectively good gaming performance and there was nothing out there that outperformed it for less than $800 CAD, so I just kept on truckin'! LOL
 
Just an update:


I take back my original statement giving kudos to Nvidia, given what has happened in the two weeks since I made it. Nvidia has continued to allow scalpers to buy up stock (over 2,227 cards sold at massively inflated prices in the US alone to date) and has stated that availability will be scarce until 2021.

Nvidia has failed to keep its word in regards to both increasing stock and blocking scalpers. In fact, it seems the exact opposite of what it promised is happening, and it seems to be rather happy with the situation as well.
 