RTX 3080 spotted in database: 10GB of GDDR6X, 2.1GHz maximum clock speed

Good chance there won't be a 3080 Ti at launch if they're going with the 3090 branding.
The latest from the Moore's Law Is Dead YouTube channel is that the 3080 Ti will be held in reserve in case AMD outdo themselves. The 3080 and 3090 will be the two top cards, costing $999 and $1,999! I can't quite remember whether or how a Titan fit into this, or if the 3090 would even be considered the Titan. I had planned to get the 3090 at up to $1,400. At $1,999, no chance.
 
Really? :confused:
I'm not playing at 4K, but I've never seen any game asking for more than 8 GB of VRAM in its requirements...

For a few years now, actually. Mirror's Edge Catalyst, for example, was released in 2016 and used close to 12 GB of VRAM at 4K with max settings.

Nvidia went from 6 GB to 11 GB from the 980 Ti to the 1080 Ti, an 83% increase. The 2080 Ti has the same 11 GB. Going from 8 GB on the 1080 to 10 GB on the 3080 is a very small jump in comparison, just 25%, and that's over two generations, not one.

It makes sense from Nvidia's standpoint, though. VRAM won't make an immediate performance impact in games right now, since you have to be seriously short on VRAM before it matters, but when you are, the impact on your game is severe. It's planned obsolescence. Of course, Turing was the same way: zero increase in RAM sizes plus ridiculously underpowered RTX hardware. It's a win-win for Nvidia, and the majority of PC gamers won't care, as they seem to be completely fine with planned obsolescence. Giving just enough RAM for current games saves Nvidia money and power consumption, and has the bonus of people upgrading more frequently.

Honestly, if I were AMD I would skimp on the VRAM as well. Reviews base their conclusions on the now, and most people buy for now. Reviewers and users don't seem to care if products are pushed to 100%, or consider the ramifications of that in the near future. Consumerist culture to a T.
 
Just the fact that Ampere will feature GDDR6X proves there's definite truth to these leaks. No one even suspected GDDR6X was in development.

Knowing Nvidia's efficiency motto, if the 3080 and 3090 indeed have 23% and 63% higher memory bandwidth than the 2080 Ti, I would think their respective performance relative to the 2080 Ti won't be far from those figures.
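
Quick back-of-the-envelope check on where those percentages come from. The per-pin speeds and bus widths below for the 30 series are the rumored specs, not confirmed numbers, so treat this as a sketch:

```cpp
#include <cstdio>

// Memory bandwidth in GB/s = (per-pin data rate in Gbps) * (bus width in bits) / 8.
static double bandwidth_gbs(double gbps_per_pin, int bus_bits)
{
    return gbps_per_pin * bus_bits / 8.0;
}

int main()
{
    // 2080 Ti: 14 Gbps GDDR6 on a 352-bit bus (known).
    // 3080:    19 Gbps GDDR6X on a 320-bit bus (rumored, 10 GB).
    // 3090:    21 Gbps GDDR6X on a 384-bit bus (rumored).
    const double bw_2080ti = bandwidth_gbs(14.0, 352); //  616 GB/s
    const double bw_3080   = bandwidth_gbs(19.0, 320); //  760 GB/s
    const double bw_3090   = bandwidth_gbs(21.0, 384); // 1008 GB/s

    std::printf("3080 vs 2080 Ti: +%.1f%%\n", (bw_3080 / bw_2080ti - 1.0) * 100.0); // +23.4%
    std::printf("3090 vs 2080 Ti: +%.1f%%\n", (bw_3090 / bw_2080ti - 1.0) * 100.0); // +63.6%
    return 0;
}
```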

Now comes the super hard-to-swallow pill: if the 3090 does cost $1,700-2,000, who can afford it? :D
 
For a few years now, actually. Mirror's Edge Catalyst, for example, was released in 2016 and used close to 12 GB of VRAM at 4K with max settings. [...]

Will the upper VRAM requirements for most mainstream games over the next few years not be limited to whatever VRAM the consoles have in their shared memory pool? They are touting 4K, so textures and other assets will be similar in size, if not the same, between the PC and console versions.

I've read this should be around 10GB, which seems to match the 30 series.
 
[...] Nvidia went from 6 GB to 11 GB from the 980 Ti to 1080 Ti. The 2080 Ti has the same 11 GB. Going from 8 GB on the 1080 to 10 GB on the 2080 is a very small jump in comparison and that's over two generations, not one. [...]
The 2080 has 8 GB of VRAM, not 10, if I read your comment correctly.
 
Will the upper VRAM requirements for most mainstream games over the next few years not be limited to whatever VRAM the consoles have in their shared memory pool? [...]
I won't count on it. The PS4 had only 8 GB of total RAM/VRAM, but HZD on PC could still easily consume around 8 GB of VRAM alone, even at 1080p. At 4K it could easily reach 9 GB of VRAM.

Now consider that the PS5 has double the memory of the PS4 (i.e. 16 GB), plus high-speed storage further easing the load. So I'm highly doubtful that a 10 GB card will be able to hold the line even a year from now.
 
[...] Now consider that the PS5 has double the memory of the PS4 (i.e. 16 GB), plus high-speed storage further easing the load. So I'm highly doubtful that a 10 GB card will be able to hold the line even a year from now.

Good point on the high-speed storage, I hadn't factored that in. With developers knowing 10GB is the upper limit on consumer performance hardware, would they deploy compression techniques to ensure games can run on current and near-future hardware without performance issues?
 
Good point on the high-speed storage, I hadn't factored that in. With developers knowing 10GB is the upper limit on consumer performance hardware, would they deploy compression techniques to ensure games can run on current and near-future hardware without performance issues?
Sure, the games will run perfectly fine for years with some concessions. The RTX 3080 is undoubtedly a very powerful card, and even 3-4 years down the line, simply bringing a couple of texture settings down from Ultra to Very High will be more than enough to ensure a great gaming experience, especially with DLSS 2.
 
The 2080 has 8 GB of VRAM, not 10, if I read your comment correctly.

Typo. I meant the 3080, which, as this article discusses, has 10GB.

Will the upper VRAM requirements for most mainstream games over the next few years not be limited to whatever VRAM the consoles have in their shared memory pool? They are touting 4K, so textures and other assets will be similar in size, if not the same, between the PC and console versions.

I've read this should be around 10GB, which seems to match the 30 series.

No. Console VRAM limitations have no bearing on how much VRAM max settings on the PC version of a game will consume. Never have and never will. Plenty of games released this gen exceed the current consoles' total available memory, without even factoring in the memory reserved for the OS. Think about it: most PCs have 16GB of RAM just for main system memory, while the PS4 has 8GB of memory total. Add the video card's memory and a PC has around triple the memory. You wouldn't want to run a PC with as little memory as a console, either; it would have stuttering issues. 4GB of main system memory and 4GB of VRAM would run most games extremely poorly on PC. Next-gen consoles will have 16GB of total memory. Better than last gen, but still much less than even midrange PCs.

Next-gen consoles don't have enough power to run 4K natively. Either they're going to be adding fake frames at 4K or using something similar to DLSS/CAS, or the FPS is going to be poor. Even if the consoles could run 4K without any of that, the resolution a game runs at isn't an indicator of texture quality. With more VRAM you can use less compression or higher-resolution textures. Skyrim has had 8K textures for a long time now, as an example.
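
To put rough numbers on the texture point, here's a quick sketch of what a single texture costs in VRAM. The 4 bytes/texel (RGBA8) and ~1 byte/texel (BC7) figures are standard, but real games vary with formats and mip chains:

```cpp
#include <cstdio>

int main()
{
    // VRAM cost of one texture = width * height * bytes per texel.
    // A full mip chain adds roughly another third on top of these figures.
    const double texels_4k = 4096.0 * 4096.0;
    const double texels_8k = 8192.0 * 8192.0;
    const double mib = 1024.0 * 1024.0;

    std::printf("4K RGBA8 uncompressed: %4.0f MiB\n", texels_4k * 4.0 / mib); //  64 MiB
    std::printf("8K RGBA8 uncompressed: %4.0f MiB\n", texels_8k * 4.0 / mib); // 256 MiB
    std::printf("8K BC7 (~1 B/texel):   %4.0f MiB\n", texels_8k * 1.0 / mib); //  64 MiB
    return 0;
}
```

So stepping textures up from 4K to 8K quadruples the memory cost per texture, which is why texture resolution, not render resolution, is what really eats VRAM.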
 
I'm in the market for a 3080 or a 3090, but the prices make game consoles look like a much better idea. I hate Nvidia.

No one would blame you for getting a console if graphics card pricing stays this high. When a GPU, a single one of the 7 parts a PC needs, costs as much as an entire system, the console, and both deliver equal performance, you know GPU pricing is wack.
 
Next-gen consoles don't have enough power to run 4K natively. Either they're going to be adding fake frames at 4K or using something similar to DLSS/CAS, or the FPS is going to be poor.
Microsoft themselves are promoting variable rate shading as one of the required solutions:

[Slide: XSX-12.jpg]

[Slide: XSX-13.jpg]

You can see the rest of the slides in the link below, but you'll notice that anything resembling temporal upscaling isn't mentioned at all:


Edit: I'm blind! I missed this slide:

[Slide: XSX-19.jpg]
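
For anyone curious what VRS actually looks like on the developer side, here's a minimal sketch of Tier 1 usage in D3D12, assuming a device and command list that report VRS support (the function name and usage pattern here are just illustrative):

```cpp
#include <d3d12.h>

// Tier 1 VRS: one coarse shading rate applied to everything drawn afterwards.
// At 2x2, the pixel shader runs once per 2x2 pixel block, roughly quartering
// shading work for those draws at the cost of some fine detail.
void DrawDistantGeometryCoarsely(ID3D12GraphicsCommandList5* cmdList)
{
    const D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH, // keep the base rate as-is
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH  // ignore per-primitive/image rates (Tier 2)
    };
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, combiners);

    // ... issue draw calls here ...

    // Restore full-rate shading for everything else.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
}
```

Tier 2 hardware goes further and lets you vary the rate per primitive or via a screen-space image, which is how you'd shade, say, motion-blurred or peripheral regions more coarsely.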
 
No one would blame you for getting a console if graphics card pricing stays this high. When a GPU, a single one of the 7 parts a PC needs, costs as much as an entire system, the console, and both deliver equal performance, you know GPU pricing is wack.

Nvidia gotta fund that ARM acquisition somehow! It honestly disgusts me; I'd call their pricing predatory.
 
Makes me wonder if the joke is on us.

Because it is. Nvidia is very good at marketing and at training people to accept higher and higher prices for less extra performance. Just look at the RTX 2080 Ti, a card that should never have been accepted by the community, and yet there are people who praise it for its performance.
 
If the price rumors are true... LOL. Then Nvidia is in for a surprise. What with COVID-19 and the concurrent economic downturn, I see a LOT of people just picking up the Xbox Series X or PS5 come December... Hopefully AMD can show up in time for the party. Good times.
 