Nvidia 'Ampere Next' GPUs are set to arrive in 2022, 'Ampere Next Next' will follow in...

The sad part is that I'm not even sure they'll get regular Ampere back to normal availability and pricing before these 2022 cards arrive, unless that launch lands basically at the end of 2022.

Not sure why Nvidia wants to just have an effectively missing generation and now seems to be rushing these cards out the door, but we'll have to see what they deliver.

Missing generation? They're selling more than twice what they did with Turing. They're having their best year ever. The reason you can't find the cards easily is that they are selling, not that they don't exist.
 
When one launches that I can actually purchase at MSRP, that will be my next GPU. I'm almost at the point where the window for buying a 30-series Nvidia GPU has come and gone. Purchasing an RTX 3080 Ti is my one remaining hope this generation. Then it will be hoping they don't screw up the 40-series launch like they did the 30-series launch.
 
What I do not understand is why there isn't a nice 75 watt ITX GPU for ordinary use... 128 bit 6GB capable of 60FPS in high graphics mode at 1080p. All this hoopla about 20GB 512bit 200FPS in 4k at 144Hz - for my gaming nickel it is just silly.

Seems almost as if we are getting out onto an extreme branch and leaving the stage open for Yeston or the like to arrive with a $100 card built on 3-year-old tech.

I think the Emperor has new clothes.
 
Scalpers and miners actively plan and improve their methods of acquisition all the time. These cards don't even exist yet, but they're already as good as gone. As for gamers, Nvidia doesn't care about them anymore and never will again.

Not entirely true. If all they cared about were miners, they'd refocus all the current-gen GPUs onto cards that don't need brackets, video-output circuitry on the board, or display ports at all.

They'd fit the cheapest, fastest, loudest fans possible for better clock rates while hashing, up the GPU voltage, and put them straight on eBay to let everyone bid at will.

But they are not. The reality, as Linus pointed out, is that this is all about production scheduling, panic-buying of systems to work from home, AND mining.

 
"It's too early to predict things like performance, energy efficiency, or cooling capabilities"

No? It's super simple: HORRID, OFFENSIVE and INADMISSIBLE. There.
Reviews? Glowing with prai$e. Stock? NONE. (Saving this post.)
 
What I do not understand is why there isn't a nice 75 watt ITX GPU for ordinary use... 128 bit 6GB capable of 60FPS in high graphics mode at 1080p. All this hoopla about 20GB 512bit 200FPS in 4k at 144Hz - for my gaming nickel it is just silly.

Seems almost as if we are getting out onto an extreme branch and leaving the stage open for Yeston or the like to arrive with a $100 card built on 3-year-old tech.

I think the Emperor has new clothes.
The problem is the games themselves. My now 4-year-old 1060 6GB plays retro WoW like a tiny little god. But as soon as I play BF V with the eye candy turned on, it's (pun intended) game over due to texture sizes.

It has little to do with the cards, and everything to do with people expecting the next "big game" to have more textures, bigger textures, ray tracing, streaming codecs built in, etc.

If every computer had the same video card, your proposed card would work. But we live in a world where game devs build the flashiest crap they can and wait for card devs to catch up and build the next ginormous card.
 
The problem is the games themselves. My now 4-year-old 1060 6GB plays retro WoW like a tiny little god. But as soon as I play BF V with the eye candy turned on, it's (pun intended) game over due to texture sizes.

It has little to do with the cards, and everything to do with people expecting the next "big game" to have more textures, bigger textures, ray tracing, streaming codecs built in, etc.

If every computer had the same video card, your proposed card would work. But we live in a world where game devs build the flashiest crap they can and wait for card devs to catch up and build the next ginormous card.

It was never like that before, and still isn't for the most part. Is ray tracing mainstream? Absolutely not; it's added in through patches. We need a few more years before ray tracing goes mainstream and is part of game development from the ground up.
 
I live where some folks take their flashy ride to the corner store. I think that I'll stick with my bike. Currently playing Witcher 3, Fallout 4, Wasteland 2, Far Cry 5, Metro Redux and Hellblade on my 10 year old Dell (6-core, 12 thread) with my 4 year old GTX 1050 Ti. I'm not sure I would appreciate the eye-candy enough to spend a k-buck, but I do rather wish I had a 6GB 1060.
 
Problem is... even if you are rich and dumb enough to get these next, next cards, the state of current games will make you sick anyway.
Yeah, between this and gamers calling for an entire class of technology to be banned so they can clock 120 FPS in Call of Duty [CURRENT_YEAR_RELEASE] (while pretending it's because they care so much about the environment), I'm getting tired of gaming and gamers as a whole.
 
I live where some folks take their flashy ride to the corner store. I think that I'll stick with my bike. Currently playing Witcher 3, Fallout 4, Wasteland 2, Far Cry 5, Metro Redux and Hellblade on my 10 year old Dell (6-core, 12 thread) with my 4 year old GTX 1050 Ti. I'm not sure I would appreciate the eye-candy enough to spend a k-buck, but I do rather wish I had a 6GB 1060.
No problem if you want to play like that, but other gamers prefer higher resolutions, higher framerates, and more eye candy.
 
What I do not understand is why there isn't a nice 75 watt ITX GPU for ordinary use... 128 bit 6GB capable of 60FPS in high graphics mode at 1080p. All this hoopla about 20GB 512bit 200FPS in 4k at 144Hz - for my gaming nickel it is just silly.

Seems almost as if we are getting out onto an extreme branch and leaving the stage open for Yeston or the like to arrive with a $100 card built on 3-year-old tech.

I think the Emperor has new clothes.


because this
 
What I do not understand is why there isn't a nice 75 watt ITX GPU for ordinary use... 128 bit 6GB capable of 60FPS in high graphics mode at 1080p.
With a 75 W hard power cap, 6 GB of GDDR6 would leave a budget of roughly 57 to 63 W for the GPU, as each DRAM module requires around 2 to 3 W depending on speed and voltage. The likes of the GTX 1650 manage on 75 W, though that card only has 4 GB of 12 Gbps memory, which leaves around 67 W for the processor. It performs okay at high settings at 1080p, but if you start reducing the available power, no matter how small the decrease, performance will obviously suffer.
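
To make that budget concrete, here's a rough back-of-the-envelope sketch in Python. The 75 W slot limit and the 2-3 W per GDDR6 module are the figures assumed above (with the 6 GB case treated as six 1 GB modules), not official specifications:

```python
# Rough power-budget sketch for a slot-powered (75 W) card.
# Assumptions taken from the discussion above, not from vendor datasheets:
#   - the PCIe slot delivers at most 75 W of total board power
#   - each GDDR6 module draws roughly 2-3 W depending on speed and voltage

SLOT_LIMIT_W = 75

def gpu_budget_w(memory_modules: int, watts_per_module: float) -> float:
    """Watts left for the GPU core after the memory takes its share."""
    return SLOT_LIMIT_W - memory_modules * watts_per_module

# Hypothetical 6 GB card treated as six 1 GB modules: 57-63 W for the GPU
print(gpu_budget_w(6, 3), gpu_budget_w(6, 2))   # 57 63

# GTX 1650-style 4 GB card (four modules at ~2 W): about 67 W for the GPU
print(gpu_budget_w(4, 2))                       # 67
```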

A 128-bit memory bus would normally only allow for either 4 or 8 GB of memory. To get 6 GB on such a bus, two of the memory modules would have to be 2 GB, with the other two being 1 GB. While not impossible, that's a less than ideal arrangement, although it would free up a little more power for the GPU.
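
For illustration, a 128-bit bus is typically populated as four 32-bit channels, so with 1 GB and 2 GB GDDR6 modules the possible capacities fall out as sketched below (just the arithmetic, not a claim about any specific card):

```python
# Capacities achievable on a 128-bit bus built from four 32-bit channels,
# using 1 GB and/or 2 GB GDDR6 modules (8 Gb / 16 Gb densities).
layouts = {
    "4 x 1 GB": [1, 1, 1, 1],             # 4 GB, uniform
    "4 x 2 GB": [2, 2, 2, 2],             # 8 GB, uniform
    "2 x 2 GB + 2 x 1 GB": [2, 2, 1, 1],  # 6 GB, mixed module sizes
}
for name, modules in layouts.items():
    print(f"{name}: {sum(modules)} GB across {len(modules)} channels")
```

The mixed 6 GB layout is the workable-but-awkward option described above.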

Tight power limits are part of the reason why integrated GPUs in CPUs are fairly weak (even though they take up the bulk of the processor's die area).
 
A 128-bit memory bus would normally only allow for either 4 or 8 GB of memory.


So I guess the optimal card would be a 1650 with a 128-bit bus and 8 GB, running at about 90-100 watts with an extra power connector; too bad the slot is limited to 75 watts.
 