Why GPUs are the New Kings of Cache. Explained.

Wouldn't 3D stacking, as used in HBM, be a way to solve the problem of caches no longer shrinking as process nodes advance?
 
Hasn't NVIDIA started doing something similar with some of their enterprise line, with the CPU and GPU sharing a massive pool of VRAM/cache? Like AMD's approach, but overboosted?
Is it possible that, at some point, a CPU with integrated RAM shared with the GPU starts to reduce the need for these intermediate cache sizes?
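(That sounds like NVIDIA's Grace Hopper, where a Grace CPU and a Hopper GPU share a coherent address space over NVLink-C2C.) For what it's worth, you can already program against that shared-pool model today: CUDA's managed memory gives the CPU and GPU one allocation, with no explicit copies between system RAM and VRAM. A minimal sketch, assuming any CUDA-capable device; the array size and the increment kernel are just placeholders:

    #include <cstdio>
    #include <cuda_runtime.h>

    // GPU kernel: bump every element in place.
    __global__ void increment(float *data, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) data[i] += 1.0f;
    }

    int main() {
        const int n = 1 << 20;
        float *data = nullptr;

        // One allocation visible to both CPU and GPU -- no explicit
        // cudaMemcpy between "system RAM" and "VRAM".
        cudaMallocManaged(&data, n * sizeof(float));

        for (int i = 0; i < n; i++) data[i] = 1.0f;    // CPU writes

        increment<<<(n + 255) / 256, 256>>>(data, n);  // GPU updates the same pool
        cudaDeviceSynchronize();                       // finish before the CPU reads

        printf("data[0] = %.1f\n", data[0]);           // prints 2.0
        cudaFree(data);
        return 0;
    }

On a discrete card the driver migrates pages behind the scenes; on a part with physically shared DRAM there's nothing to migrate at all.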
 
I'm wondering if we'll eventually just see a general-purpose APU that combines all of these, much like a console's unified memory. Right now it doesn't look great, but there are experiments that substitute an expensive GPU with an APU fed a bunch of DRAM that gets allocated as VRAM.
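You can actually see that arrangement from software: CUDA's device properties report whether a GPU is an integrated part whose "VRAM" is really a slice of host DRAM. A quick sketch (device index 0 is an assumption):

    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, 0);  // assumes device 0

        // On an APU-style part the "VRAM" below is really a slice of
        // system DRAM, and the integrated flag is set.
        printf("device: %s\n", prop.name);
        printf("integrated (shares host DRAM): %d\n", prop.integrated);
        printf("memory visible to the GPU: %.1f GiB\n",
               prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
        return 0;
    }

On integrated parts (e.g., NVIDIA's Jetson boards) the flag reads 1; discrete cards report 0.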

It's starting to feel like the APU could become one giant SoC at some point, with CPU, GPU, and RAM integrated and working closely together. And that's rather appealing, given that you could just grab a mobo, an APU, and a PSU, and done: that's a computer, and it works great...

...but I bet they'll price it at something stupid, and that'll make replacing the APU really annoying.
 
Which raises a tangential question: with more graphics cards sporting larger GDDR6/GDDR6X memory on the order of 8-24 GB, GDDR prices must have become somewhat more "palatable" than before. Why hasn't anyone attempted to replace the ordinary DRAM on the CPU side with the "G" (graphics) variant?
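The usual answers are latency and topology: GDDR is tuned for bandwidth over latency and has to be soldered point-to-point right next to the memory controller, so you give up DIMM slots and expandability. Consoles accept that trade (the PS5 uses a unified 16 GB GDDR6 pool); desktop CPUs, which live and die on dependent-load latency, don't. A hedged host-side sketch of the classic pointer-chase that measures exactly that latency (sizes are arbitrary; the array just needs to overflow the caches):

    #include <chrono>
    #include <cstdio>
    #include <numeric>
    #include <random>
    #include <utility>
    #include <vector>

    // Pointer-chase: every load's address depends on the previous load,
    // so the core cannot overlap requests. This measures the raw memory
    // latency that CPUs care about and that GDDR trades away for bandwidth.
    int main() {
        const size_t n = 1 << 24;  // 16M entries (~128 MiB), overflows caches
        std::vector<size_t> next(n);
        std::iota(next.begin(), next.end(), 0);

        // Sattolo's algorithm: a random permutation with one big cycle,
        // so the chase visits every entry before repeating.
        std::mt19937_64 rng{42};
        for (size_t i = n - 1; i > 0; i--) {
            std::uniform_int_distribution<size_t> pick(0, i - 1);
            std::swap(next[i], next[pick(rng)]);
        }

        size_t idx = 0;
        const size_t steps = 1 << 24;
        auto t0 = std::chrono::steady_clock::now();
        for (size_t s = 0; s < steps; s++) idx = next[idx];  // dependent chain
        auto t1 = std::chrono::steady_clock::now();

        double ns = std::chrono::duration<double, std::nano>(t1 - t0).count();
        printf("avg dependent-load latency: %.1f ns (idx=%zu)\n", ns / steps, idx);
        return 0;
    }

The number it prints is the round-trip time to DRAM with no overlap to hide it, which is the figure where DDR still beats GDDR.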
 