8K.... seriously? Even the Titan and Ti cards are struggling to maintain 60fps 4K maxed out on powerful PCs.
And on another note, it's amazing to see consoles continue using AMD-based CPUs and GPUs while, on the PC side, its rivals are stealing the show in benchmarks, efficiency, and heat output.
**shrugs**
A lot of you don't understand the difference between output resolution and rendering resolution. PS5 can output at 8K; doesn't mean it's going to render games at 8K. Similar to how the Xbox One S can do 4K Netflix and HDR, but still renders games at 1080p or whatever setting the developer has selected. Stop making a big deal out of nothing.
And why wouldn't Sony use AMD again? The PS4 was a smashing success (something like 90 million units sold), and unlike nVidia, AMD has proven it's willing to do semi-custom work for its partners, team up with Sony engineers to deliver what the PlayStation team wants, and not charge an arm and a leg in licensing or royalty fees. There's a reason NONE of the consoles use Intel CPUs or nVidia GPUs (yes, the Switch uses a Tegra, but that's basically an off-the-shelf B2B sale with little to no customization done for Nintendo).
Nerds like us love to geek out over technical specs, but business partnerships hinge on relationships between people and parties, not the fastest clock speeds or the snazziest tensor cores.