John Carmack makes the case for future GPUs working without a CPU

Alfonso Maruccia

Forward-looking: John Carmack, the visionary often credited with revolutionizing the first-person shooter genre, is not one to hold back. Known for straightforward opinions and bold predictions about the future of PC technology, he may have just delivered his most audacious take yet.

John Carmack envisions a future where GPUs could function independently of host CPUs. The legendary programmer behind PC gaming milestones like Commander Keen, Doom, and Quake believes modern GPUs are becoming so powerful and versatile that they could effectively serve as all-in-one "PCs" from the user's perspective.

Carmack shared his unconventional "GPUs as PCs" concept on X, nostalgically reflecting on the glory days of GPU chains during the Voodoo era. Back when Voodoo2 graphics cards reigned as the most powerful "3D accelerators," tech-savvy gamers could link two cards using a simple ribbon cable to significantly boost game performance.

With just a ribbon cable, you could double the pixel rate, Carmack noted. He recalled how friends would host hardware parties, combining their 3D cards to enjoy a faster, smoother gaming experience. "Play Quake 2 at 1,280 x 1,024 120 Hz with 4xAA in 1998. If the cards had vertex transform, you could scale out for motion blur and stereo/VR multi-view rendering," Carmack added.

Modern rendering engines in games rely heavily on the render-to-texture process, which isn't well-suited for multiple daisy-chained GPUs. However, Carmack suggested an alternative: GPUs could be arranged in a "ring" topology and enhanced with explicit transfer operations, enabling both 3D rendering and machine learning frameworks to fully exploit the potential of this novel hardware setup.
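Carmack didn't spell out an implementation, but the ring idea maps loosely onto the explicit peer-to-peer copy APIs that GPUs already expose. Below is a minimal, hypothetical CUDA sketch of devices passing buffers around a ring; the device count, the 1 MiB buffer size, and the single ring pass are illustrative assumptions, not anything from Carmack's post.

```cpp
// Hypothetical ring-topology sketch: each GPU forwards a buffer to its
// neighbor via an explicit peer-to-peer copy. Buffer size and the single
// ring pass are illustrative assumptions.
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

int main() {
    int n = 0;
    cudaGetDeviceCount(&n);
    if (n < 2) { printf("need at least two GPUs for a ring\n"); return 0; }

    const size_t bytes = 1 << 20;        // 1 MiB per hop, arbitrary
    std::vector<void*> buf(n, nullptr);

    for (int i = 0; i < n; ++i) {
        cudaSetDevice(i);
        cudaMalloc(&buf[i], bytes);
        // Allow device i to access memory on its ring neighbor (i+1) % n.
        // May fail on hardware without peer-to-peer support.
        cudaDeviceEnablePeerAccess((i + 1) % n, 0);
    }

    // One explicit "transfer op" around the ring: i -> (i+1) % n.
    for (int i = 0; i < n; ++i) {
        int next = (i + 1) % n;
        cudaMemcpyPeer(buf[next], next, buf[i], i, bytes);
    }
    cudaDeviceSynchronize();
    printf("completed one ring pass across %d GPUs\n", n);
    return 0;
}
```

In a real renderer or machine learning stack the copies would be overlapped with compute on separate streams, but the application-controlled transfer step is the essential ingredient of the topology Carmack describes.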

The former id Software mastermind believes today's GPUs could eventually operate entirely without host CPUs, provided they have a "private link." According to Carmack, resourceful (and exceptionally wealthy) users could construct powerful accelerator chains. In such setups, GPUs would generate their own video signal with diagnostic information and receive direct power input, bypassing the need for a traditional host PC system.

These standalone GPUs could potentially run a "tiny" Linux operating system onboard, enabling full computing independence. Input peripherals like mice and keyboards could be managed through a DisplayPort link, offering functionality even in the absence of a USB port.

Carmack's vision of "computing" GPUs isn't entirely new. In fact, developers have previously experimented – successfully, in part – with running the original Doom game's code directly on a GPU instead of relying on a CPU.

Carmack, who spent nearly a decade at Facebook (now Meta) attempting to turn the metaverse concept into reality, left the company in late 2022 to pursue other ventures. More recently, he predicted a significant breakthrough in artificial general intelligence by 2030, continuing his trend of bold, forward-thinking ideas.

Image credit: Drew Campbell

Only one way to know if John is onto something here -- build a prototype and see if Crysis will finally run with all the eye candy and no stuttering.
 
Multi-GPU setups are definitely a forgotten technology for regular consumers today (for good reasons). If a new way of doing it can make them viable again, then I think it's a path worth pursuing.
 

For rendering you always had the problem where the secondary GPU needed to copy the scene out of VRAM, which was slow. Great for FPS, but the horrid latency resulted in frequent microstuttering.

The fact that DX12/Vulkan moved multi-GPU management out of the GPU driver and into the application layer basically killed multi-GPU for gaming anyway; while combining multiple different GPU models is architecturally sound, dealing with all the potential failure modes is just not worth the effort for a developer.
 
Or...
On the flip side, there are heterogeneous APUs like the ones AMD is producing.

By this time next year, nearly any casual gamer will be able to buy a NUC powered by AMD's upcoming APUs (due to be unveiled in a few months' time at CES).

Again, in the near future only FPS gamers will really need a standalone graphics card.
 
Could it? No reason why not, especially if you throw enough money at the problem. Will it? I highly doubt it. You have to remember why we have USB: FireWire was the accepted standard and much better than the slightly later USB 1.0. One problem: it didn't need a CPU to operate. So Intel backed USB over it, and FireWire withered on the vine. I can see the same thing happening again.
 
That's interesting, but I think part of the problem was also that FireWire was a security nightmare.
 