John Carmack suggests the world could run on older hardware – if we optimized software better

Alfonso Maruccia

In context: Google researcher and reverse engineer "LaurieWired" recently posed a thought-provoking thread on X: What would happen after a CPU manufacturing apocalypse? How would the tech world respond to a future without newer, faster processors? Programming and optimization legend John Carmack offered an equally compelling answer.

LaurieWired proposes the idea of a "Zero Tape-out Day" (Z-Day), an event causing manufacturers to stop producing new silicon designs. Considering the existing supply, the researcher predicts skyrocketing computer prices, stalled cloud capacity, and a ticking clock on electromigration slowly degrading the most advanced chips built on smaller nodes – all within the first year after Z-Day.

Conditions would deteriorate even further in the following years, with a booming black market for processors and Xeon CPUs valued more than gold. Computing technology could regress by decades as older systems built on larger nodes prove far more resilient to electromigration.

People would mod classic processors like the Motorola 68000 to operate for thousands of years without significant gate wear. More advanced systems – such as the iMac G3s sold between 1998 and 2003 – would become workstations for the elite, while the proles repurpose hardware from Game Boys, Macintosh SEs, and Commodore 64s.

LaurieWired suggests that 30 years after Z-Day, the world would become a dystopia where computing resembles the 1970s or 1980s. The modern internet would vanish, replaced by sneakernet data exchanges on SSDs and efforts to safeguard valuable desktop hardware from confiscation.

Former id Software developer John Carmack decided to weigh in on the thought experiment. Having created the Doom graphics engine in just 28 hours on "vintage hardware," he brought some hard-won perspective to the question. Carmack said that a significant part of the modern world could run on outdated hardware if software optimization were a priority for developers.

The god-tier coder suggests that developers could transition all interpreted, microservice-based products to monolithic, native codebases. Programmers would abandon modern development patterns and seek more efficient approaches, such as those used during earlier computing eras when there was no internet to push patches.

Such a paradigm reset would force post-apocalyptic coders to make ancient hardware hum through software optimization. Carmack also acknowledges that innovative new products would become much rarer without ultra-cheap and scalable computing.

While framed within the context of LaurieWired's thought experiment, Carmack's ideas hold practical relevance in today's computing landscape. For example, would Microsoft still impose strict hardware requirements if it prioritized optimizing Windows 11? It's a question worth considering. Similarly, how much could the gaming industry benefit from better optimization?

 
No duh? People have been calling out the laziness of software developers for eons now. Remember when EA decided it wasn't going to compress music files for "performance reasons" in Titanfall, a game made on the Source engine that ran on a potato? That was over a decade ago.

Also:
For example, would Microsoft still impose strict hardware requirements if it prioritized optimizing Windows 11?
Yes, because it has NOTHING to do with performance. Per Microsoft, an 8th-gen dual-core 3 GHz Pentium is perfectly fine to run 11, but a quad-core 4 GHz 7th-gen Core i7 isn't. Unless of course it's one of the four 7th-gen procs MS used in Surface devices, then suddenly it's perfectly fine. Why? Don't question your overseer!

Windows 11's "requirements" exist solely to push new hardware at best, or attempt to mandate hardware DRM locks via TPM at worst. It's also failed miserably so far.
 
No duh? People have been calling out the laziness of software developers for eons now.
I think much of the gains in computing performance today are being used to make programming easier and to boost productivity. If we still had to write everything in assembly, there probably wouldn't be nearly as many programmers or as much software. That said, if we're only seeing 5–10% performance improvements with each new generation of CPUs and GPUs, then software optimization is going to become increasingly important. Let’s hope we still have the talent and discipline to make that happen.
 
"The modern internet would vanish, replaced by sneakernet data exchanges on SSDs" - more likely data exchange would be on magnetic tape, as SSDs would be long gone by then.
 
Software optimizations have the potential to deliver performance improvements by factors of ten or more, whereas hardware advancements achieving similar gains within a decade would be highly unexpected. ;)
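A toy illustration of that kind of factor-of-ten-or-more gap (this is a hypothetical sketch, not from the article): the same algorithm, computing Fibonacci numbers, with and without memoization. The exponential-time version redoes the same work over and over; the cached version does each subproblem once.

```python
import time
from functools import lru_cache

def fib_naive(n):
    # Exponential-time recursion: recomputes the same subproblems repeatedly.
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    # Same algorithm with memoization: each subproblem is computed once.
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

n = 30

start = time.perf_counter()
slow = fib_naive(n)
naive_time = time.perf_counter() - start

start = time.perf_counter()
fast = fib_memo(n)
memo_time = time.perf_counter() - start

assert slow == fast == 832040  # identical result, wildly different cost
print(f"naive: {naive_time:.4f}s  memoized: {memo_time:.6f}s")
```

The point isn't this particular trick; it's that algorithmic changes routinely dwarf what a single hardware generation buys you.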
 
One example of bloatware that makes old machines obsolete faster is the modern web. There is so much crap on modern webpages that it takes a considerably more powerful CPU to render exactly the same amount of information, in a slightly fancier package.

We had interactive webpages with music, videos, games – the same content we consume now – 20 years ago, but try using a CPU of that vintage to browse now...
 
I've tested this on my old Pentium II. I can use Word, Excel, media player, and more just fine if I use era-correct versions.
 
I think much of the gains in computing performance today are being used to make programming easier and to boost productivity. If we still had to write everything in assembly, there probably wouldn't be nearly as many programmers or as much software. That said, if we're only seeing 5–10% performance improvements with each new generation of CPUs and GPUs, then software optimization is going to become increasingly important. Let’s hope we still have the talent and discipline to make that happen.
I'd be perfectly OK with less software that is of higher quality, rather than the deluge of absolute trash we have today.

Strangely, the lack of modern coding languages and lesser hardware did NOT stop companies and individuals from putting out metric TONS of software in the '80s and '90s. All the "productivity boosting" has resulted in is ever-inflating lead times, development delays, and wasted productivity on garbage like microtransaction stores or worthless DRM that doesn't work.
 
Well, yeah, but it would greatly increase development time.

It's probably the most visible in games considering some really old titles still look great and run great on modern hardware whilst some new titles run and look like crap but have silly hardware requirements.

I mean, ideally everything would be written in highly optimized assembly code. But no one has time for that, so we use high-level programming languages and massive libraries and don't bother optimizing until performance actually is an issue. That's the beauty of hardware getting more powerful: programmers can just write code, and it almost always works well enough without having to know all the ins and outs of the hardware and the OS.
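That "optimize only when it's an issue" workflow works best when paired with measurement. A minimal sketch (names and numbers are illustrative, not from the thread) using Python's built-in profiler to find a hot spot, then applying the idiomatic fix:

```python
import cProfile
import io
import pstats

def build_report(rows):
    # Deliberately naive: repeated string concatenation in a loop.
    out = ""
    for r in rows:
        out += f"{r}\n"
    return out

def build_report_fast(rows):
    # The idiomatic fix: str.join builds the result in a single pass.
    return "".join(f"{r}\n" for r in rows)

rows = list(range(50000))

# Profile first, optimize second: see where the time actually goes.
profiler = cProfile.Profile()
profiler.enable()
slow = build_report(rows)
profiler.disable()

fast = build_report_fast(rows)
assert slow == fast  # same output, different cost

stats = io.StringIO()
pstats.Stats(profiler, stream=stats).sort_stats("cumulative").print_stats(3)
print(stats.getvalue())
```

Measuring before rewriting is what keeps "optimize later" from turning into "never optimize": the profile tells you which 5% of the code is worth the assembly-era effort.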

It's a bit like cars and bicycles. Cars are huge, terrible for the environment and worse for your own health and wallet compared to a bicycle. But at the same time they do enable the average person to go just about anywhere and save time so cars in a lot of countries are more common than bicycles.
 
Duh! When storage expanded the way it did, the software developers said: why bother optimizing the code? It takes too much time and money.
 
Carmack's point hits hard: so much of our software today is bloated not because it has to be, but because we rely on ever-faster hardware to paper over bad design. The idea that we'd fall back to '70s/'80s hardware is fascinating, especially since most modern software would simply collapse without the luxury of today's silicon.
 
No duh? People have been calling out the laziness of software developers for eons now. Remember when EA decided it wasn't going to compress music files for "performance reasons" in Titanfall, a game made on the Source engine that ran on a potato? That was over a decade ago.

Also:

Yes, because it has NOTHING to do with performance. Per Microsoft, an 8th-gen dual-core 3 GHz Pentium is perfectly fine to run 11, but a quad-core 4 GHz 7th-gen Core i7 isn't. Unless of course it's one of the four 7th-gen procs MS used in Surface devices, then suddenly it's perfectly fine. Why? Don't question your overseer!

Windows 11's "requirements" exist solely to push new hardware at best, or attempt to mandate hardware DRM locks via TPM at worst. It's also failed miserably so far.
I'm sorry, but you're clueless on pretty much all points. Too much tinfoil content.

1) As a software maker, your goal is to MINIMIZE the amount of hardware you OFFICIALLY support. Why? Because more hardware = more support to provide = more cost. There's a finite amount of resources you want to pour into support, and we're talking about decade-old CPUs. Could MS make Win11 run on a Pentium 4? Surely. Would it be worth it? Most certainly not. You have to draw the line somewhere, that's all there is to it. If you really don't care about having a current system, why would they care to provide support for you? Especially for free?

2) Performance isn't the only aspect of any hardware. There are feature sets, HW flags, bits and pieces, all of which need to be taken care of. Each with their own driver, firmware, and hardware quirks and issues. It's really not just about clock and IPC.

3) MS doesn't receive royalties from OEMs for increased HW sales.

4) MS doesn't sell any hardware that's not Win11-compatible, so there's absolutely no "hardware push" they could achieve in any shape or form with such CPU restrictions. In fact, that they SPECIFICALLY added their old Surface CPUs to the list PRECISELY contradicts your "theory" about pushing new hardware down your throat. They did the EXACT OPPOSITE: kept supporting old cr@p just for the sake of customers not having to replace it, bit the bullet, and accepted the added support burden that comes with it.

5) The "mandate HW DRM locks via TPM" part I can't even decipher. Will they protect their games via TPM, or what is it you're trying to say? Fine, so what? What do I care? If I want to play a game, I'll buy the game. Thieves, cry me a river please. DRM is like the least interesting part of TPM.

Among other security features, probably most importantly, TPM allows for seamless encryption of your drives. Why is it such a big deal? Well, most people don't like losing their laptops, especially when the thief can also access all their files the moment they insert the SSD into another PC. That's what FDE protects against, and that's what only works painlessly if you have something like a TPM in your system. IT'S GOOD TO HAVE PROTECTION ON YOUR SYSTEM. It's amazing how you absolutely rule out the mere possibility that TPM can be actually useful and helpful for you.

Before TPMs, we had to hand out "unlocker" USB drives to all our colleagues. Or use TWO passwords, one for their accounts, and one for their workstations. Imagine entering 2 passwords every morning.

They ended up doing these things:
- keeping the USB in their laptop bags, completely defeating the whole purpose of protection against data leaks
- keeping the USB right in the laptop USB port, sticking out, completely defeating the whole purpose of protection against data leaks
- leaving the USB at home, asking IT to unlock their laptops with the recovery key kept in AD. Major inconvenience both for the employee and IT.

Each of these scenarios is completely eliminated with the use of TPM. It's really that simple. It was so bad that when we found the TPM solution (waaay back, like in 2016, just a year after the release of Windows 10), we even bought external TPM chips for all the desktop workstations that supported them, like the Gigabyte GC-TPM2.0, so we could get rid of USB sticks once and for all. FDE for everyone, everywhere, period. No exceptions. That's the only way your data can be safe and secure.

TLDR you're way too cynical and into conspiracy theories, dude. Too much time spent on forum boards, arguing about made-up issues and evil plans.
 
Talk about hypocrisy... the original DOOM itself pushed the limits in those days. I first played it on my cousin's 486DX2-66. When I got my Pentium 133, it ran the smoothest.

But Wolfenstein 3D? Yeah, that ran smoothly even on my 286-16.

But after that, all id games started pushing the limits of the hardware of the day. The Ultra setting in Quake 3 required a beefy system; by today's standards, it was almost like a game that only runs at its best on at least a 4090.

Too bad current games keep requiring more and more VRAM, RAM, and CPU and graphics horsepower, but the games are so... meh... really meh.
 