Compact custom version of Windows 11 released

Daniel Sims

Posts: 830   +33
Staff
What just happened? Windows 95 occupied less than 100MB upon installation. A clean Windows 11 install, however, needs around 20GB. Nobody doubts that software will need more space as technology marches forward, but many believe Windows hasn't justified a roughly 20,000 percent install-size increase over 28 years. One developer may have proven this with a custom installation that cuts Windows 11 to half its default size.
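For the curious, that growth figure checks out as back-of-the-envelope arithmetic (using the article's rough ~100 MB and ~20 GB numbers):

```python
# Rough install sizes from the article (decimal units for simplicity).
win95_mb = 100      # Windows 95: ~100 MB
win11_mb = 20_000   # Windows 11: ~20 GB

# Percent increase relative to the original size.
increase = (win11_mb - win95_mb) / win95_mb * 100
print(f"{increase:,.0f}%")  # → 19,900%, i.e. roughly 20,000%
```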

This week, NTDEV released Tiny11, a version of Windows 11 that needs only ~8GB of your hard drive and can run on 2GB of RAM. It also removes Windows 11's somewhat demanding system requirements, but users should know it makes some steep sacrifices to slim down.

A streamlined version of Windows 11 Pro 22H2, Tiny11 comes in an ISO available on archive.org that's just 3GB compared to Microsoft's official 5.1GB ISO download.

The smaller version of Microsoft's operating system includes the bare necessities like accessibility software, the Calculator, Notepad, and Paint. It also retains the Microsoft Store, so users can install whatever extra Microsoft software they need. The system comes with local accounts by default but can also use online accounts.

The change NTDEV credits most for the downsizing is the removal of the Windows Component Store (WinSxS). Without it, users won't be able to install new languages or major features. The developer says Tiny11 is "not serviceable," but confirms the system can receive .NET, driver, and security definition updates through Windows Update.

NTDEV assures users concerned about the security risks of a custom Windows version that Tiny11 doesn't contain anything from non-Microsoft sources. However, users anxious about privacy shouldn't install it expecting to be totally free of Microsoft's telemetry.

Tiny11's main goal is to broaden the range of systems that can access Windows 11. The custom edition can run on any system that runs Windows 10 and can dual boot with that OS.

Windows 11 launched to controversy over its surprisingly strict system requirements, particularly regarding CPUs. Due to TPM requirements, the OS normally needs at least an 8th-gen Intel Core or AMD Zen+ processor. Unsupported systems can run Windows 11, but doing so requires jumping through extra hoops, which Tiny11 eliminates.

StatCounter's latest figures indicate Windows 11 has yet to reach one in five Windows users. Microsoft's latest OS is still gaining market share, but far more slowly than Windows 10 did, and its predecessor still runs on at least 70 percent of Windows systems.


 

Hodor

Posts: 580   +398
Programmers are getting lazier and lazier. A long time ago, if you needed a function that took 2 lines of code, you'd write it yourself, or copy it from a library and insert the source into your code.

Nowadays they include a 100 MB library just to use a single function from it. Do that 10 times each, by 200 programmers, and you get a slow brontosaurus of an app. Moore's Law curve for CPUs looks very flat compared to the Slug's Law of software development, which is killing performance.

Though some would say that brakes are more important than the engine, in which case we have very safe software. Speed won't kill us.
 

zecoeco

Posts: 13   +23
I used multiple debloated Windows editions, and actually ended up optimizing every Windows build at the image level before installation. I did that for several years, aiming to maximize performance and an uninterrupted experience. Eventually, however, I realized that most of the optimizations will sooner or later break OS functionality, and the worst thing about all those "debloated" Windows builds is that you compromise security for performance, which is not good at all. Windows Update is very important regardless of how annoying it can get. The simple fix: pause updates so they run only once a month, and enable metered connections.

I realized that it is always better to stick to basic, manual optimizations that are risk-free: deleting useless apps you won't use, trimming your SSD weekly, and enabling Storage Sense and automatic maintenance to run monthly. Turn off UWP background activity, and try to spot the apps that eat up your resources.
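The "spot what eats your resources" step can be partially automated. A minimal, risk-free sketch (it only reads sizes and deletes nothing; the `C:\Program Files` path in the comment is just an illustration):

```python
from pathlib import Path

def largest_subdirs(root, top_n=5):
    """Return (size_in_bytes, name) for the top_n immediate subdirectories
    of `root`, largest first -- handy for finding bloated app folders."""
    sizes = []
    for sub in Path(root).iterdir():
        if sub.is_dir():
            total = sum(f.stat().st_size for f in sub.rglob("*") if f.is_file())
            sizes.append((total, sub.name))
    return sorted(sizes, reverse=True)[:top_n]

# Example usage (point it at wherever your apps live):
# for size, name in largest_subdirs(r"C:\Program Files"):
#     print(f"{size / 1e9:6.2f} GB  {name}")
```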
 

WhiteLeaff

Posts: 154   +270
Programmers are getting lazier and lazier. A long time ago, if you needed a function that took 2 lines of code, you'd write it yourself, or copy it from a library and insert the source into your code.

Nowadays they include a 100 MB library just to use a single function from it. Do that 10 times each, by 200 programmers, and you get a slow brontosaurus of an app. Moore's Law curve for CPUs looks very flat compared to the Slug's Law of software development, which is killing performance.

Though some would say that brakes are more important than the engine, in which case we have very safe software. Speed won't kill us.

I completely agree, the system gets heavier and does not bring any significant advantage in return. If it is an evolution it should either be more resource efficient or at least faster.
 

Bullwinkle M

Posts: 911   +814
What was the point again?

Windows 11 still requires at least 8GB Ram to run optimally

I'll stick with Windows XP
500MB Installer / 4GB Ram and a 3 second boot time on Nehalem

500MB is the full install / nothing gimped / nothing missing!
 
What was the point again?

Windows 11 still requires at least 8GB Ram to run optimally

I'll stick with Windows XP
500MB Installer / 4GB Ram and a 3 second boot time on Nehalem

500MB is the full install / nothing gimped / nothing missing!

Why wouldn't you just use Ubuntu then if you don't believe you're missing any features? Still matches your hardware requirements and actively updated.

Btw, please consider not using the term "gimped"; it's quite offensive to the disabled community. Just a thought. Obviously, you are free to say whatever you want.
 

Bullwinkle M

Posts: 911   +814
Why wouldn't you just use Ubuntu then if you don't believe you're missing any features? Still matches your hardware requirements and actively updated.

Btw, please consider not using the term gimped.. quite offensive to the disabled community. Just a thought. Obviously you are free to say whatever you want.
Ubuntu does not run the software I use
It is too gimmmm....ah ...sorry....I meant it's too crippled!
 

yRaz

Posts: 5,120   +6,747
Why wouldn't you just use Ubuntu then if you don't believe you're missing any features? Still matches your hardware requirements and actively updated.

Btw, please consider not using the term gimped.. quite offensive to the disabled community. Just a thought. Obviously you are free to say whatever you want.
As someone who is disabled I've never been offended by the term "gimped". Please don't tell me what I should be offended by

Ubuntu does not run the software I use
It is too gimmmm....ah ...sorry....I meant it's too crippled!
All jokes aside, Mint is probably more up your alley, and it's based on Ubuntu. It's what I daily-drive. I hate what MS has been doing with their Windows support, a major reason why I hold out, but using something as old as XP seems like a bit of a security risk.
 

gamerk2

Posts: 778   +752
Programmers are getting lazier and lazier. A long time ago, if you needed a function that took 2 lines of code, you'd write it yourself, or copy it from a library and insert the source into your code.

Nowadays they include a 100 MB library just to use a single function from it. Do that 10 times each, by 200 programmers, and you get a slow brontosaurus of an app. Moore's Law curve for CPUs looks very flat compared to the Slug's Law of software development, which is killing performance.

Uhh no. If you need a library function, you just link against the DLL; it's rare to ever have to statically link libraries, and you typically only do so for embedded systems where you actually care if literally anything (like said external libraries) changes.

Even then, code that isn't executed doesn't cost any RAM (again, barring embedded systems, where things tend to be statically linked); the only thing you lose is storage space, but it's better to sacrifice a few MB and include the entire library than get punished when future upgrades end up wanting an expanded feature set.
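The shared-library point can be illustrated in any language. A minimal Python sketch using `ctypes` to resolve one function from the system's C math library at load time; no copy of the library ends up in the program, since the loader maps the single on-disk copy into the process:

```python
import ctypes
import ctypes.util

# Resolve the shared C math library at load time. Where find_library
# can't locate it, fall back to the process's own namespace, which on
# Linux already has libm mapped in.
path = ctypes.util.find_library("m")
libm = ctypes.CDLL(path) if path else ctypes.CDLL(None)

# Declare the one function we want -- "using a single function" without
# statically copying anything into our binary.
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]

print(libm.cos(0.0))  # → 1.0
```

Every process that does this shares the same mapped pages of the library, which is exactly the storage-vs-RAM economy described above.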
 

yRaz

Posts: 5,120   +6,747
Uhh no. If you need a library function, you just link against the DLL; it's rare to ever have to statically link libraries, and you typically only do so for embedded systems where you actually care if literally anything (like said external libraries) changes.

Even then, code that isn't executed doesn't cost any RAM (again, barring embedded systems, where things tend to be statically linked); the only thing you lose is storage space, but it's better to sacrifice a few MB and include the entire library than get punished when future upgrades end up wanting an expanded feature set.
I think a lot of the anger stems not so much from the increase in system requirements as from WHY the requirements are increasing. A significant portion of these things are always-on data collection services. Storage is so cheap that keeping the DLLs around is a no-brainer. The problem arises when system resources are taken up by "features" nobody asked for. It would be one thing if we could easily turn these features off, or even have them as an opt-in thing, but we can't.

Microsoft is creating a revenue stream off a product we already own, and it's taking up system resources. That's where the anger comes from. I'm okay with being the product if something is free, but I paid for a Windows license.

The bloat we see in Windows is not from new "features"; it's from all the Windows services running in the background, collecting data and not really doing anything useful.
 

dualkelly

Posts: 261   +331
I think a lot of the anger stems not so much from the increase in system requirements as from WHY the requirements are increasing. A significant portion of these things are always-on data collection services. Storage is so cheap that keeping the DLLs around is a no-brainer. The problem arises when system resources are taken up by "features" nobody asked for. It would be one thing if we could easily turn these features off, or even have them as an opt-in thing, but we can't.

Microsoft is creating a revenue stream off a product we already own, and it's taking up system resources. That's where the anger comes from. I'm okay with being the product if something is free, but I paid for a Windows license.

The bloat we see in Windows is not from new "features"; it's from all the Windows services running in the background, collecting data and not really doing anything useful.
This is why, upon installing any new Windows for personal use, I block every single Windows IP address. I turn off and delete all collection services. There are amazing programs out there that will scrub Windows of its absolute waste-of-space garbage data collection services. Sometimes it takes hours on end, but in the end my version of Windows cannot even contact any MS services over the web, including Windows Update, which in this day and age tends to break more Windows installs than it fixes. I can manually download and install those updates on my own.
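For anyone curious what this kind of blocking looks like mechanically at the hosts-file level, here is a minimal sketch. The domains are deliberately fake placeholders, not a vetted telemetry list, and blocking real Microsoft endpoints can break Windows Update and activation, exactly as described above:

```python
def hosts_block_entries(domains):
    """Build hosts-file lines that sink each domain to 0.0.0.0,
    an unroutable address, so lookups resolve but connections fail fast."""
    return [f"0.0.0.0 {domain}" for domain in domains]

# Placeholder domains only -- NOT real or recommended blocklist entries:
for line in hosts_block_entries(["telemetry.example.invalid",
                                 "vortex.data.example.invalid"]):
    print(line)
# On Windows, such lines would be appended to
# C:\Windows\System32\drivers\etc\hosts (requires admin rights).
```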
 

gamerk2

Posts: 778   +752
I think a lot of the anger stems not so much from the increase in system requirements as from WHY the requirements are increasing. A significant portion of these things are always-on data collection services. Storage is so cheap that keeping the DLLs around is a no-brainer. The problem arises when system resources are taken up by "features" nobody asked for. It would be one thing if we could easily turn these features off, or even have them as an opt-in thing, but we can't.

Microsoft is creating a revenue stream off a product we already own, and it's taking up system resources. That's where the anger comes from. I'm okay with being the product if something is free, but I paid for a Windows license.

The bloat we see in Windows is not from new "features"; it's from all the Windows services running in the background, collecting data and not really doing anything useful.
Which is fair. The problem comes once you have a single application that tries to use those services, and the headache for the user (most of whom, I note, are *not* in the least tech savvy) when those services are not activated. So from a user-friendliness point of view, it's easier to enable most major features, whether or not users need them, to prevent application incompatibilities. That's why it's generally only legacy features that are disabled.

That being said, even when enabled, most of those services take no CPU time if they aren't being used. The high minimum specs are overkill from a *performance* standpoint; it's more the feature set (e.g. WDDM for GPUs, TPM for CPUs, etc.) that is the primary driver of requirements nowadays. Also consider how badly Vista got a bad rap for being a resource hog; I was an early Vista adopter, and aside from UAC being far too aggressive out of the box, it was *fine* if you had the specs for it. Since then, Microsoft has been a lot more careful with "minimum" specs, to ensure lower-end machines don't install something they can't handle well.
 

yRaz

Posts: 5,120   +6,747
Which is fair. The problem comes once you have a single application that tries to use those services, and the headache for the user (most of whom, I note, are *not* in the least tech savvy) when those services are not activated. So from a user-friendliness point of view, it's easier to enable most major features, whether or not users need them, to prevent application incompatibilities. That's why it's generally only legacy features that are disabled.

That being said, even when enabled, most of those services take no CPU time if they aren't being used. The high minimum specs are overkill from a *performance* standpoint; it's more the feature set (e.g. WDDM for GPUs, TPM for CPUs, etc.) that is the primary driver of requirements nowadays. Also consider how badly Vista got a bad rap for being a resource hog; I was an early Vista adopter, and aside from UAC being far too aggressive out of the box, it was *fine* if you had the specs for it. Since then, Microsoft has been a lot more careful with "minimum" specs, to ensure lower-end machines don't install something they can't handle well.
So much of Vista's trouble was driver issues. Microsoft told partners "these systems are Vista compatible" and they just weren't, so people would upgrade and have a HORRIBLE time. UAC was really annoying, but I honestly liked Vista. I did have a brand-new system at the time (AMD Athlon X2 4200+, Asus M2N32-SLI Deluxe, X1900XT with 2 gigs of DDR2-1066), so I had no problem; Vista ran FINE on it. Why is it that almost 20 years later, a system that was a "resource hog" has 1/8th the system requirements of modern systems? To put more perspective on it, Linux Mint only requires 1GB of RAM and 15 gigs of hard disk space. And it doesn't even use the full 15 gigs; it only unpacks to around 3.8 gigs.

It's just silly, and when you look at what the alternatives are doing it becomes even more ridiculous. If all you need is basic computing, we haven't needed better hardware for almost two decades.

This is why, upon installing any new Windows for personal use, I block every single Windows IP address. I turn off and delete all collection services. There are amazing programs out there that will scrub Windows of its absolute waste-of-space garbage data collection services. Sometimes it takes hours on end, but in the end my version of Windows cannot even contact any MS services over the web, including Windows Update, which in this day and age tends to break more Windows installs than it fixes. I can manually download and install those updates on my own.
That's not the point; you shouldn't have to do that at all. It's your machine, you OWN it, you PAID FOR IT. Frankly, it was easier for me to just switch to Linux than to strip out all of the Windows bloat every time I did a reinstall, or every time I let an update through that undid all those changes.
 

Bullwinkle M

Posts: 911   +814
All jokes aside, Mint is probably more up your alley, and it's based on Ubuntu. It's what I daily-drive. I hate what MS has been doing with their Windows support, a major reason why I hold out, but using something as old as XP seems like a bit of a security risk.

All jokes aside, I already use Mint, but XP is not a security risk for me

I have a highly modified copy of XP to study malware
I run it online without any MS security updates in a full admin account
It has been online for the past 9 years without a single security problem
In fact, it is more secure and problem free than a fully updated copy of Windows 11
 

NikoBB

Posts: 134   +85
Programmers are getting lazier and lazier. A long time ago, if you needed a function that took 2 lines of code, you'd write it yourself, or copy it from a library and insert the source into your code.
Nowadays they include a 100 MB library just to use a single function from it. Do that 10 times each, by 200 programmers, and you get a slow brontosaurus of an app. Moore's Law curve for CPUs looks very flat compared to the Slug's Law of software development, which is killing performance.
Though some would say that brakes are more important than the engine, in which case we have very safe software. Speed won't kill us.
You seem to be very new to software development. Significant (large) libraries (if the developers are not *****s) are never linked statically, only dynamically (which raises the speed requirements for processors, because dynamic linking is always slower than a static build where only the necessary blocks of a library are linked in).

The real problem is the pile of different versions, what has long been called "DLL Hell," and that is the fault of both M$ and the software companies that cannot agree on a common set of key libraries. As a result, the system often contains the same libraries in several versions, thanks to stupid and lazy software creators. Usually their "stupidity" correlates directly with salary: the lower it is, the less incentive they have to write high-quality code, and a small salary correlates directly with the developer's qualifications. Unless, of course, he works at gunpoint in a labor camp (which still happens in the world, quite often, even in the 21st century).

What was the point again?
Windows 11 still requires at least 8GB Ram to run optimally
I'll stick with Windows XP
500MB Installer / 4GB Ram and a 3 second boot time on Nehalem
500MB is the full install / nothing gimped / nothing missing!
In its use of available memory, XP completely loses to W7 at 4GB, especially when combined with an SSD.
TRIM does not work on XP; there is no GPT support outside of WS2003, which means no disks larger than 2TB; the TCP and SMB stack is disgusting and cannot work properly with 1Gb/s+ connections (especially bad over Wi-Fi, where you will never get more than 11-12MB/s); there is a hard limit on GDI/system resources; and programs cannot allocate more than 2GB of memory for themselves, which simply leads to crashes.
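The 2GB per-process ceiling mentioned above is plain 32-bit address-space arithmetic: a 32-bit pointer spans 4 GiB, and 32-bit Windows by default splits that evenly between user space and the kernel:

```python
total = 2 ** 32          # bytes a 32-bit pointer can address: 4 GiB
user_space = total // 2  # default user/kernel split on 32-bit Windows

print(total // 2**30)       # 4  (GiB of total virtual address space)
print(user_space // 2**30)  # 2  (GiB usable by each process by default)
```

(The /3GB boot switch could stretch the user share to 3 GiB for large-address-aware programs, but that was an opt-in workaround rather than the default.)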

The one area where XP completely kills every version since Vista is speed without a swap file and system latency. It is the only M$ OS that is truly ready for real-time applications. You will never get, even on W7, such low latency and such smooth, glitch-free playback of audio and video content.

As professional reviews have shown many times, not a single modern laptop under W10/11 passes system latency tests for real-time applications. XP passes them. That is why the OS was so loved by sound engineers.

Why wouldn't you just use Ubuntu then if you don't believe you're missing any features? Still matches your hardware requirements and actively updated. Btw, please consider not using the term gimped.. quite offensive to the disabled community. Just a thought. Obviously you are free to say whatever you want.
Because Ubuntu is a piece of s**t that you have to hack on all night just to bring it up to the usability and stability of 2001's XP. Even the 18 LTS version from 2018 could no longer work properly on 4GB; it crashes on startup where W10 works normally. The newer versions are even worse. And the amount of business software (especially well-written software) for Linux compared to Windows is at least 1:10000. That is why Linux still sits at 2-3% of the world's users, despite being free.

....but using something as old as XP seems like a bit of a security risk.
As professional security tests have shown, the chances of being hit by an exploit on XP are now 100 times lower than under W11, simply because the developers of exploits and trojans have long since forgotten about it...


----------
If you don't want the increased telemetry, you can simply install LTSC 1809 and block all unnecessary services and updates. But as professional tests have proven, even in the corporate versions (not officially available to ordinary customers) M$ brazenly lies about the level of telemetry; additional work is required to block it at routers and other independent equipment.

If you need speed: LTSB 2016. It's even faster than LTSC 1809.

LTSC 2021 is already a collection of useless garbage, including the integrated Edge and even more UI insanity than 1809, with more memory and CPU usage, clearly visible from the moment it is installed, compared to 1809, not to mention LTSB 2016.

I recently optimized a relative's laptop after they said it was making a lot of noise.

I disabled all update services and the vile telemetry (and especially destroyed all the bundled Google services and the like), along with every useless service. In Chrome and Firefox I set up ad and script blockers and explained how to use them correctly, an introductory course for a young fighter. Now the laptop is almost silent when playing YouTube video; I can't hear it from a meter away. Everything runs very fast under Zen+.

Unfortunately, the immoral M$, even though people pay it for the OS, blatantly and illegally puts ads in there. Illegal because they hold an overwhelming market share and abuse it with the connivance of corrupt authorities in the US and other countries. If antitrust regulators everywhere worked strictly and by the law, they would long ago have forced M$ to remove telemetry and all advertising from paid OSes and software. But alas, the world is a dirty and corrupt cesspool...
 

Nobina

Posts: 4,122   +4,818
I used multiple debloated Windows editions, and actually ended up optimizing every Windows build at the image level before installation. I did that for several years, aiming to maximize performance and an uninterrupted experience. Eventually, however, I realized that most of the optimizations will sooner or later break OS functionality, and the worst thing about all those "debloated" Windows builds is that you compromise security for performance, which is not good at all. Windows Update is very important regardless of how annoying it can get. The simple fix: pause updates so they run only once a month, and enable metered connections.

I realized that it is always better to stick to basic, manual optimizations that are risk-free: deleting useless apps you won't use, trimming your SSD weekly, and enabling Storage Sense and automatic maintenance to run monthly. Turn off UWP background activity, and try to spot the apps that eat up your resources.
I came to the same conclusion. After tampering with systems for years, cutting them down as much as I could, I have found that the performance improvements are minimal and not worth it. Eventually, something breaks, or I need a feature I deleted. So now I use things "stock" and get rid of what I don't want the conventional way. Microsoft has won in this regard.

The fact that someone can trim Windows 11 this much makes me wonder how much further Microsoft could cut down and optimize their OS; they just don't bother to. Instead, they introduce more bloat each time.

When I look back on how I used to use my PC, I did pretty much the same things I do today, yet for whatever reason I need 16GB of RAM rather than 1GB, and at least a quad-core processor instead of a single- or dual-core one. Software has gotten fatter without offering many more features, and it doesn't make a lot of sense.
 

Bullwinkle M

Posts: 911   +814
I agree with much of what the last two posters said, although I found that I never needed TRIM for an SSD with XP, and I never use GPT partitions to this day, even on Windows 10 or 11.

Crashes in Windows XP ended for me before 2010, when I simply eliminated programs that were buggy or caused DLL conflicts and registry problems.

I have never (even once) had a program crash in Windows XP since 2010, and I only use programs that work flawlessly.

I too used to try reducing the size of XP, but gave that up long ago as it caused more problems than it solved; now I use a full install for every version of Windows.
 

AndreV

Posts: 28   +9
I used multiple debloated Windows editions, and actually ended up optimizing every Windows build at the image level before installation. I did that for several years, aiming to maximize performance and an uninterrupted experience. Eventually, however, I realized that most of the optimizations will sooner or later break OS functionality, and the worst thing about all those "debloated" Windows builds is that you compromise security for performance, which is not good at all. Windows Update is very important regardless of how annoying it can get. The simple fix: pause updates so they run only once a month, and enable metered connections.

I realized that it is always better to stick to basic, manual optimizations that are risk-free: deleting useless apps you won't use, trimming your SSD weekly, and enabling Storage Sense and automatic maintenance to run monthly. Turn off UWP background activity, and try to spot the apps that eat up your resources.
Regarding "the worst thing about all those 'debloated' Windows builds is that you'll compromise security for performance": you can allow only security updates and skip feature updates.
 

NikoBB

Posts: 134   +85
I have already listed the key limitations of 32-bit XP, which completely settle the question of using it, and have for a long time. The lack of secure browsers finished the matter. The missing TRIM support alone makes the idea pointless, as does the lack of support for large disks. If you have a NAS that's not a problem, but then again you will never get 100MB/s under XP over the network due to SMB 1.0 restrictions. And any modern business software is so bloated that it needs 2GB of memory just to start, while 2GB is XP's limit; I ran into this personally. Moreover, much business software, when moving to x64 code, for some reason required twice as much memory as before (bad developers), and I have to use it for business, so it simply doesn't work under XP anymore.

As I wrote above, I kept the old PC with XP only for listening to music and watching movies on the projector, because it gives the most perfectly smooth picture and sound and has native Dolby Headphone for movies, which is absent in Vista+ for a simple reason: Dolby revoked Realtek's license for it on Vista+. And Dolby Atmos is worse, even though it is paid.
 

gamerk2

Posts: 778   +752
Why is it that almost 20 years later, a system that was a "resource hog" has 1/8th the system requirements of modern systems? To put more perspective on it, Linux Mint only requires 1GB of RAM and 15 gigs of hard disk space. And it doesn't even use the full 15 gigs; it only unpacks to around 3.8 gigs.

It's just silly, and when you look at what the alternatives are doing it becomes even more ridiculous. If all you need is basic computing, we haven't needed better hardware for almost two decades.
As I noted before: a *lot* of it is due to requirements like WDDM versions and TPM, which are found only on newer chips. I'm reasonably sure, for example, that absent those requirements everything that ran 10 could run 11.

Also again, Microsoft caters to a *far* larger userbase than Linux or Mac does, so they need to ensure compatibility in a way that is invisible to the end user. Yes, a clean Linux install is much, much smaller, until you have to start manually installing every library that various applications need to run. In Windows, most of that is included by default, because non-technical users don't know or care how those items get installed. Linux caters to experts; Windows doesn't. Hence the latter *has* to be more bloated, because it must include far more out of the box to cover almost every possible use case.
 

Gars

Posts: 336   +46
I used multiple debloated Windows editions, and actually ended up optimizing every Windows build at the image level before installation. I did that for several years, aiming to maximize performance and an uninterrupted experience. Eventually, however, I realized that most of the optimizations will sooner or later break OS functionality, and the worst thing about all those "debloated" Windows builds is that you compromise security for performance, which is not good at all. Windows Update is very important regardless of how annoying it can get. The simple fix: pause updates so they run only once a month, and enable metered connections.

I realized that it is always better to stick to basic, manual optimizations that are risk-free: deleting useless apps you won't use, trimming your SSD weekly, and enabling Storage Sense and automatic maintenance to run monthly. Turn off UWP background activity, and try to spot the apps that eat up your resources.
Optimizing an image across 14 floppy diskettes (W95): now that's a must-see.
 

Axeia

Posts: 77   +77
I see a lot of half-truths here, too many to try and quote/reply to directly.

I agree with the people saying it's hardly worth trying to squeeze all the bad blood out of Windows 11 (or 10, really), as at some point you'll run into something not working right, not installing, erroring out, etc., because you're missing something. Simply getting rid of the telemetry with some tool works great, saves resources, and helps reduce system load (noticeably so on low-end systems). I wouldn't recommend doing much more than that.

And no... running Windows XP is not a good idea if you're connected to the internet. It's a leaky basket, and criminals don't simply forget about it; they have to spread their resources too, so it just takes a little longer before they find you. But they sure as heck aren't going to pass up a system that can be exploited that easily, given it hasn't gotten security updates since 2014!

And no, XP wasn't that lightweight; it just lasted that long. If you're old enough, you'll remember that when XP's system requirements were initially announced, people were appalled as well. Twice that of Windows 2000, and what for? It's basically the same thing, etc. etc.

Vista took up a huge amount of resources as well, it's was a fairly daring step (as far as Microsoft goes). A hardware accelerated UI suddenly requiring some graphical grunt, a big push towards making 64-bits happen (Windows XP 64 bit didn't really catch on) which is mainly responsible for its bad rep. The lack of drivers for a 64-bit OS meant a lot of older hardware didn't function properly - if you bought a brand new machine it all worked pretty nicely to be honest. But if you were upgrading an older PC with all kinds of old hardware plugged into it the experience was pretty miserable.
Then, to make things worse for Vista, netbooks were a huge hit (dinky little laptops running underpowered Intel Atom CPUs). Microsoft kind of admitted defeat there and gave OEMs the go-ahead to ship them with Windows XP instead. Publicly, a lot of people took this as a signal from Microsoft that Windows XP was simply a lot more resource-efficient (and it was), so they kept running XP for ages as well. In turn, Microsoft reluctantly kept supporting Windows XP far longer than it ever intended.

Microsoft didn't want something like netbooks to lock them into supporting an old OS for ages again, so Windows 7 came around, which honestly wasn't very different from Windows Vista. Pretty much everyone had upgraded their hardware by then, and netbooks had fallen from grace; they were just too darn weak for ever-evolving websites taking more and more resources. Manufacturers had finally gotten around to making 64-bit drivers as well, so Windows Vista at this point was actually pretty decent, just with a very bad rep.
Now that everything was actually in place for Vista to be good, along came Windows 7 (very similar, but a little more optimised). All the people who had skipped Vista installed Windows 7, and it was heralded worldwide as the best Windows.

To be fair, Windows 7 might have been the best Windows. It was the last one where they listened to the market instead of trying to make more money. Windows 8 launched with its awful tablet-optimised interface to try and make tablets sell and got a bad rep again; 8.1 fixed most of it and, IIRC, actually has less telemetry than Win10. Nobody bothered with it for a while, though, because Windows 7 was great.

Windows 10 came around and slowly built market share, as it was decent. Most people don't care enough about what comes preinstalled to change it. A good part of the people building their own PCs and installing their own Windows eventually got swayed because DX12 was Win10-exclusive.

Interesting how the myth of Microsoft having a cycle of "bad Windows -> good Windows" is really more a bad Windows release slowly getting fixed, followed by a new version that isn't much different but clears the board.

As far as the Linux hate/love here in the comments goes: just don't run XP if you're connected to the internet, okay? Run a Windows that actually receives security updates. If your hardware can't, run Linux or get rid of it. And Linux doesn't have to be Ubuntu or Mint; there are distros that target hardware from the WinXP era that run great, receive security updates, and can actually run a modern browser.

For the people who think that basic use of a PC hasn't changed in two decades: it has.
The main, or at least a big, use case on pretty much any PC is the web browser, and websites have gotten so much fatter. There are so many advertisements loading in, so much JavaScript running, and the layouts are so much more complex. And while, contrary to what some claim here, desktop software libraries don't usually use massive amounts of resources (only what's needed is loaded/used), that definitely is the case with websites.
Need one simple validation to make sure a form field has a specific type of value? Better pull in a massive JavaScript library that slows down the entire experience, because a lot of developers only know how to work with a certain library rather than 'base' JavaScript.
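For contrast, the 'base' JavaScript for that kind of check really is tiny. A minimal sketch (the function name and pattern are my own, deliberately simplistic and not RFC-complete):

```javascript
// Validate an email-ish field with one small regex; no library needed.
// The pattern only checks "something@something.something" with no spaces.
function isValidEmail(value) {
  return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(value);
}

console.log(isValidEmail("user@example.com")); // true
console.log(isValidEmail("not an email"));     // false
```

And for this particular case the browser will even do it for you: `<input type="email" required>` plus the form's `checkValidity()` gives you native validation with zero script.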

PS: Sorry for the big rant. I've been around too long and (think I) know too much to let some of these half-truths go by, so my post somehow ended up as half a rant and half a history lesson ;)
 
Last edited:

Bullwinkle M

Posts: 911   +814
I see a lot of half truths here, too many to try and quote directly to.

PS: Sorry for the big rant, I've been around too long and (think I) know too much to let some of these half truths go by so my post somehow ended up as half a rant and half a history lesson ;)

Speaking of "half truths".....

XP-SP2 is perfectly safe online "IF" you know what you are doing

After eliminating all the malware magnets like Flash / Java / Quicktime / Silverlight / Internet Explorer etc...
Simply adding an aftermarket firewall, closing a few more ports, adding a few more tweaks, and running XP from a locked-down Read-Only partition will make it perfectly safe for online use "IF" you know what you are doing

I don't use it for online banking or online passwords of any sort because I understand its "online limitations"

Most people blindly listen to whatever Microsoft tells them and live on an endless treadmill to nowhere (now called Windows 11)

I have run XP-SP2 online for 9+ YEARS without a single issue because I understood which malware magnets needed to be eliminated in order to do so

I was the first to find that WannaCry had no effect on XP-SP2, and I would know because I use XP-SP2 to study malware

Ransomware has no effect on my system / wipers also have no effect / rootkits cannot take hold

I have been doing this without a single MS security update and doing it in a full admin account while online for nearly a decade

XP is perfectly safe "IF" you know what you are doing

I can play with malware all day long and then simply reboot to a clean system, but if I am not playing with malware, I can remain online as long as I like without a single problem

For a Read-Only XP experience, try "Driveshield" and avoid the Microsoft version, called SteadyState
Your system "may" crash after long periods of use if you make the boot partition too small, but I've never had a problem using a 64GB boot partition with XP and giving half of the remaining free space to the temp writes in Driveshield

You're never too old to learn a few more tricks!
 
Last edited: