History of the Personal Computer, Part 5: Computing goes mainstream, mobile, ubiquitous


The microprocessor made personal computing possible by opening the door to more affordable machines with a smaller footprint. The 1970s supplied the hardware base, the '80s introduced economies of scale, and the '90s expanded the range of devices and accessible user interfaces.

The new millennium brought a closer relationship between people and computers. More portable devices became the conduit for humans' basic need to connect. It's no surprise that computers transitioned from productivity tool to indispensable companion as connectivity proliferated.

This is the fifth and final installment in the series, where we look at the history of the microprocessor and personal computing, from the invention of the transistor to modern day chips powering our connected devices.

Read the complete article.

 
Decades ago I read an exciting paperback "true story". I tried to locate it by title, but searching "a.m.d. unofficial biography history computer" on Google did not give me the results I wanted.

Can readers suggest other exciting "history of computing" true stories?
 
Cheers, Steve. I didn't realize how much history was involved until I had to get it ordered into a semblance of a story. The series could easily have been many times longer and still not have captured every major twist and turn of the industry.

Decades ago I read an exciting paperback "true story". I tried to locate it by title, but searching "a.m.d. unofficial biography history computer" on Google did not give me the results I wanted.
Can readers suggest other exciting "history of computing" true stories?
I used these titles, among others, as reference material for the article series, and all are worth a read.

Inside Intel by Tim Jackson
The Intel Trinity by Michael S. Malone
History of Semiconductor Engineering by Bo Lojek
Computers: A Life Story of a Technology by Eric G. Swedin and David L. Ferro
Makers of the Microchip: A Documentary History of Fairchild Semiconductor by David C. Brock and Jay Last
Fabless: The Transformation of the Semiconductor Business by Daniel Nenni and Paul McLellan
Foundations of Computer Technology by Alexander John Anderson
 
Really enjoyed this series. I was wondering if you'd be willing to provide the articles in ebook or PDF form. I'd love to be able to read this again in a few decades, or pass it on to my kids if I ever have any!
 
What about a part 6, covering the most current products and the ones expected in the near future?
Do you dare to make a prognosis?
The only problem with making predictions is that, as with past history, the landscape will be littered with the remains of once-prominent companies - companies that some people have formed an attachment to. People find comfort in continuity, and many are averse to change in general, so feedback tends towards the negative. I'm happy enough to make predictions (as my forum postings would show - this from 3 years ago regarding AMD's CPU future, for example - I won't link to the actual thread since I was rather cynical regarding another tech site, but I'm happy to supply the link via PM), but their shelf life can be incredibly short, so they may not be worthy of the archive status that a pure history article would garner. It only takes a significant merger, buyout, or business alliance to render any earlier estimate useless.

As a standalone article it would fly, as would the asterisks in the forum comments, I suspect. :D
 
The only problem with making predictions is that, as with past history, the landscape will be littered with the remains of once-prominent companies...[snip]...It only takes a significant merger, buyout, or business alliance to render any earlier estimate useless.

And then there was this Gordon Moore fella :)

I'm not really interested in predictions about companies, only about the technologies, no strings attached:

- How long will it take for ARM to overtake x86 on desktops?
- How long before quantum PCs appear on shop shelves?
- How many CPU cores will there be before we stop caring?
- How far will transistors scale down in the end?
- How thin will an iPhone 10 be?

Strike the last one; it was a joke :)
 
And then there was this Gordon Moore fella :)
I'm not really interested in predictions about companies, only about the technologies, no strings attached
They can be somewhat intertwined if a group of influential companies decides upon a co-operative course of action.
- How long will it take for ARM to overtake x86 on desktops?
I don't see it happening, TBH - at least not in the foreseeable future.
- How long before quantum PCs appear on shop shelves?
You mean a PC based upon this? The general rule of thumb is that it takes a technology 25-30 years to go from idea/outline to commercial commodity product (mobile phones, the internet, radio, and TV, for example), so don't bank on getting one for Christmas.
- How many CPU cores will there be before we stop caring?
At the rate that present code is optimized for parallelized computing? We're probably at (or near) that point now. It's really only in the enterprise sector, where code is optimized for specific systems to gain in performance per watt, that core scaling and efficiency seem to be prioritized.
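To put a rough number on those diminishing returns, here's a minimal Amdahl's law sketch - the 80% parallel fraction is my illustrative assumption, not a measured figure for any real workload:

```python
# Amdahl's law: speedup from n cores when a fraction p of the
# workload is parallelizable. The serial remainder (1 - p) caps
# the speedup no matter how many cores are added.

def amdahl_speedup(p: float, n: int) -> float:
    """Theoretical speedup for parallel fraction p on n cores."""
    return 1.0 / ((1.0 - p) + p / n)

# Hypothetical desktop workload: ~80% parallelizable.
for cores in (2, 4, 8, 16, 64, 1024):
    print(f"{cores:>5} cores -> {amdahl_speedup(0.80, cores):.2f}x speedup")

# The speedup approaches 1 / (1 - p) = 5x regardless of core count,
# which is why extra cores stop mattering for typical consumer code.
```

Going from 16 to 1024 cores only lifts that hypothetical workload from 4.0x to roughly 5.0x, which is the "we stop caring" point in practice.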
- How far will transistors scale down in the end?
Theoretical or commercial? Theoretically, sub-5nm nodes have been investigated and researched for a while (and this from 2006), and theoretically sub-nanometer (picometer) nodes can be demonstrated. The problem lies in tooling (development/building/cost/validation), production rate, energy requirements, and manufacturing defect rates. The smaller the node, the slower the production ramp.
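On the defect-rate point, a sketch using the standard Poisson yield model shows why ramps are slow on an immature node - the die area and defect densities below are hypothetical, chosen purely for illustration:

```python
import math

# Poisson yield model: the expected fraction of defect-free dies
# falls off exponentially with (die area x defect density), so a
# new node with a still-high defect density yields far fewer
# sellable chips per wafer than a mature one.

def poisson_yield(die_area_cm2: float, defects_per_cm2: float) -> float:
    """Expected fraction of defect-free dies."""
    return math.exp(-die_area_cm2 * defects_per_cm2)

# Same hypothetical 1 cm^2 die on a mature node vs. a fresh node.
print(f"Mature node (0.1 defects/cm^2): {poisson_yield(1.0, 0.1):.1%} yield")  # ~90.5%
print(f"New node    (0.5 defects/cm^2): {poisson_yield(1.0, 0.5):.1%} yield")  # ~60.7%
```

The exponential also punishes large dies disproportionately, which is one reason early products on a new node tend to be small chips.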
- How thin will an iPhone 10 be?
Strike the last one; it was a joke :)
Bonus prediction! With all the copycat smartphone vendors trying to outdo Apple, the iPhone 10 will go completely retro
[Attached image: a 1922 mobile phone]
 
You took it a little too seriously, and a bit too pessimistically. It takes a good imagination to look beyond the current trends ;)

- Talk of quantum computers hitting the mass market has been around for decades ;)
- The major hindrance to going smaller with transistors is increasing current leakage ;)
 
You took it a little too seriously, and a bit too pessimistically. It takes a good imagination to look beyond the current trends ;)
If we lived in a world where pure/basic research dictated the pace, I'd be a little more upbeat, but the fact is that return on investment is the prime mover. As I mentioned in the article, RCA basically stopped George Heilmeier's research into LCD technology fifty years ago at its New Jersey research centre to protect its cathode ray tube business - a pretty well-known example of how research can be killed by economic reality:
The schedule for the 1200-element integrated displays called for them to be operational by the end of the first quarter of 1968. This proved to be overly ambitious, not only for the technical difficulty, but also because of lack of support from the RCA Laboratories Integrated Circuit Center and the thin-film fabrication facility...[snip]...During 1969, RCA abandoned entirely the objective of making a liquid-crystal TV display, although other applications, e.g., watches, calculators, printers, automobile mirrors, etc., were pursued until 1972.
- Talk of quantum computers hitting the mass market has been around for decades ;)
People have been talking about cold fusion for 80-odd years as well. I was referring to the span from first practical demonstration through to commercialization. For example:
Smartphone: originally described in the early 1970s, first work done ~1990, first public demo in 1992, widespread adoption only within the last few years.
TV: first demonstration in 1926, widespread adoption by the mid-1950s.
Radio: first patents in 1872, widespread adoption in the 1920s.
Digital computer: developed in 1937, wider adoption in the late 1950s to early 1960s.
What's your prediction for commercial availability of quantum computers? I certainly don't have a mortgage on predictions and claim no industry insider knowledge, so your predictions are as valid as mine - predict away!
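For what it's worth, a quick tally of the gaps from the dates above - the ~2012 smartphone adoption year is my rough assumption, the rest are as given - shows most land near that 25-30 year window:

```python
# Idea/demo-to-adoption gaps for the examples above.
# The smartphone adoption year (~2012) is an assumption;
# the other dates come straight from the post.
milestones = {
    "Smartphone":       (1992, 2012),  # first public demo -> widespread adoption
    "TV":               (1926, 1955),
    "Radio":            (1872, 1925),
    "Digital computer": (1937, 1960),
}

for tech, (start, adopted) in milestones.items():
    print(f"{tech:<16} ~{adopted - start} years")

# Radio's ~53-year gap is the outlier, largely because its first
# patents predate a working public demonstration by decades.
```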
- The major hindrance to going smaller with transistors is increasing current leakage ;)
Hence the move to more esoteric rare-earth metal compounds.
 
Thank you for the effort that this series required of you; nicely done and thoroughly enjoyable.
I, too, would be interested in a PDF, not to mention "the extended version", LOL.
 
Great articles, great series, very good job!
Thanks a lot for this.
A lot of memories...

And this conclusion is very accurate:
"The next stage in computing history may just center on how we went from shaping our technology to how our technology shaped us."
We will see.
Greetings from Poland. Take care.
See ya.
 