History of the Personal Computer, Part 5: Computing goes mainstream, mobile, ubiquitous

By Jos
Oct 15, 2014
  1. Jos

    The microprocessor made personal computing possible by opening the door to more affordable machines with a smaller footprint. The 1970s supplied the hardware base, the 80s introduced economies of scale, while the 90s expanded the range of devices and accessible user interfaces.

    The new millennium brought a closer relationship between people and computers. Increasingly portable devices became the conduit for humans' basic need to connect. It's no surprise that computers transitioned from productivity tool to indispensable companion as connectivity proliferated.

    This is the fifth and final installment in the series, where we look at the history of the microprocessor and personal computing, from the invention of the transistor to the modern-day chips powering our connected devices.

    Read the complete article.

  2. Steve

    Steve TechSpot Editor Posts: 2,184   +1,215

    Great series, wish there could be more!

    I will check back in a few years once some more history has taken place :)

    Thanks again for all the great work.
    dividebyzero and Julio Franco like this.
  3. gregzeng

    gregzeng TS Enthusiast Posts: 30

    Decades ago I read an exciting paperback "true story". I tried to locate it by title, but searching "a.m.d. unofficial biography history computer" on Google did not give me the results I wanted.

    Can readers suggest other exciting "history of computing, true stories"?
  4. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    Cheers Steve. I didn't realize how much history was involved until I had to get it ordered into a semblance of a story. The series could easily have been many times longer and still not captured every major twist and turn of the industry.

    I used these titles, amongst others, as reference material for the article series, and all are worth a read.

    Inside Intel by Tim Jackson
    The Intel Trinity by Michael S. Malone
    History of Semiconductor Engineering by Bo Lojek
    Computers: A Life Story of a Technology by Eric G. Swedin and David L. Ferro
    Makers of the Microchip: A Documentary History of Fairchild Semiconductor by David C. Brock and Jay Last
    Fabless: The Transformation of the Semiconductor Business by Daniel Nenni and Paul McLellan
    Foundations of Computer Technology by Alexander John Anderson
    Phr3d, Jos and Julio Franco like this.
  5. Really enjoyed this series. I was wondering if you'd be willing to provide the articles in ebook form or pdf. I'd love to be able to read this in a few decades, or pass it on to my kids if I ever have any!
    Julio Franco likes this.
  6. VitalyT

    VitalyT Russ-Puss Posts: 3,105   +1,375

    What about part 6, about the most current products and the ones expected in the near future?

    Dare to make a prognosis?
    dividebyzero likes this.
  7. Thank you for a great series. It brought up a lot of memories.
    dividebyzero and Julio Franco like this.
  8. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    The only problem with making predictions is that, as in the past, the landscape is (and will be) littered with the remains of once-prominent companies - companies that some people have formed an attachment to. People find comfort in continuity, and many are averse to change in general, so feedback tends towards the negative. I'm happy enough to make predictions (as my forum postings would show - this from 3 years ago regarding AMD's CPU future, for example - I won't link to the actual thread since I was rather cynical regarding another tech site, but I'm happy to supply the link via PM), but their shelf life can be incredibly short, so they may not be worthy of the archive status that a pure history article would garner. It only takes a significant merger/buyout/business alliance to render any earlier estimate useless.

    As a standalone article it would fly, as would the asterisks in the forum comments, I suspect. :D
    Phr3d and VitalyT like this.
  9. VitalyT

    VitalyT Russ-Puss Posts: 3,105   +1,375

    And then there was this Gordon Moore fella :)

    I'm not really interested in predictions about companies, only about the technologies, no strings attached:

    - How long will it take for ARM to overtake x86 on desktops?
    - How long before quantum PCs appear on shop shelves?
    - How many CPU cores will there be until we stop caring?
    - How far will transistors scale down in the end?
    - How thin will an iPhone 10 be?

    Strike the last one, it was a joke :)
  10. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    They can be somewhat intertwined if a group of influential companies decides upon a co-operative course of action.
    I don't see it happening TBH, at least not in the foreseeable future.
    You mean a PC based upon this? The general rule of thumb is that it takes a technology 25-30 years to go from idea/outline to commercial commodity product (mobile phones, the internet, radio, TV, for example), so don't bank on getting one for Christmas.
    At the rate that present code is optimized for parallelized computing? We're probably at (or near) that point now. It's really only in the enterprise sector, where code is optimized for specific systems to gain in the performance-per-watt metric, that core scaling and efficiency seem to be prioritized.
    Theoretical or commercial? Theoretically, sub-5nm nodes have been investigated and researched for a while (and this from 2006), and theoretically sub-nanometer (picometer) nodes can be demonstrated. The problem lies in tooling (development/building/cost/validation), production rate, energy requirements, and manufacturing defect rates. The smaller the node, the slower the production ramp.
    Bonus prediction! With all the copycat smartphone vendors trying to outdo Apple, the iPhone 10 will go completely retro.
    VitalyT likes this.
  11. VitalyT

    VitalyT Russ-Puss Posts: 3,105   +1,375

    You took it a little too seriously, and also too pessimistically. It takes a bit of imagination to look beyond the current trends ;)

    - Talk of quantum computers hitting the mass market has been around for decades ;)
    - The major hindrance to smaller transistors is increasing current leakage ;)
  12. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    If we lived in a world where pure/basic research dictated the pace, I'd be a little more upbeat, but the fact is that return on investment is the prime mover. As I mentioned in the article, RCA basically stopped George Heilmeier's research into LCD technology fifty years ago at their New Jersey research centre to protect their cathode ray tube business - a pretty well-known example of how research can be killed by economic reality.
    People have been talking about cold fusion for 80-odd years as well. I was referring to the span from first practical demonstration through to commercialization. For example:
    Smartphone: originally described in the early 1970s, first work done ~1990, first public demo 1992, widespread adoption only within the last few years.
    TV: first demonstration 1926, widespread adoption mid-1950s.
    Radio: first patents 1872, widespread adoption 1920s.
    Digital computer: developed 1937, wider adoption late 1950s to early 1960s.
    What's your prediction for commercial availability of quantum computers? I certainly don't have a mortgage on predictions and claim no industry insider knowledge, so your predictions are as valid as mine - predict away!
    Hence the move to more esoteric rare earth metal compounds.
  13. VitalyT

    VitalyT Russ-Puss Posts: 3,105   +1,375

    Graphene, while seemingly a solution, isn't particularly esoteric ;)
  14. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    Since I specifically referenced "rare earth metal compounds" - which don't have any direct relationship to graphene - I'll just assume you're trolling at this point.
    I was obviously referring to current and near-future rare metal oxide technology.
  15. Phr3d

    Phr3d TS Booster Posts: 212   +38

    Thank you for the effort this series required of you - nicely done and thoroughly enjoyable.
    I, too, would be interested in a PDF, not to mention 'the Extended version', LOL.
  16. Solmyr

    Solmyr TS Rookie

    Great articles, great series, very good job!
    Thanks a lot for this.
    A lot of memories....

    And the conclusion is very accurate:
    "The next stage in computing history may just center on how we went from shaping our technology to how our technology shaped us."
    We will see.
    Greetings from Poland. Take care.
    See Ya.
