The future of computing...

SuperCheetah

I just got to thinking, and I was wondering what you guys think the future of computing will be like. By that I mean: will all computers be linked through networks? Will Internet2 ever come out and speed up the Net? Things like that.

Also, some hardware issues: where might future bottlenecks in computing occur (my guess is the hard drives), and where will all our favorite graphics cards go in the future as far as performance and design?

I personally believe that one day soon the TV will be replaced by the computer monitor, or vice versa, as the two become ever closer. I already watch TV shows and movies on my monitor, and I believe it is only a matter of time before this happens.

I would love to believe that some upstart operating system could compete with Windows: something like Linux, but as easy to use as Windows or Mac OS X. If I had my way, I would build an operating system purely for performance and networking, to accommodate gamers and the Internet, written as close to assembly code as possible to make everything faster. Just my opinion, though.

I also hope I'm not repeating a previous post by asking this, but I was just curious, so feel free to express your thoughts on where we're going with the Internet and computing in general.
 
Well, if I had to put my 2 cents in, it would be that Linux is going to develop into a consumer-based OS. Manufacturers are going to start seeing the value in Linux as Microsoft's legal woes continue and the cost of Windows keeps rising. I recently installed Linux on my computer, and it was an easier install than Windows. If game developers start supporting Linux, Microsoft will finally have some well-deserved competition.
 
Originally posted by SuperCheetah
I personally believe that one day soon the TV will be replaced by the computer monitor, or vice versa, as the two become ever closer. I already watch TV shows and movies on my monitor, and I believe it is only a matter of time before this happens.
HDTVs are basically monitors without the ability to change resolution.
 
Originally posted by SuperCheetah
where will all our favorite graphics cards go in the future as far as performance and design?

Computer graphics will hit the wall when one cannot tell the difference between what is fake and what is real. The next step may be projecting graphics in 3D instead of onto a 2D monitor. Holodeck, anyone?

Originally posted by SuperCheetah
I personally believe that one day soon the TV will be replaced by the computer monitor, or vice versa, as the two become ever closer. I already watch TV shows and movies on my monitor, and I believe it is only a matter of time before this happens.

Another interesting thing to note is that the human eye supposedly can no longer detect jaggies at resolutions around 4000x4000. I think it'll be a few years before we get to that resolution.
 
Well, with Serial ATA on the horizon and Serial ATA II already being promoted (the next generations of hard disk technology), I don't think hard disks will become a bottleneck anytime soon. Also, with ATA100 being fast enough at the moment and UltraATA133 recently released, the possibility of hard disks becoming the computing bottleneck looks slim. There are expensive server HDs available at 10k+ rpm with large caches that won't be considered slow for quite a few years yet (e.g. an 18.1GB Fujitsu 68-pin SCSI drive: 15,000rpm, 4.5ms seek, 8MB cache).
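
To put some rough numbers on that, here is a quick back-of-the-envelope Python sketch. The interface rates are the actual ATA specs; the sustained platter rate is my own ballpark assumption, not a quoted figure:

    # How long to read the whole 18.1GB drive mentioned above?
    ATA100_MBPS = 100    # UltraATA/100 interface rate, MB/s
    ATA133_MBPS = 133    # UltraATA/133 interface rate, MB/s
    PLATTER_MBPS = 50    # assumed sustained media rate of a fast 15k rpm drive

    drive_gb = 18.1
    for name, rate in [("ATA100 interface", ATA100_MBPS),
                       ("ATA133 interface", ATA133_MBPS),
                       ("platters (assumed)", PLATTER_MBPS)]:
        minutes = drive_gb * 1024 / rate / 60
        print(f"{name}: full drive read in ~{minutes:.1f} min")

The platters, not the cable, are the slow part, which is another reason the interface itself is unlikely to become the bottleneck for a while.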

As for Internet2, there are over one hundred universities utilising and testing this emerging technology, but there is no published plan (that I can find on the net) for when it might replace the existing Internet backbone....
I think the future is in optical computing. As detailed at the Intel Developer Forum, Intel has been looking into this as a way of speeding up processors while generating less heat, though it is still far off in the future. Another interesting topic brought up at the IDF was MEMS.
From AnandTech:

Intel has been experimenting quite a bit with what are known as Micro-electro Mechanical System (MEMS) devices which are essentially devices that rely on the physical properties of the silicon they're built on to mimic mechanical devices that exist today. Examples would be levers, capacitors, inductors, etc... all implemented in silicon through the use of these MEMS devices.

There are some interesting developments in computing that may come to fruition. But as with any competitive market, which will turn out to be Betamax technologies and which the successful VHS is anyone's guess right now ;)
 
We are currently using dotted decimal notation for IP addressing, e.g.:

IP 147.13.194.204
SUBNET MASK 255.255.255.0

The problem is, the addresses are running out. So we are gradually moving to IPv6 notation, which writes an address as eight 16-bit groups of hexadecimal digits separated by colons:

IP 2001:0db8:85a3:0000:0000:8a2e:0370:7334
PREFIX /64 (prefix lengths take the place of subnet masks)

This requires a new TCP/IP stack.... the IPv6 stack.

This will allow for a ridiculously greater number of IP addresses, and thus a MUCH greater number of devices connected to the Internet, or indeed the Internet2 that Arris mentioned.

We might see, for example, mobile phones with IP addresses (no biggie there), but also TVs, video recorders, and even such mundane items as the microwave oven (downloading better instructions from the manufacturer on exactly how to cook something) having their own IPs.
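
For a feel of just how much bigger the new address space is, here is a small Python sketch using the standard ipaddress module (the example address is from the reserved documentation range, not any real machine):

    import ipaddress

    # 32-bit IPv4 space versus 128-bit IPv6 space
    print(f"IPv4 addresses: {2**32:,}")      # about 4.3 billion
    print(f"IPv6 addresses: {2**128:.2e}")   # about 3.4 x 10^38

    # Parsing and expanding a shortened IPv6 address
    addr = ipaddress.ip_address("2001:db8::8a2e:370:7334")
    print(addr.exploded)  # 2001:0db8:0000:0000:0000:8a2e:0370:7334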



It's always thought that ultimately, processing power will reach a cul-de-sac where its usefulness diminishes, and that the same is true of storage space. I would disagree with that. Firstly, remember Bill Gates and his "640K of memory should be enough for anybody" prediction (actually, this is generally held to be an urban myth, but never mind). The point is that a MB was once thought to be an enormous amount of data, and so was a GB, and a TB. Already it's possible to build a home computer with a TB of storage space. What are you gonna use all that space for?

Well, back in my days of first installing Quake II onto my lovely shiny new 3GB HDD, that game robbed me of LOTS of precious space. Nowadays it's nothing for a game to be 650MB or so... but back then it was robbery. I imagine that software will continue to grow in size, as not just computer games but operating systems and applications grow in functionality and features. Imagine a computer game that LOOKS REAL, NEVER plays the same way twice, and incorporates proper artificial intelligence based on neural net technology.
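
Just for scale, a quick sum (assuming a nice round 650MB per game):

    # How many 650MB games fit on a 1TB drive?
    tb_in_mb = 1024 * 1024
    game_mb = 650
    print(tb_in_mb // game_mb, "games")  # about 1613 of them

And a game that looks real and never plays the same way twice would happily eat far more than 650MB.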

Which reminds me.... Many of my good friends are doing PhD studies in neural nets. Crudely speaking, this is an attempt to design a processor, or a computer, that computes in a way that's based on the operation of the human brain....

....Imagine a computer that learns HOW you like to operate it, without you really having to tell it. Imagine a computer that thinks of new ways of helping YOU, and only YOU, do what YOU want to get done. I mean, REALLY thinks about it, just for YOU. It becomes more like an extension of your own mind, and starts thinking a little like the way you do.
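
For anyone wondering what "learning" looks like at the smallest scale, here is a toy Python sketch of a single artificial neuron (a perceptron) learning the AND function from examples. It is purely illustrative, not anyone's actual research code:

    # A single artificial neuron learning AND from labelled examples.
    samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    w = [0.0, 0.0]  # input weights
    b = 0.0         # bias
    lr = 0.1        # learning rate

    for epoch in range(20):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            # nudge the weights toward the correct answer
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err

    for (x1, x2), target in samples:
        out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        print((x1, x2), "->", out, "want", target)

Nobody programs the rule in; the neuron converges on it by being corrected, and that basic idea is what the PhD work scales up.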

Work involving quantum computing, the holy grail of modern computing research, where computations are processed not on the electronic but on the quantum level, is also underway. We might create a computer more powerful than today's large Crays that is just the size of a pin head. Remember that computers far less powerful than even the simple CPUs of today used to fill whole rooms....

This might sound like sci-fi, but the sci-fi of yesterday is often the practical reality of today.

I also imagine that the future, in the way that it always does, will be far stranger than any of us imagine. An invisible computer casing, perhaps? Or one that has no mass??? Or one that's powered by anti-matter / matter reactions like the warp drive of the starship Enterprise....
 
Oh, and a little bird told me that a girl in the US was working on a laptop that cost $15 to produce....

How this was possible (when we are talking about building future generations of CPUs in space under zero-gravity conditions) I do not know....

Maybe it was made out of paper, or something. Or used clockwork....
 
Although neural networks with the ability to learn would be marvellous, I really think that optical and maybe even quantum computing will be ready for general real world use before this.

NASA scientists work to improve optical computing technology

From Scientific American: Breaking computing speed barriers:

All current computer device technologies are indeed limited by the speed of electron motion. This limitation is rather fundamental, because the fastest possible speed for information transmission is of course the speed of light, and the speed of an electron is already a substantial fraction of this. Where we hope for future improvements is not so much in the speed of computer devices as in the speed of computation. At first, these may sound like the same thing, until you realize that the number of computer device operations needed to perform a computation is determined by something else--namely, an algorithm.
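
A tiny Python sketch to illustrate that last point: the same search problem on the same machine costs wildly different numbers of steps depending on the algorithm chosen:

    import math

    n = 1_000_000  # items in a sorted list

    linear_steps = n                        # check every item, worst case
    binary_steps = math.ceil(math.log2(n))  # halve the range each step

    print(f"linear search: up to {linear_steps:,} comparisons")
    print(f"binary search: up to {binary_steps} comparisons")

Twenty comparisons versus a million, on identical hardware: that is the kind of headroom better algorithms give us even after the devices stop getting faster.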

From The International Society for Optical Engineering:

West Lafayette, Ind. | 16 January 2002 -- Advanced optics such as lasers, crystals, and holograms may work in concert with quantum theory to revolutionize computers in this century, promising tremendous speed and abilities, according to a new book.

Computers created within the next two decades could revolve around a technology in which holographic images process huge amounts of information, says the author, Purdue University physics Professor and SPIE member David D. Nolte.

In his book, Mind at Light Speed: A New Kind of Intelligence, Nolte describes how optics-based computer technologies may evolve over three generations during the next century.
 
 
I read something recently, and I don't remember where, but it was this:

You've got your laptop. It's all set up for your company's LAN with TCP/IP settings, printers, mapped drives, etc. Oops, now you have to go to another site and plug it into the network there.....

Problem? Everything has to be configured again.... Time to call that bad-tempered techie (probably me) to sort it out.... But, of course, he's on his lunch break, and you need the data NOW, and right NOW!

The solution? An intelligent LAN manager created by Intel. Just plug it in and watch it reconfigure itself, just like the way you plug in a USB device and it instantly starts working.
 
I asked Professor Colin Fyfe, a PhD supervisor at the University of Paisley in Scotland, who leads a team researching Artificial Neural Networks (i.e. networks that exist only in software), to say a few words about how he sees neural networks influencing the personal computer in time to come:


Well Phantasm66, artificial neural networks are one response to a crisis of confidence in standard AI, as it slowly dawned that traditional AI created a very brittle "intelligence": an intelligence that seemed very like human intelligence in some areas but that quickly collapsed when it faced problems outwith its very narrow area of expertise.

Artificial neural networks start with very simple building blocks and attempt to build an intelligence which is grounded in sensory perception. As computing becomes ubiquitous - fridges talking to phones talking to cookers talking to TVs, etc. - we require a holistic intelligence which is immersed in our form of life: 21st-century man with powerful computing facilities.

It is essential that our IT-based machines act with common sense, which means that they must have some form of knowledge of our type of lifestyle so that robust and intelligent decisions can be made. Also, as PCs have become much more powerful, the computational overheads associated with such powerful software become affordable. There will almost certainly, at least initially, have to be some interfacing with standard AI so that people will be able to quiz the reasons for the decisions taken, but later I suspect that we will just come to accept the decisions - after all, I accept that you are making intelligent decisions, yet I cannot look inside your head.

As to when, I guess that this will be demand driven - the more people wish to have ubiquitous computing, the more necessary it will be to create such machines.



I would like to take this opportunity to thank Colin for taking the time to talk to you mere mortals.

Colin's website is here:
http://cis.paisley.ac.uk/fyfe-ci0/

It's interesting to read some of the papers, if you think you have a chance in hell of understanding them. If you do, then I don't think Colin would mind if you e-mailed him to discuss his work.

There are also some cute pictures of his grandson on the site.
 
Thanks for the responses guys! I've already learned stuff I hadn't known before. Keep those posts coming and let's inform the less computer-inclined (including myself :) ) about where computers are headed!!!
 
Re:

Firstly, speeds and bandwidth will keep increasing... e.g. 1.8GHz... 2.2GHz... 3.3GHz... ATA66... ATA133... ATA333... DDR266... DDR333... DDR533. Things will become more complex. Soon your peripherals will no longer be restricted to Serial/Parallel/PCI/AGP/USB, but will use other new kinds of connections. In time, GB or even TB will seem tiny next to the amount of MP3s we download. 56k MAY not advance, but Cable/ADSL will, and you'll get them at unbelievably cheap prices. People will become more sophisticated too. A child of 5 MAY even know how to connect to and use the Internet. And someday, they may invent a chip to be placed in your head and used to pay for your day-to-day expenses.
 
I will join Homer Simpson as one of the first to be stamped Intel Inside and run on Microsoft Brain 2.0 (I said 2, not 1, k?)
 
I'm really interested in whether we could actually achieve those 3D projectors (or even the 3D computers in FF) from sci-fi movies and the Jetsons.
All I can think of is that projected light needs a screen of some kind, or else we won't be able to see it. So what would we use as the screen for these 3D projectors...
some kind of dust? Water? It just doesn't seem possible...
 
The Shape of things to come...

Originally posted by eddy05
Firstly, speeds and bandwidth will keep increasing... e.g. 1.8GHz... 2.2GHz... 3.3GHz... ATA66... ATA133... ATA333... DDR266... DDR333... DDR533.

I think that one of the main problems is that we keep increasing bandwidth, but what we really need to SERIOUSLY start looking into is improving latency.
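
To illustrate with made-up numbers (purely for the shape of the arithmetic, not measurements): the time to fetch something is roughly latency + size / bandwidth, so for small transfers the latency term dominates:

    def fetch_ms(size_kb, latency_ms, bandwidth_mbps):
        """Rough time in ms to fetch size_kb over a link."""
        transfer_ms = (size_kb * 8) / (bandwidth_mbps * 1000) * 1000
        return latency_ms + transfer_ms

    # A small 10KB web request over two hypothetical links:
    print(fetch_ms(10, latency_ms=150, bandwidth_mbps=100))  # fat pipe, slow ping: 150.8
    print(fetch_ms(10, latency_ms=10, bandwidth_mbps=1))     # thin pipe, fast ping: 90.0

The "thin pipe" wins the small request because its ping is lower, which is exactly why latency deserves more attention.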


A child of 5 MAY even know how to connect to and use the Internet.

My 5-year-old sister already can, after a fashion.
 
That's to say, as technology advances, people become more sophisticated. One of my friends had never used the Internet at her home before; then I gave her a 56k account, and she was awed by the ability to use the net. On the other hand, a 7-year-old boy completes the game Tomb Raider. This goes to show that youngsters are getting more sophisticated...

Speaking of the topic, would you all mind having the "chip" installed in your head?
 
Originally posted by Ai Hate
I do.
I have a thing against the thought of "improving the human species".

Partially agreed... if someone wants to change himself or herself, then go on... just don't expect me to do it...

And I have this little thingy against letting people know where I am and what I'm doing every second of my life...

Long live privacy... (not to be confused with piracy ;):D)
 
Hey!

No boot? No boot means no start.
You will not be able to start your computer, and therefore you cannot use it :D
 
Terabytes are the future of computing. Once we see everything expressed in terabytes, we'll be in the future :):) even RAM and video cards...
and maybe refresh rates of 1 terahertz, millions of terapixels, and dot pitches of only 0.1 or even negative... and more sci-fi stuff
 
Originally posted by Phantasm66
You will be assimilated! Resistance is futile!

www.theborgcollective.com

Well, as long as we have Picard and Data, I guess we'll be all right :D

As for the tera... I guess you're right... But then again, quite a few people are living in the future right now... I don't have the cash to run a machine with a terabyte of hard drive space, but quite a few do...

But I think it's too bad we're moving toward that amount of space... Programs today are bloated enough... If only we'd get some more programmers of the "old school"...
(A program doesn't have to use over 500MB...)

$0.02
 