The network vs. the computer

Bob O'Donnell


The history of the technology industry has seen several swings back and forth between dependence on a network that delivers the output of centralized computing resources and client devices that do most of the computing work on their own.

As we start to head towards the Gigabit LTE and then 5G era, when increasingly fast wide-area wireless networks make access to massive cloud-based computing resources significantly easier, there’s a fundamental question that must be asked. Do we still need powerful client devices?

Having just witnessed a demo of Gigabit LTE put on by Australian carrier Telstra, along with network equipment provider Ericsson, mobile router maker Netgear, and modem maker Qualcomm, the question is becoming increasingly relevant. Thanks to advancements in network and modem (specifically Category 16 LTE) technologies, the group demonstrated broadband download speeds of over 900 Mb/s (conveniently rounded up to 1 Gb/s) that Telstra will officially unveil in two weeks. Best of all, Gigabit LTE is expected to come to over 15 carriers around the world (including several in the US) before the end of 2017.


Looking forward, the promise of 5G is not only these faster download speeds, but also nearly instantaneous (1 millisecond) response times. This latter point, referred to as ultra low latency, is critical for understanding the real potential impact of future network technology developments like 5G. Even today, the lack of completely consistent, reliable network speeds is a key reason why we continue to need (and use) an array of devices with a great deal of local computing power.

Sure, today’s 4G and WiFi networks can be very fast and work well for many applications, but there isn’t the kind of time-sensitive prioritization of data on those networks that would allow them to be completely relied on for mission-critical applications. Plus, overloaded networks and other fairly common connectivity challenges lead to the kinds of buffering, stuttering and other problems with which we are all quite familiar. If 5G can live up to its promise, however, very fast and very consistent network performance with little to no latency will allow it to be reliably used for applications like autonomous driving, where milliseconds could mean lives.
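To make the "milliseconds could mean lives" point concrete, here is a small back-of-the-envelope sketch. The speed and latency figures are assumed example values (a 1 ms figure matching the 5G target mentioned above, and a 50 ms figure as a rough stand-in for typical 4G round trips), not measurements from the demo.

```python
# Hedged illustration: how far a vehicle travels during one network
# latency window, at an assumed highway speed.

def distance_per_latency_cm(speed_kmh: float, latency_ms: float) -> float:
    """Distance travelled (in cm) during a latency window of latency_ms."""
    speed_cm_per_ms = speed_kmh * 100_000 / 3_600_000  # km/h -> cm per ms
    return speed_cm_per_ms * latency_ms

# At 100 km/h:
print(distance_per_latency_cm(100, 1))   # ~2.8 cm at a 1 ms (5G target) latency
print(distance_per_latency_cm(100, 50))  # ~139 cm at a 50 ms (assumed 4G) latency
```

The difference between a few centimeters and well over a meter of travel before a remote system can even respond is why ultra low latency, not just raw bandwidth, matters for safety-critical uses.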

In fact, the speed and consistency of 5G could essentially turn cloud-based datacenters into the equivalent of directly-attached computing peripherals to our devices. Some of the throughput numbers from Gigabit LTE are now starting to match that of accessing local storage over an internal device connection, believe it or not. In other words, with these kinds of connection speeds, it’s essentially possible to make the cloud local.
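The storage comparison is easy to check with simple unit arithmetic. The sketch below converts the demoed 900 Mb/s link speed to megabytes per second; the hard-disk and SSD figures are assumed typical values for comparison, not numbers from the article.

```python
# Back-of-the-envelope comparison of link throughput vs. local storage.

def mbps_to_mb_per_s(megabits_per_second: float) -> float:
    """Convert megabits/s to megabytes/s (8 bits per byte)."""
    return megabits_per_second / 8

gigabit_lte = mbps_to_mb_per_s(900)  # the demoed speed -> 112.5 MB/s
sata_hdd = 120    # MB/s, assumed typical spinning disk
sata_ssd = 550    # MB/s, assumed typical SATA SSD

print(f"Gigabit LTE: {gigabit_lte:.1f} MB/s vs HDD {sata_hdd} / SSD {sata_ssd} MB/s")
```

Under these assumptions, the wireless link is already in the same range as a spinning disk, which is the sense in which the cloud starts to look "local."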


Given that the amount of computing power in these cloud-based datacenters will always dwarf what’s available in any given device, the question again arises, what happens to client devices? Can they be dramatically simplified into what’s called a “thin client” that does little more than display the results of what the cloud-based datacenters generate?

As logical as that may at first sound, history has shown that it’s never quite that simple. Certainly, in some environments and for some applications, that model has a great deal of promise. Just as we continue to see some companies use thin clients in place of PCs for things like call centers, remote workers and other similar environments, so too will we see certain applications where the need for local computing horsepower is very low.

In fact, smart speakers like the Amazon Echo and Google Home are modern-day thin clients that do very little computing locally and depend almost completely on a speedy network connection to a cloud-based datacenter to do their work.

When you start to dig a bit deeper into how these devices work, however, you start to realize why the notion of powerful computing clients will not only continue to exist, but likely even expand in the era of Gigabit LTE, 5G and even faster WiFi networks. In the case of something like an Echo, there are several tasks that must be done locally before any requests are sent to the cloud. First, you have to signify that you want it to listen, and then the audio needs to go through a pre-processing “cleanup” that helps ensure a more accurate response to what you’ve said.

Over time, those local steps are likely to increase, placing more demands on the local device. For example, having the ability to recognize who is speaking (speaker dependence) is a critical capability that will likely occur on the device. In addition, the ability to perform certain tasks without needing to access a network (such as locally controlling devices within your home), will drive demand for more local computing capability, particularly for AI-type applications like the natural language processing used by these devices.

AI-based computing requirements across several different applications, in fact, are likely going to drive computing demands on client devices for some time to come. From autonomous or assisted driving features on cars, to digital personal assistants on smartphones and PCs, the future will be filled with AI-based features across all our devices. Right now, most of the attention around AI has been in the datacenter because of the enormous computing requirements that it entails. Eventually, though, the ability to run more AI-based algorithms locally, a process often called inferencing, will be essential. Even more demanding tasks to build those algorithms, often called deep learning or machine learning, will continue to run in the data center. The results of those efforts will lead to the creation of more advanced inferencing algorithms, which can then be sent down to the local device in a virtuous cycle of AI development.
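The train-in-the-datacenter, infer-on-the-device split described above can be sketched in a few lines. This is a toy illustration under stated assumptions, not any real deployment API: a tiny perceptron stands in for "deep learning in the datacenter," and the only artifact shipped to the device is the learned parameters, which the device then uses for cheap local inferencing.

```python
# Toy sketch of the datacenter/device split. All names are illustrative.

def train_in_datacenter(samples, labels, epochs=20, lr=0.1):
    """Heavy step: learn the weights of a 2-input perceptron."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = y - pred
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b  # the only artifact that needs to reach the device

def infer_on_device(weights, bias, x):
    """Cheap step: a few multiply-adds, no network round trip needed."""
    return 1 if weights[0] * x[0] + weights[1] * x[1] + bias > 0 else 0

# The "datacenter" learns a simple AND-like decision boundary...
w, b = train_in_datacenter([(0, 0), (0, 1), (1, 0), (1, 1)], [0, 0, 0, 1])
# ...and the "device" classifies locally using the shipped weights.
print(infer_on_device(w, b, (1, 1)))  # 1
print(infer_on_device(w, b, (0, 1)))  # 0
```

Each retraining round in the datacenter produces new weights that can be pushed down to devices, which is the "virtuous cycle" the article describes.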

Admittedly, it can get a bit complicated to think through all of this, but the bottom line is that a future driven by a combination of fast networks and powerful computing devices working together offers the potential for some amazing applications. Early tech pioneer John Gage of Sun Microsystems famously argued that the network is the computer, but it increasingly looks like the computer is really the network and the sum of its connected powerful parts.

Bob O’Donnell is the founder and chief analyst of TECHnalysis Research, LLC, a technology consulting and market research firm. You can follow him on Twitter. This article was originally published on Tech.pinions.


 
"As we start to head towards the Gigabit LTE and then 5G era, when increasingly fast wide-area wireless networks make access to massive cloud-based computing resources significantly easier, there’s a fundamental question that must be asked. Do we still need powerful client devices?"

And the fundamental answer to anyone who actually cares about online security, data privacy and network redundancy is "yes"...
 
No matter how fast or low the latency of the connection, the fact that I must pay ~$1200 CAD a year for any kind of reasonable internet access and that it comes with a monthly data cap (their billing period) makes this whole discussion (for me, a regular home internet user) irrelevant. And that is not even considering the security aspect.
Without cheaper access, no cap and better security, the likes of which we have not yet seen, there is no sale to me.
But of course, I am only one type of 'connectivity' user.
 
In general these technologies are tailored for consumption: huge downloads, slow uploads. And in the wild they tend to be much, much slower because of the ISPs. The cost is crazy too.

I prefer to have my own servers to be honest.
 
"As we start to head towards the Gigabit LTE and then 5G era, when increasingly fast wide-area wireless networks make access to massive cloud-based computing resources significantly easier, there’s a fundamental question that must be asked. Do we still need powerful client devices?"

And the fundamental answer to anyone who actually cares about online security, data privacy and network redundancy is "yes"...

It's not just that though.

Yes, bandwidth will be increased for many cloud-related services, but as that happens there will also always be increasingly demanding applications that require a powerful client machine. It's like saying "well great, 5G enables everyone to stream HD on their mobile devices without a problem," except that it isn't even out, and by the time it is, 4K will be standard. As networks increase the amount of data they can deliver, so too will the amount of data people want to stream increase. While cloud may eventually replace a certain number of powerful mobile devices and tablets where the latency and disadvantages of cloud are acceptable, I really can't see people who are engaged in the smartphone spec war even considering cloud.

Let's not forget one of the major disadvantages of cloud computing: it still requires an app that has to run on the operating system of the client's system. That is, because it is not the operating system and must run at higher layers, you will always get complications and a performance disadvantage compared to, say, running the app natively on the device. Double that because you are essentially translating whatever you are sending from the cloud server to the internet and then to your device. The differences between the languages the server, internet, and your device each speak create bottlenecks.
 
No matter how fast or low the latency of the connection, the fact that I must pay ~$1200 CAD a year for any kind of reasonable internet access and that it comes with a monthly data cap (their billing period) makes this whole discussion (for me, a regular home internet user) irrelevant. And that is not even considering the security aspect.
Without cheaper access, no cap and better security, the likes of which we have not yet seen, there is no sale to me.
But of course, I am only one type of 'connectivity' user.

You bring up a really good point: internet data caps are the bane of cloud services. They don't really make sense either, as your ISP isn't actually providing the data, just the pipe. They don't incur notable extra costs based on your data usage and likely lose more money when the network is under-utilized. Data caps are asinine in that those supporting them, like the new FCC Chair Ajit Pai, try to compare them to water or electricity. The big difference is the ISP isn't providing the "water" or the "electricity", only the method by which it gets to you.

Canada is already in a worse position with data caps for many customers.
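The tension between gigabit speeds and capped plans is easy to quantify. The sketch below uses the 900 Mb/s figure from the article's demo; the 100 GB cap size is an assumed example, not a number from the comment.

```python
# How quickly a gigabit-class link can exhaust a monthly data cap.

def minutes_to_exhaust_cap(cap_gb: float, link_mbps: float) -> float:
    """Minutes of sustained full-rate downloading needed to hit the cap."""
    cap_megabits = cap_gb * 1000 * 8  # GB -> megabits (decimal units)
    return cap_megabits / link_mbps / 60

# An assumed 100 GB monthly cap at the demoed 900 Mb/s:
print(round(minutes_to_exhaust_cap(100, 900), 1))  # ~14.8 minutes
```

Under these assumptions, a capped subscriber could burn through an entire month's allowance in about a quarter of an hour, which is why caps and cloud-centric computing sit so awkwardly together.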
 
What if there is no reception?
Then isn't it equivalent to bringing a brick with me?

And as far as I know, 5G works at super high frequencies, meaning it's short-ranged and bad at penetrating walls, so... this article is a bit overly fantastical about the technology, maybe?
 
What if there is no reception?
Then isn't it equivalent to bringing a brick with me?

And as far as I know, 5G works at super high frequencies, meaning it's short-ranged and bad at penetrating walls, so... this article is a bit overly fantastical about the technology, maybe?

Yes, 5G does have problems going through walls and they are currently trying to sort that issue out.
 
So you want me to pay even more every month, on top of what I already pay, for cloud computing/services?
The only reason I have cloud "storage" now {Box, Dropbox} is because I received it free. I store additional backups there along with local storage in case I need something on the go.
And who's watching the data center? Thanks, but no thanks.
I'll secure my own files/info.
 
Yes, 5G does have problems going through walls and they are currently trying to sort that issue out.

Hundreds of distributed antennas everywhere, in every corner of a building, I guess, unless maybe they come up with some sort of quantum mechanics antenna :p ...in reality I think it will be switched modes like what's happening now, where the cell switches between 3G and 4G depending on reception. So anyhow, this dream of completely centralized mobile computing with a dumb phone that this article suggests will never happen.
 
Hundreds of distributed antennas everywhere, in every corner of a building, I guess, unless maybe they come up with some sort of quantum mechanics antenna :p ...in reality I think it will be switched modes like what's happening now, where the cell switches between 3G and 4G depending on reception. So anyhow, this dream of completely centralized mobile computing with a dumb phone that this article suggests will never happen.

Yeah, that was one of the proposed solutions. Even if we ignore the costs and physical complications of that, it would still be a problem in that anything thick enough can block the signal. I can't imagine having to set that up in a building and find every possible dead spot.
 