The history of the technology industry has seen several swings back and forth between dependence on networks that deliver the output of centralized computing resources and client devices that do most of the computing work on their own.

As we head toward the Gigabit LTE and then 5G era, when increasingly fast wide-area wireless networks make access to massive cloud-based computing resources significantly easier, a fundamental question must be asked: do we still need powerful client devices?

Having just witnessed a demo of Gigabit LTE put on by Australian carrier Telstra, along with network equipment provider Ericsson, mobile router maker Netgear, and modem maker Qualcomm, I'd say the question is becoming increasingly relevant. Thanks to advances in network and modem technology (specifically Category 16 LTE), the group demonstrated download speeds of over 900 Mb/s (conveniently rounded up to 1 Gb/s), a service Telstra will officially unveil in two weeks. Best of all, Gigabit LTE is expected to come to more than 15 carriers around the world (including several in the US) before the end of 2017.
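
To get a rough sense of where the "gigabit" figure comes from, a back-of-the-envelope calculation helps. The sketch below assumes the commonly cited Category 16 configuration (three aggregated 20 MHz carriers, ten spatial layers via a mix of 4x4 and 2x2 MIMO, and 256-QAM modulation); the 25% overhead allowance is my own rough assumption, not a spec figure.

```python
# Back-of-the-envelope peak throughput for a Category 16 LTE downlink.
# Assumed configuration: 3x 20 MHz carrier aggregation, 4x4 MIMO on two
# carriers plus 2x2 on the third (10 spatial layers), 256-QAM modulation.

RESOURCE_BLOCKS = 100      # per 20 MHz LTE carrier (standard figure)
SUBCARRIERS_PER_RB = 12
SYMBOLS_PER_MS = 14        # one subframe, normal cyclic prefix
BITS_PER_SYMBOL = 8        # 256-QAM
TOTAL_LAYERS = 4 + 4 + 2   # spatial layers across the three carriers
OVERHEAD = 0.25            # rough allowance for control/reference signals

res_per_ms = RESOURCE_BLOCKS * SUBCARRIERS_PER_RB * SYMBOLS_PER_MS
raw_mbps = res_per_ms * BITS_PER_SYMBOL * TOTAL_LAYERS / 1e3  # bits/ms -> Mb/s
print(f"Raw peak: {raw_mbps:.0f} Mb/s")                        # ~1344 Mb/s
print(f"After overhead: {raw_mbps * (1 - OVERHEAD):.0f} Mb/s") # ~1008 Mb/s
```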

Looking forward, the promise of 5G is not only faster download speeds, but also nearly instantaneous (1 millisecond) response times. This latter point, referred to as ultra-low latency, is critical to understanding the real potential impact of future network technologies like 5G. Even today, the lack of completely consistent, reliable network speeds is a key reason we continue to need (and use) an array of devices with a great deal of local computing power.

Sure, today's 4G and WiFi networks can be very fast and work well for many applications, but there isn't the kind of time-sensitive prioritization of data on those networks that would allow them to be relied on completely for mission-critical applications. Plus, overloaded networks and other common connectivity challenges lead to the buffering, stuttering and other problems with which we are all quite familiar. If 5G can live up to its promise, however, very fast and very consistent network performance with little to no latency will allow it to be used reliably for applications like autonomous driving, where milliseconds could mean lives.
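
To make "milliseconds could mean lives" concrete, consider how far a car travels while waiting on a network round trip. The speed and latency figures below are illustrative assumptions, not measurements from the demo.

```python
# Distance a vehicle covers while waiting on the network, at various
# round-trip latencies. All figures are illustrative assumptions.

speed_kmh = 100                    # assumed highway speed
speed_m_per_ms = speed_kmh / 3600  # km/h -> meters per millisecond

for latency_ms in (1, 20, 100):    # 5G target / good 4G / congested network
    meters = speed_m_per_ms * latency_ms
    print(f"{latency_ms:>3} ms round trip -> {meters:.2f} m traveled blind")
```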

In fact, the speed and consistency of 5G could essentially turn cloud-based datacenters into the equivalent of directly attached computing peripherals for our devices. Believe it or not, some Gigabit LTE throughput numbers are now starting to match those of accessing local storage over an internal device connection. In other words, with these kinds of connection speeds, it's essentially possible to make the cloud local.
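
To put "making the cloud local" in perspective, convert a 1 Gb/s line rate into the megabytes-per-second figures usually quoted for storage. The storage numbers below are typical ballpark values, not measurements from the demo.

```python
# Compare a ~1 Gb/s wireless link with ballpark sequential-read speeds
# for common local storage (rough, commonly quoted figures).

link_MBps = 1000 / 8  # 1 Gb/s -> 125 MB/s

storage_MBps = {
    "Spinning hard disk": 150,
    "eMMC flash (phone/tablet)": 250,
    "SATA SSD": 550,
}

print(f"Gigabit LTE: ~{link_MBps:.0f} MB/s")
for name, speed in storage_MBps.items():
    print(f"{name}: ~{speed} MB/s ({speed / link_MBps:.1f}x the link)")
```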

Given that the amount of computing power in these cloud-based datacenters will always dwarf what's available in any given device, the question again arises: what happens to client devices? Can they be dramatically simplified into what's called a "thin client" that does little more than display the results of what the cloud-based datacenters generate?

As logical as that may sound at first, history has shown that it's never quite that simple. Certainly, in some environments and for some applications, that model has a great deal of promise. Just as we continue to see some companies use thin clients in place of PCs for call centers, remote workers and other similar environments, so too will we see certain applications where the need for local computing horsepower is very low.

In fact, smart speakers like the Amazon Echo and Google Home are modern-day thin clients that do very little computing locally and depend almost completely on a speedy network connection to a cloud-based datacenter to do their work.

When you start to dig a bit deeper into how these devices work, however, you start to realize why powerful computing clients will not only continue to exist, but will likely even expand in the era of Gigabit LTE, 5G and even faster WiFi networks. In the case of something like an Echo, several tasks must be done locally before any request is sent to the cloud. First, the device has to detect the wake word signifying that you want it to listen, and then the audio needs to go through a pre-processing "cleanup" that helps ensure a more accurate response to what you've said.
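
In rough pseudocode, that local front end looks something like the toy sketch below. The function names and text-based "signal processing" are hypothetical stand-ins for illustration, not Amazon's or Google's actual APIs.

```python
# Toy sketch of a smart speaker's on-device front end. The stages mirror
# the ones described in the text; the logic is deliberately trivial so
# the example runs as-is.

WAKE_WORD = "alexa"  # hypothetical trigger; real devices match audio, not text

def detect_wake_word(frame: str) -> bool:
    # Step 1: wake-word spotting runs continuously and entirely on-device,
    # so no audio leaves the speaker until the user opts in by saying it.
    return WAKE_WORD in frame.lower()

def preprocess(frame: str) -> str:
    # Step 2: the local "cleanup" pass. On real hardware this is echo
    # cancellation, beamforming, and noise suppression on audio buffers;
    # here we just strip a marker so the example stays runnable.
    return frame.replace("[noise]", "").strip()

def send_to_cloud(utterance: str) -> str:
    # Step 3: only the cleaned-up request goes to the datacenter, where
    # the heavyweight speech recognition and language processing run.
    return f"cloud response to: '{utterance}'"

listening = False
for frame in ["[noise] hello?", "alexa", "what's the weather [noise]"]:
    if not listening:
        listening = detect_wake_word(frame)
    else:
        print(send_to_cloud(preprocess(frame)))
        listening = False
```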

Over time, those local steps are likely to increase, placing more demands on the local device. For example, the ability to recognize who is speaking (speaker recognition) is a critical capability that will likely run on the device. In addition, the ability to perform certain tasks without needing to access a network at all (such as locally controlling devices within your home) will drive demand for more local computing capability, particularly for AI-style workloads like the natural language processing these devices depend on.
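
One way to picture that shift is a local "router" on the device that answers what it can by itself, even with the network down, and forwards only the rest. The intents and the text-tag speaker check below are hypothetical illustrations, not any vendor's real design.

```python
# Toy sketch of on-device request routing: handle known commands locally
# (even offline), forward everything else to the cloud.

LOCAL_INTENTS = {
    "turn on the lights": "lights: ON (handled locally, no network needed)",
    "set a timer": "timer: started (handled locally)",
}

def recognize_speaker(utterance: str) -> str:
    # Stand-in for on-device speaker recognition; a real device would
    # match a stored voice profile, not a text tag like this.
    return "alice" if utterance.startswith("[alice]") else "unknown"

def route(utterance: str, network_up: bool) -> str:
    speaker = recognize_speaker(utterance)
    text = utterance.replace("[alice]", "").strip()
    if text in LOCAL_INTENTS:          # on-device inference path
        return f"[{speaker}] {LOCAL_INTENTS[text]}"
    if network_up:                     # everything else goes to the cloud
        return f"[{speaker}] forwarded to cloud: '{text}'"
    return f"[{speaker}] sorry, that request needs a network connection"

print(route("[alice] turn on the lights", network_up=False))
print(route("what's the capital of Australia", network_up=True))
```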

In fact, AI-based computing requirements across several different applications are likely to drive computing demands on client devices for some time to come. From autonomous or assisted driving features in cars to digital personal assistants on smartphones and PCs, the future will be filled with AI-based features across all our devices. Right now, most of the attention around AI has focused on the datacenter because of the enormous computing requirements involved. Eventually, though, the ability to run more AI-based algorithms locally, a process often called inferencing, will be essential. The even more demanding work of building those algorithms, often called deep learning or machine learning, will continue to run in the datacenter. The results of those efforts will lead to more advanced inferencing algorithms, which can then be sent down to local devices in a virtuous cycle of AI development.
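
That division of labor can be sketched as a simple loop: train in the datacenter, push the result to devices, run inference locally, and feed what comes back into the next round. The "model" below is a single threshold standing in for real network weights, purely for illustration.

```python
# Illustrative sketch of the training/inferencing split described above.
# "Training" is deliberately trivial (an average) so the loop runs as-is.

def datacenter_train(samples: list[float]) -> float:
    # The heavy lifting (deep/machine learning) stays in the datacenter.
    return sum(samples) / len(samples)

def device_infer(model: float, reading: float) -> bool:
    # Lightweight local inferencing: no cloud round trip required.
    return reading > model

model = datacenter_train([1.0, 2.0, 3.0])   # train in the cloud
telemetry = []
for reading in [0.5, 2.5, 4.0]:             # device acts on its own
    action = "act" if device_infer(model, reading) else "ignore"
    print(f"reading {reading}: {action}")
    telemetry.append(reading)
model = datacenter_train(telemetry)         # virtuous cycle: retrain, redeploy
```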

Admittedly, it can get a bit complicated to think through all of this, but the bottom line is that a future driven by fast networks and powerful computing devices working together offers the potential for some amazing applications. Early tech pioneer John Gage of Sun Microsystems famously argued that the network is the computer, but it increasingly looks like the computer is really the network and the sum of its connected, powerful parts.

Bob O'Donnell is the founder and chief analyst of TECHnalysis Research, LLC, a technology consulting and market research firm. You can follow him on Twitter. This article was originally published on Tech.pinions.