Virtually everyone who closely watches the tech industry has heard venture capitalist Marc Andreessen's famous line about software "eating the world." The implication, of course, is that software plays the most important role in tech and that its capabilities are the only ones that really matter. There's the further suggestion that the only way to really make money in tech is with software.

While I won't disagree with the underlying principles, I am starting to wonder if what we've traditionally thought of as software will really continue to exist several years into the future. It's not that there won't be code running on hardware devices of all types, but the way it's packaged, sold, discussed, and even developed is on the cusp of some radical transformations.

In fact, there have already been substantial changes to the traditional types of software that were so dominant in the tech industry for decades: operating systems and applications.

Operating systems (OS's) used to be considered kings of the software hill. Not only did they sit at the heart of client devices, servers, and virtually every intelligent device ever created, they also enabled all-powerful ecosystems. It was their structure, rules, APIs, and other tools that allowed third-party companies to create the applications, utilities, add-ons, and other pieces of software that turned OS's into platforms.

While those structures remain in place, the world has evolved to include multiple important OS options. And though there are certainly meaningful differences between the OS choices on different types of devices, most application vendors have had to focus on what the platforms have in common rather than on those unique differences, leading to applications that run across multiple platforms. For this and many other reasons, platforms and specific operating systems have lost much of their value. Yes, they still serve an important purpose, but they are no longer the sole arbiters of what kinds of applications can be built.

Applications have also seen dramatic transformations. Gone are the days of large, monolithic applications that only run on certain platforms. They've been replaced by smaller "apps" that run across a variety of different platforms. From a business model perspective, we've gone from standalone applications costing hundreds of dollars, to single-digit-dollar mobile apps, to completely free apps that rely on services and subscriptions to make money.


Even in the world of large applications, there's been a dramatic shift to subscription-driven pricing, with Microsoft's Office 365 and Adobe's Creative Cloud being two of the most popular examples. Not all end users are excited about this model, but it seems clear that's where traditional applications are heading.

Service- and subscription-driven models have also come to mobile clients, servers, and other devices, as companies have realized that a continuous flow of smaller, recurring payments (as opposed to large lump-sum purchases) offers much more stable revenue.

Even the structure of software has changed, with large applications being broken down into smaller chunks that can act independently, but work together to provide the functionality of a full application. This notion of containers (or chunks of code that function as independent software objects) is particularly prevalent among cloud-based applications, but it's not hard to imagine it being applied to device-based applications as well. In addition to their other benefits, containers bring with them platform and physical location independence and portability, two key attributes that will be essential for new types of computing architectures---such as edge computing---which are widely expected to dramatically influence many future tech developments.
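To make the idea concrete, here is a minimal sketch of the kind of small, self-contained component a container typically packages: a single piece of application functionality exposed over the network, able to run on its own wherever it's deployed. The service and its names are hypothetical, not drawn from any particular product.

```python
# A minimal, hypothetical example of one small, independent service: it handles a
# single narrow task and can be packaged and deployed on its own, e.g. in a container.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class PriceQuoteHandler(BaseHTTPRequestHandler):
    """Answers one question -- a price quote -- independently of any larger application."""

    def do_GET(self):
        body = json.dumps({"item": "example", "price_usd": 9.99}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    # Listening on all interfaces lets the same code run unchanged whether it sits
    # directly on a device or inside a container on a cloud host.
    HTTPServer(("0.0.0.0", 8080), PriceQuoteHandler).serve_forever()
```

Because the component only talks to the outside world over a network interface, it doesn't care where it physically runs, which is exactly the portability that edge-style architectures depend on.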

Another benefit of containers is reusability, meaning they can be leveraged across multiple applications. While this is certainly interesting, it does start to raise questions around complexity and monetization for containers that don't yet have easy answers.

There are even growing questions about what really constitutes software as we know it. Technically, building voice-based "skills" for an Amazon Echo-based product is software design, but the manner in which people interact with skills is much different from how they've interacted with other types of software. As digital assistant models continue to evolve, the nature of how these component-like pieces are integrated into the assistant platform will also likely change. Plus, as with containers, though some new experiments have started, there are still serious questions about how this type of code can be monetized.
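The sketch below shows roughly what such a component-like piece looks like: the assistant hands the skill a structured description of what the user asked for, and the skill hands back a structured description of what to say. The intent name and wording are hypothetical, and this deliberately ignores the vendors' real SDKs.

```python
# A rough, hypothetical sketch of a voice "skill" handler: structured request in,
# structured spoken response out. Not tied to any particular SDK.
def handle_skill_request(request: dict) -> dict:
    """Map a recognized intent to a short spoken reply."""
    intent = request.get("request", {}).get("intent", {}).get("name", "")
    if intent == "GetStoreHoursIntent":  # hypothetical intent name
        speech = "We're open from nine to five today."
    else:
        speech = "Sorry, I didn't catch that."
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }


if __name__ == "__main__":
    sample = {"request": {"type": "IntentRequest",
                          "intent": {"name": "GetStoreHoursIntent"}}}
    print(handle_skill_request(sample))
```

Notice that there is no user interface here at all; the "experience" lives entirely in the assistant platform, which is part of why traditional software packaging and pricing don't map neatly onto skills.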

Finally, and most importantly, virtually everyone is adding Artificial Intelligence (AI) and machine learning capabilities into their software. Right now, many of these additions are relatively simple, pattern-recognition-based functions, but the future is likely to be driven by software that, in many ways, can start to rewrite itself as it learns these patterns and adjusts appropriately. This obviously marks a significant shift in the normal software development process, and it remains to be seen how companies will try to package and sell these capabilities.
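As a small illustration of that shift, consider the toy example below (it assumes the scikit-learn library and uses made-up data): the decision logic isn't written out as rules by a developer, it's learned from examples and changes whenever the model is retrained.

```python
# A minimal, hypothetical illustration of simple pattern recognition: the behavior
# is learned from example data rather than hand-coded. Requires scikit-learn.
from sklearn.linear_model import LogisticRegression

# Toy training data: weekly hours of product usage -> did the user renew (1) or not (0)?
usage_hours = [[1.0], [2.5], [4.0], [6.0], [8.5], [10.0]]
renewed = [0, 0, 0, 1, 1, 1]

model = LogisticRegression()
model.fit(usage_hours, renewed)

# The learned pattern, not a hand-written rule, now drives the prediction.
print(model.predict([[3.0], [9.0]]))  # e.g. [0 1]
```

Shipping this kind of capability means shipping (and continually updating) a trained model alongside the code, which is one reason it fits service-style delivery better than a one-time packaged sale.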


Taken together, the implications of all these software-related developments are profound. In fact, one could argue that software is being "eaten" by services. That's already occurring in several areas (think Software as a Service, or SaaS), and most future code-based capabilities will likely be delivered through some type of monetized service offering. While that may be appealing in some ways, there is a legitimate question about how many services any person, or any company, will be willing to sign up for. Particularly when each of these services carries a cost, we need to realistically recognize that this business model can only be taken so far.

Watching the tech industry evolve over the last several decades, it's fascinating to see how many pendulum swings occur across its different segments. From computing paradigms to semiconductor architectures to the role and balance between hardware, software, and services, it seems that what was once old can quickly become new again. In the case of software, which used to be bundled for free with early computing hardware, we may be coming full circle, with most code soon becoming little more than a means to sell services that leverage its capabilities. It certainly won't happen overnight, but the end of software as we know it may be sooner than we think.

Bob O'Donnell is the founder and chief analyst of TECHnalysis Research, LLC, a technology consulting and market research firm. You can follow him on Twitter. This article was originally published on Tech.pinions.