HPE debuts container platform for hybrid cloud implementations

Bob O'Donnell

In context: In the world of enterprise computing, few topics are as hot as hybrid cloud and cloud-native containerized applications. Practically every company that sells to enterprise IT now seems to have an offering and/or an angle that speaks directly to at least one, if not both, of those areas.

Most of the attention, of course, comes from software companies or the software divisions of larger conglomerates because of the critical role that software plays in enabling these technologies. As a company that’s been known almost exclusively for hardware over the last several years, Hewlett Packard Enterprise (HPE) was seemingly at a significant disadvantage—at least until its announcement this week at the KubeCon conference.

In a move that was both surprising and encouraging, the company debuted a new Kubernetes-based tool called the HPE Container Platform that it says will help organizations hasten their adoption of hybrid cloud architectures by, among other things, allowing legacy, non-native applications to be containerized and managed in a consistent fashion. Ever since Dell Technologies’ purchase of VMware, in particular, HPE has been seen by many as a company that understood and evangelized the concept of hybrid cloud but didn’t really have the tools to back up that vision. With its Container Platform, however, HPE now has what appears to be a solid set of software tools that will allow organizations to address some of their biggest challenges around legacy software modernization.

Unbeknownst to many, HPE has been acquiring a number of smaller software companies over the last few years, most notably BlueData and MapR. It’s the combination of those companies’ technologies, mixed in with a healthy dose of pure, open source Kubernetes, that gave HPE the software capabilities it apparently needed to build out this new hybrid cloud-friendly platform.

"Instead of presuming that everything would eventually move to the public cloud, there’s been a recognition that a hybrid computing model that supports both public cloud and on-premise private cloud is going to be with us as the mainstream option for many years to come."

As HPE and many other companies have pointed out—and the market itself has started to recognize—cloud-based software technologies and public cloud-style computing-as-a-service capabilities are incredibly powerful, but they don’t work for every type of application or every type of company. In fact, IaaS (Infrastructure as a Service) and PaaS (Platform as a Service) offerings represent only a small percentage of the workloads in most companies. Because of costs, regulation, complexity, data gravity (that is, the pull that large amounts of data exert on applications and services, much of that data having yet to migrate to the cloud because of storage costs and other factors), and, most importantly, the wealth of difficult-to-change legacy applications that still play a critical role in organizations, there has been a significant shift in thinking over the last 12-18 months.

Instead of presuming that everything would eventually move to the public cloud, there’s been a recognition that a hybrid computing model supporting both public cloud and on-premises private cloud is going to be the mainstream option for many years to come. In fact, a huge percentage of total computing workloads still have little, if any, connection to the cloud at all.

On the one hand, that recognition has brought a new sense of vigor to enterprise hardware companies like HPE, Dell Technologies, Lenovo, and Cisco, which many had essentially written off as dead a few years back, when the general thinking seemed to be that everything was going to move to the public cloud. On the other hand, those companies have taken lessons from the consumption-based business models of cloud computing (witness HPE’s GreenLake announcements from earlier in the year and Dell Technologies’ On Demand offering from just last week), as well as from the cloud-native software development model of containerized microservices. As HPE’s Phil Davis succinctly puts it, “The cloud is not a destination — it’s an experience and operating model.”

The end result is that organizations want to figure out how they can combine many of the benefits of that cloud-based operating model with the reality of their own on-premises hardware and legacy applications, while still meeting the unique requirements of those older applications. HPE’s Container Platform—which is expected to be available in early 2020—attempts to merge the two worlds by containerizing older applications without the long, painful, and expensive process of rewriting or refactoring them.

More importantly, the Container Platform can run those containerized legacy applications (as well as regular cloud-native containerized applications) on bare metal servers, without incurring the costs of running virtual machines—a clear knock at Dell Technologies, and more specifically VMware. The HPE Container Platform’s other twist is that it can automatically provide persistent storage to these containerized legacy apps. Many older apps need persistent storage to run properly, but that isn’t a capability containers natively offer. As a result, this one requirement has prevented many apps from being modernized and moved to the cloud. By directly addressing this need, HPE believes it can work with its base of customers—who are more likely to be running legacy applications anyway—to move them to a unified, container-based environment. That, in turn, should let them manage their applications in a consistent fashion, saving costs and reducing complexity for IT organizations.
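HPE hasn’t detailed the underlying mechanics, but the general pattern it describes maps onto standard Kubernetes primitives: a PersistentVolumeClaim requests durable storage that outlives any individual container, and a Deployment mounts that claim into the pod running the otherwise unmodified legacy app. The sketch below, using the official Kubernetes Python client, is purely illustrative of that generic pattern (not HPE’s actual API); the image name, claim size, namespace, and mount path are hypothetical.

```python
from kubernetes import client, config

# Load credentials from the local kubeconfig (assumes access to a running cluster).
config.load_kube_config()
core = client.CoreV1Api()
apps = client.AppsV1Api()

# 1. Request durable storage that persists independently of any single container.
pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="legacy-app-data"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        resources=client.V1ResourceRequirements(requests={"storage": "10Gi"}),
    ),
)
core.create_namespaced_persistent_volume_claim(namespace="default", body=pvc)

# 2. Run the containerized legacy app and mount the claim where it expects its data.
deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="legacy-app"),
    spec=client.V1DeploymentSpec(
        replicas=1,
        selector=client.V1LabelSelector(match_labels={"app": "legacy-app"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "legacy-app"}),
            spec=client.V1PodSpec(
                containers=[
                    client.V1Container(
                        name="legacy-app",
                        # Hypothetical image built from the existing, unmodified binary.
                        image="registry.example.com/legacy-app:1.0",
                        volume_mounts=[
                            client.V1VolumeMount(
                                name="data", mount_path="/var/lib/legacy-app"
                            )
                        ],
                    )
                ],
                volumes=[
                    client.V1Volume(
                        name="data",
                        persistent_volume_claim=client.V1PersistentVolumeClaimVolumeSource(
                            claim_name="legacy-app-data"
                        ),
                    )
                ],
            ),
        ),
    ),
)
apps.create_namespaced_deployment(namespace="default", body=deployment)
```

In practice, what backs that claim (an on-premises storage array, a local disk on a bare metal node, or a cloud volume) is determined by the cluster’s storage classes, which is presumably where a vendor platform like HPE’s adds its own plumbing.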

The logic and vision behind this new platform strategy are sound, and it’s encouraging to see HPE make a significant jump back into the software world. It remains to be seen, however, how well the company can convince potential customers of its software acumen and its ability to function as a key software platform provider. For certain customers, the capabilities of the HPE Container Platform seem very appealing, but the world of enterprise software is extremely complex and fragmented, and organizations with large existing investments in other platforms may have a harder time making a switch. Still, this looks like a strong strategic move by HPE and its management team, and one that’s clearly going to point the company in some interesting and exciting new directions.

Bob O’Donnell is the founder and chief analyst of TECHnalysis Research, LLC, a technology consulting and market research firm. You can follow him on Twitter. This article was originally published on Tech.pinions.
