TechSpot

The real software revolution is in the data center

By Jos
Sep 1, 2015
    Sometimes, behind-the-scenes work is more important than the up-front stars. Just ask anyone who’s worked on a special-effects-laden movie or other video project.

    In the case of today’s tech business, the “stars” are mobile devices like smartphones and all the capabilities available with them. The real work, however, is happening behind the scenes in massive data centers powering all the services and applications that bring our mobile and other computing devices to life.

    Much of the tech world and press focus almost solely on the “stars.” Of course, there are some good reasons for this bias. It’s hard not to notice how quickly many people’s eyes glaze over as soon as a phrase like “data center” is uttered in polite company. For many, it’s just plain boring and, even for interested parties, it can be an extraordinarily complex topic.

    Nevertheless, there are some key topics and advancements that are not only worthy of, but frankly in need of, some discussion. One of the biggest data center topics is virtualization. Essentially, virtualization means the ability to run software at a layer that sits above direct contact with the hardware—a process sometimes called hardware abstraction.

    Practically speaking, virtualization allows computing devices to do multiple independent things—not just multitasking, but simultaneously running multiple operating systems or functioning as the equivalent of several independent devices. On PCs, few people need this capability, so it’s not a huge market. In servers, however, it’s absolutely essential and has completely revolutionized the architecture of today’s data centers.
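To make the abstraction concrete, here is a toy Python sketch of the idea (not real hypervisor code; the Guest and Hypervisor classes are invented purely for illustration): one physical host runs several isolated guest machines, each with its own private state, with the hypervisor layer sitting between the guests and the hardware.

```python
# Toy model of server virtualization: one physical host runs several
# isolated "guest" machines, each behaving as if it owns the hardware.
# Illustrative sketch only; hypothetical classes, not a real hypervisor.

class Guest:
    def __init__(self, name, os_name):
        self.name = name
        self.os_name = os_name
        self.memory = {}          # private state; guests cannot see each other's

    def run(self, key, value):
        self.memory[key] = value  # a guest "instruction" touching its own memory
        return f"{self.name} ({self.os_name}) wrote {key}"

class Hypervisor:
    """The abstraction layer between guests and the physical hardware."""
    def __init__(self, physical_ram_gb):
        self.physical_ram_gb = physical_ram_gb
        self.guests = []

    def create_guest(self, name, os_name):
        guest = Guest(name, os_name)
        self.guests.append(guest)
        return guest

    def schedule(self):
        # Time-slice: give each guest a turn on the shared hardware.
        return [g.run("tick", 1) for g in self.guests]

host = Hypervisor(physical_ram_gb=256)
host.create_guest("vm-web", "Linux")
host.create_guest("vm-db", "Windows Server")
print(host.schedule())
```

The key design point the sketch mirrors is isolation: each guest keeps its own state and never touches another guest's, while the hypervisor alone decides who gets the shared hardware and when.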

    VMware, whose big VMworld trade show is happening this week, popularized this development about 14 years ago with virtualization on the server itself. Since then, there have been dramatic improvements to the technology by VMware, Citrix, Microsoft, and others. Even more importantly, virtualization has expanded to other devices as well.

    Desktop virtualization—sometimes called VDI (Virtual Desktop Infrastructure)—enables a single server to provide multiple independent desktops (complete with operating systems and applications) to hundreds of connected devices. In a classic case of what’s old is new again, this mainframe-like computing model has now become a mainstream part of how computing gets done in business.

    Unlike the mainframe model, however, today’s iterations can offer workstation-quality graphics, thanks to technologies like Nvidia’s new GRID 2.0 architecture. Not only does GRID 2.0 enable GPUs in today’s servers to be virtualized, it allows multiple GPUs to work simultaneously on a single task, offering performance beyond that of a single workstation.

    In addition, today’s version of virtualized desktops works with much more than desktop PCs or traditional thin clients. In fact, many of today’s biggest mobile applications—such as mapping, personal assistants, and more—are leveraging the same virtualization-driven, cloud-based computing models and turning all of our smartphones into thin clients. That’s why these data center developments are so critical for today’s mobile devices.

    Another key data center technology development is called hyperconvergence. In a sense, you can think of hyperconvergence as taking virtualization to its logical extreme, because it involves organizing all of the separate components found in a data center—servers, large Storage Area Networks (SANs), networking routers, etc.—and turning them into a single logical unit that comes under software control.
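The "single logical unit under software control" idea can be sketched as a toy model (the ResourcePool and ControlPlane classes below are invented for illustration; real hyperconverged systems are vastly more involved): separate physical resources are pooled, and one software layer carves them up for workloads.

```python
# Toy sketch of hyperconvergence: separate physical resources (compute,
# storage, network) are pooled and allocated by a single software
# control plane. Illustrative only; hypothetical classes and names.

class ResourcePool:
    def __init__(self, cpus, storage_tb, network_gbps):
        self.free = {"cpus": cpus, "storage_tb": storage_tb,
                     "network_gbps": network_gbps}

class ControlPlane:
    """Software-defined layer that treats the pool as one logical unit."""
    def __init__(self, pool):
        self.pool = pool
        self.workloads = {}

    def provision(self, name, **needs):
        # Allocate from the shared pool instead of from dedicated
        # hardware silos; refuse if the pool can't cover the request.
        if any(self.pool.free[k] < v for k, v in needs.items()):
            return False
        for k, v in needs.items():
            self.pool.free[k] -= v
        self.workloads[name] = needs
        return True

pool = ResourcePool(cpus=128, storage_tb=500, network_gbps=100)
plane = ControlPlane(pool)
plane.provision("analytics", cpus=32, storage_tb=100)
plane.provision("web-tier", cpus=16, network_gbps=10)
print(plane.pool.free)
```

The design choice the sketch highlights is that workloads request capacity, not specific boxes: the control plane decides where resources come from, which is what lets commoditized hardware be managed in a unified way.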

    Like virtualization, hyperconvergence isn’t a brand-new technology—although it’s only been around for a few years—but there are some key new developments that are becoming critical for today’s consumers and business users to understand. And, once again, there’s a cloud computing connection. In fact, the whole concept of hyperconvergence was largely popularized by megasites like Google, Amazon, and Facebook, which quickly realized that traditional data center architectures didn’t meet their rapidly expanding needs.

    As a result, these companies started to create their own commoditized hardware components and build powerful software to control all of it in a unified way. Now, companies of all types and sizes are looking to create these kinds of powerful yet flexible data center architectures, which will help them power the next generation of services and applications to inform, entertain, educate, and transact with us.

    To enable those capabilities, traditional server vendors like Dell, HP, and Lenovo are partnering with smaller software-focused companies like Pivot3 and others to deliver hyperconverged data center appliances, which integrate all elements of a data center into a single box. The idea is to offer much simpler solutions that are significantly easier (and less expensive) to manage.

    Data center technologies may have bewildering names, but they are playing increasingly essential roles for all the devices and services—both consumer and commercial—that keep us engaged every day. They certainly aren’t as sexy as mobile apps, but they’re at the heart of a revolution that’s bound to be much longer lasting.

    Bob O’Donnell is the founder and chief analyst of TECHnalysis Research, LLC, a technology consulting and market research firm. You can follow him on Twitter. This article was originally published on Tech.pinions.

