The road to 44 trillion gigabytes: Software-defined datacenters and seeding the cloud

Julio Franco


When looking at the near future of data center technology, there are two very important trends to consider. First, the adoption of public and private cloud computing continues to grow more pervasive. Enterprises, software developers, and home users alike are transitioning to cloud-based models for services and storage.

Second, devices, data, and network demand are all projected to grow at explosive rates over the next few years. By 2020, the digital universe (the data we create and copy annually) will reach 44 zettabytes, or 44 trillion gigabytes. An estimated 24 billion IP-connected devices will be online, up from 13 billion in 2013. Network expansion is expected to more than triple in that same timeframe, from 63 million new server ports to 206 million.
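Taken at face value, those projections imply steady compound growth. A quick back-of-the-envelope check in Python (the 2013 to 2020 window and decimal units are assumptions on my part, not figures from the source):

```python
# Sanity-check of the projections cited above. The endpoints come from
# the article; the 7-year window and decimal units are assumptions.
ZB_IN_GB = 10**12                                  # 1 ZB = 10^12 GB (decimal)
print(f"44 ZB = {44 * ZB_IN_GB:.1e} GB")           # 4.4e+13, i.e. 44 trillion GB

years = 2020 - 2013

device_growth = (24e9 / 13e9) ** (1 / years) - 1   # 13B -> 24B devices
print(f"devices: ~{device_growth:.1%} per year")   # ~9.2%

port_growth = (206e6 / 63e6) ** (1 / years) - 1    # 63M -> 206M server ports
print(f"ports: ~{port_growth:.1%} per year")       # ~18.4%
```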

It is these looming demand increases in particular that are driving the development of the software-defined datacenter (SDDC), which can deliver cloud-based services with optimized capacity, efficiency, and flexibility.

The evolution of the SDDC is enabled by the standardization of IT hardware infrastructure. Core IT resources (compute, network, and storage) are abstracted from the underlying hardware and gathered into resource-specific pools that can potentially span multiple physical locations.

These virtualized resources are overlaid with advanced management capabilities that allow computing cycles, storage, and network bandwidth to be allocated on demand and at scale to meet specific software requirements. Automated provisioning and orchestration functions boost the efficiency of cloud-based applications while reducing the burden on IT.
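As a rough illustration of what allocation from resource-specific pools might look like to the management layer, here is a minimal Python sketch. The pool names, capacities, and Orchestrator API are hypothetical, invented for this example rather than taken from any vendor:

```python
from dataclasses import dataclass, field

@dataclass
class ResourcePool:
    """One resource-specific pool (compute, storage, or network),
    abstracted from whichever physical hardware backs it."""
    name: str
    capacity: float          # e.g. vCPUs, TB, or Gbps
    allocated: float = 0.0

    def allocate(self, amount: float) -> None:
        if self.allocated + amount > self.capacity:
            raise RuntimeError(f"{self.name} pool exhausted")
        self.allocated += amount

    def release(self, amount: float) -> None:
        self.allocated = max(0.0, self.allocated - amount)

@dataclass
class Orchestrator:
    """Management overlay: provisions from all three pools at once and
    logs usage per business unit as a basis for internal billing."""
    pools: dict
    usage_log: list = field(default_factory=list)

    def provision(self, app: str, unit: str, demand: dict) -> None:
        for resource, amount in demand.items():
            self.pools[resource].allocate(amount)
        self.usage_log.append((unit, app, demand))

# An app team requests capacity without caring which hardware serves it.
dc = Orchestrator(pools={
    "compute": ResourcePool("compute", capacity=10_000),  # vCPUs
    "storage": ResourcePool("storage", capacity=5_000),   # TB
    "network": ResourcePool("network", capacity=800),     # Gbps
})
dc.provision("web-frontend", unit="marketing",
             demand={"compute": 64, "storage": 2, "network": 10})
```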

Software-defined datacenters will enable IT administrators to efficiently allocate resources on demand and track usage for ease of billing to internal business units. They also offer developers potential time-to-market advantages: new software can be released quickly, with capacity rapidly scaled up and down to match demand over an application's natural lifecycle.

How does SDDC work?

The primary technical objective of the SDDC is to virtualize the three main component silos of the traditional IT infrastructure stack (compute, network, and storage) into shared pools, with the ability to scale across these components as needed.

This re-imagined data center architecture will allow IT managers to deploy hardware resources in support of applications and more effectively manage the lifecycles of individual hardware components, without ever disrupting application uptime.
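To make managing hardware lifecycles without disrupting uptime concrete, the sketch below drains a host that is due for replacement by shifting its workloads to spare capacity elsewhere in the pool. The host names and the migration step are hypothetical; a real platform would use live migration so applications never notice:

```python
# Hypothetical sketch: retire a host without taking its workloads down.
hosts = {
    "host-a": {"capacity": 32, "workloads": {"db": 16, "cache": 8}},
    "host-b": {"capacity": 32, "workloads": {"web": 8}},
    "host-c": {"capacity": 32, "workloads": {}},
}

def free_capacity(host: dict) -> int:
    return host["capacity"] - sum(host["workloads"].values())

def drain(name: str) -> None:
    """Migrate every workload off `name` so the hardware can be swapped.
    A real SDDC would do this as a live migration, invisible to the app."""
    source = hosts[name]
    for workload, size in list(source["workloads"].items()):
        # Pick the least-loaded remaining host as the migration target.
        target = max((h for n, h in hosts.items() if n != name),
                     key=free_capacity)
        if free_capacity(target) < size:
            raise RuntimeError("not enough spare capacity to drain safely")
        target["workloads"][workload] = source["workloads"].pop(workload)

drain("host-a")   # db and cache move to host-b / host-c
print(hosts)      # host-a is now empty and safe to replace
```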

Another benefit of a comprehensively virtual architecture is that it can offer capabilities beyond those of a top-down control structure, where the software merely simplifies the functions of subordinate hardware systems.

Instead, software-defined datacenters offer a dynamic feedback loop between the resource layers of the data center and the operating software. These layers interact through applied analytics, enabling automated controls and real-time IT management. To achieve this, however, the underlying hardware platform must be sufficiently intelligent to integrate with centralized software control.
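A toy version of that feedback loop, assuming the hardware exposes utilization telemetry the control software can read (the metric, set-point, and thresholds here are invented for illustration):

```python
import random

TARGET_UTIL = 0.65                     # illustrative set-point, not a standard

def read_telemetry() -> float:
    """Stand-in for the utilization metrics an intelligent hardware
    platform would expose to the control software."""
    return random.uniform(0.2, 0.95)   # pool utilization, 0..1

def control_step(nodes: int) -> int:
    """One loop iteration: analytics over telemetry drives an action."""
    util = read_telemetry()
    if util > TARGET_UTIL * 1.2:       # sustained pressure -> scale out
        nodes += 1
    elif util < TARGET_UTIL * 0.5:     # waste -> scale in
        nodes = max(1, nodes - 1)
    return nodes

nodes = 4
for _ in range(10):                    # in production this runs continuously
    nodes = control_step(nodes)
print(f"current allocation: {nodes} nodes")
```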

In 2014, the digital universe is growing by an astounding 1.7 megabytes a minute for every person on Earth. With explosive growth in user demand a practical inevitability over the next few years, the SDDC will provide a standards-based, converged infrastructure that offers new data center capabilities, greater efficiency, and improved flexibility to administrators of private and public cloud services.
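That per-person rate squares, to rough order, with the 44-zettabyte projection above; a quick check, assuming a 2014 world population of about 7.2 billion (my figure, not the article's):

```python
MB, ZB = 10**6, 10**21

people = 7.2e9                        # assumed 2014 world population
per_minute = 1.7 * MB                 # bytes per person per minute (article)
minutes_per_year = 365 * 24 * 60      # 525,600

universe_2014 = per_minute * people * minutes_per_year / ZB
print(f"2014 digital universe: ~{universe_2014:.1f} ZB")   # ~6.4 ZB

# Compound growth needed from there to reach 44 ZB by 2020:
growth = (44 / universe_2014) ** (1 / 6) - 1
print(f"implied growth: ~{growth:.0%} per year")           # ~38%
```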

To learn more about SDDC and the changes it’s driving in the datacenter, check out this LSI whitepaper.


 
So "one stop data shops" for hackers to go snooping through.
It isn't home computers russian hackers are getting 1 billion passwords from.
Now more of your data, on someone elses hardware with terms of service that keep them out of prosecution when they fail to protect it.
Surely the best option is a NAS box, encrypted.
And thats if people don't claim that the CPU's in them are all shonky with a gifted back door for the NSA and such.
USB's all flawed. Google and Yahoo encrypted email, which they will go through.
I don't know why people should put their faith in the cloud.
 
This is one thing my boss believes to be true: that we'll no longer purchase hardware and manage our own datacenters, and that we'll buy storage and CPU from someone else in the future. Hopefully someone trustworthy. And save some money in doing so. (Hopefully that means I'll still have a job in the future!)
 
For non-important personal garbage I don't have an issue using cloud-based services. I don't care if Uncle Sam sees any of that. For important documents and data I would never trust any cloud-based service. I have local RAID setups on PCs with no Wi-Fi and the LAN cards removed or disabled.
Professionally, we don't offer any cloud storage and stay away from it here at the hospital. We know of many local businesses that do the same.
We don't have an agenda against it, but for security reasons it's not a consideration at this point.
 
I can see a couple of IT positions becoming redundant in the event that this is adopted.
 
Per the NSA PRISM revelations, Microsoft and Google cloud services have backdoor access... Once your data is in the cloud, the vendor OWNS your data.
 
Just remove the porno traffic, and the world is saved again, on a single hard drive ;)
 