
FocusOn | HYPER-CONVERGED INFRASTRUCTURE
network configuration and security for workloads. This approach is particularly useful as the infrastructure grows and workloads are deployed or migrated across dozens of nodes and between data centers.
Given that HCI aggregates storage across multiple nodes into larger virtual pools, it’s often used as an alternative to monolithic storage arrays on a storage area network. Indeed, several of the leading HCI vendors got their start in storage by providing scale-out building blocks that could be joined into a virtual array for both primary storage (e.g., databases and shared filesystems) and backup/archiving.
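To make the pooling idea concrete, the toy sketch below (in Python) models a cluster whose nodes each contribute locally attached capacity to a single logical pool from which volumes are carved. The class and method names are invented for illustration and do not correspond to any vendor’s product or API; real HCI software layers replication, erasure coding and data services on top of this basic aggregation.

# Illustrative only: a toy model of how an HCI layer might pool node-local
# disks into one virtual capacity pool. Names are invented for this sketch.

from dataclasses import dataclass, field


@dataclass
class Node:
    """One HCI appliance node with locally attached storage (in terabytes)."""
    name: str
    local_tb: float
    allocated_tb: float = 0.0

    @property
    def free_tb(self) -> float:
        return self.local_tb - self.allocated_tb


@dataclass
class VirtualPool:
    """Aggregates every node's local capacity into a single logical pool."""
    nodes: list = field(default_factory=list)

    @property
    def total_tb(self) -> float:
        return sum(n.local_tb for n in self.nodes)

    @property
    def free_tb(self) -> float:
        return sum(n.free_tb for n in self.nodes)

    def provision_volume(self, size_tb: float) -> dict:
        """Spread a logical volume across nodes with free space (no replication here)."""
        if size_tb > self.free_tb:
            raise ValueError("pool exhausted; add a node to scale out")
        placement, remaining = {}, size_tb
        for node in sorted(self.nodes, key=lambda n: n.free_tb, reverse=True):
            chunk = min(node.free_tb, remaining)
            if chunk > 0:
                node.allocated_tb += chunk
                placement[node.name] = chunk
                remaining -= chunk
            if remaining == 0:
                break
        return placement


# Three modest nodes behave like one 30 TB array to the applications above them.
pool = VirtualPool([Node("node-a", 10.0), Node("node-b", 10.0), Node("node-c", 10.0)])
print(pool.total_tb)                 # 30.0
print(pool.provision_volume(12.0))   # volume spread across two nodes

Adding another node to the list is all it takes to grow the pool, which is the scale-out property described above.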
HCI catalysts and government IT challenges
The rise of hyper-converged systems results from the confluence of several large technology and business trends that span disciplines:
• Hardware. Decades of Moore’s law progress in processor performance and memory speed and capacity have rendered the average server overpowered for typical workloads, making the one-application, one-server design model obsolete. Furthermore, the inherent advantages of semiconductor memory and rapid improvements in nonvolatile flash density have allowed servers to far exceed the input/output performance of hard disks and approach price parity with high-performance disks for primary storage.
• Software. Server virtualization allows today’s overpowered hardware to be carved into workload-appropriate logical units. That decoupling of physical and logical resource sizing enables workload consolidation onto fewer systems while providing flexibility to more precisely match application requirements with resource capacity (a brief consolidation sketch appears at the end of this section).
More recently, storage virtualization software has worked in the opposite direction, enabling multiple previously discrete spindles and disk arrays to be aggregated into arbitrarily large pools that can be tapped for block, file or object storage.
• IT operations. Server virtualization and the demand for new applications (the latter fueled by the digitization of business processes and the explosion in the number of mobile clients) have led to a concomitant increase in system management complexity and overhead that’s unsustainable without system consolidation, simplification and automation.
The Federal Data Center Consolidation Initiative is a direct response to that trend. According to the Government Accountability Office, agencies had closed 3,125 data centers by the end of fiscal 2015 and plan to close another 2,078 by the end of fiscal 2019.
• Business environment. Lingering recessionary effects have intensified business competition and heightened management emphasis on efficiency and sustainable growth, which together have created a sustained period of tight budgets for noncore overhead activities such as IT. According to the IT Dashboard, the compound annual growth rate for federal IT spending over the past six years was just 1.3 percent.
Meanwhile, federal IT departments spend three times more on operations than on service development and modernization. Shifting some of that 68-plus percent share of the budget from routine maintenance to IT innovation has been an explicit strategy of the Obama administration, and a clear opportunity for technology-fueled automation and efficiency.
The IT Dashboard results also suggest that agencies are slightly more efficient than most enterprises: IT analysts peg the average IT organization’s spending on routine activities such as system management, patching/updating, monitoring and troubleshooting at 70 percent or more. According to a 2015 IDC report, much of that operational spending on government IT stems from large, older data centers that aren’t consolidated. Those centers often support inefficient, legacy, three-tier applications that are ill-suited to cloud deployment or redesign. The result is little available staff time and operating budget to respond to new service requests or support business innovation.
Private-sector organizations are increasingly turning to cloud services to support new applications, a practice quite familiar to federal IT managers working under the cloud-first mandate. Still, the cloud isn’t the cheapest option, particularly for static, predictable workloads; those that need a lot of storage; or those with stringent government rules and requirements for backup, archiving and data sharing. Hence the need for a more efficient and scalable architecture for internal data centers.
Collectively, the budget, operational and IT governance dynamics play right into HCI’s wheelhouse, making it a great fit for many agency workloads.
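To illustrate the consolidation point from the Software item above, here is a minimal first-fit packing sketch in Python. The host capacity and workload sizes are assumptions chosen for the example, not measurements from any agency environment, and the placement logic is far simpler than what a real hypervisor scheduler does.

# Illustrative only: a first-fit consolidation pass showing how virtualization
# lets right-sized workloads share a small number of overpowered hosts.
# The host and workload figures below are made up for the example.

from typing import List, Tuple

HOST_VCPUS, HOST_RAM_GB = 32, 256      # one modern two-socket server (assumed)

# (name, vcpus, ram_gb) requirements for a handful of typical workloads
workloads: List[Tuple[str, int, int]] = [
    ("web-frontend", 4, 16),
    ("app-server", 8, 32),
    ("database", 8, 64),
    ("file-services", 4, 32),
    ("monitoring", 2, 8),
    ("backup-proxy", 4, 16),
]

hosts: List[dict] = []                 # each entry tracks what is placed on one host

for name, vcpus, ram in sorted(workloads, key=lambda w: w[1], reverse=True):
    for host in hosts:
        if host["vcpus"] + vcpus <= HOST_VCPUS and host["ram"] + ram <= HOST_RAM_GB:
            host["vms"].append(name)
            host["vcpus"] += vcpus
            host["ram"] += ram
            break
    else:
        # no existing host has room, so bring another node into the cluster
        hosts.append({"vms": [name], "vcpus": vcpus, "ram": ram})

for i, host in enumerate(hosts, 1):
    print(f"host {i}: {host['vcpus']} vCPUs, {host['ram']} GB -> {host['vms']}")

In this made-up example, six workloads that might once have occupied six dedicated servers fit comfortably on a single 32-core host, which is the overprovisioning gap that virtualization and HCI exploit.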
HCI opportunities in agency IT
The HCI market is young and rapidly maturing, and vendors have approached it by stressing different strengths and targeting different workloads. Many HCI products emerged as scale-out storage alternatives to traditional network-attached storage arrays and have since evolved to allow hosting of general-purpose application VMs.
The most common and propitious HCI scenarios are:
• Bulk shared storage. By aggregating locally attached storage volumes, HCI serves as a scalable software-defined storage platform. Basic file and block storage are the