
CAPITALIZING ON THE CLOUD
MODERNIZE DATA ARCHIVES
Cloud-based tools hold the key to imposing structure on unstructured data.
SPONSORED CONTENT
TOM KENNEDY
VICE PRESIDENT AND GENERAL MANAGER, VERITAS PUBLIC SECTOR
AGENCIES HAVE a mandate to manage all permanent electronic records in electronic format by the end of 2019. Until only a few years ago, some agencies were still using the print-and-file method. Although most agencies have made significant progress since then, others are struggling to put a records management strategy in place. Fortunately, new cloud-based tools can help.
In the Data Genomics Index released last year, Veritas found that more than 70 percent of the data held by companies and agencies had no relevant business value, and 41 percent of the data was considered stale, meaning it hadn't been touched in three years. Such data is a prime candidate for secondary or tertiary storage. Furthermore, content-rich orphaned data, which doesn't have an owner, is often overlooked even as it takes up more than its share of disk space.
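The staleness test itself is simple enough to sketch. The snippet below is a minimal illustration, not any product's actual logic: it walks a file share and flags anything unmodified for three years as a candidate for cheaper storage. The share path and the three-year threshold are hypothetical stand-ins.

import os
import time

STALE_AFTER = 3 * 365 * 24 * 60 * 60  # three years, matching the index's definition of "stale"

def find_stale_files(root):
    """Yield (path, idle days) for files not touched in three years."""
    now = time.time()
    for dirpath, _dirs, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                last_touched = os.path.getmtime(path)  # last modification time
            except OSError:
                continue  # unreadable entry; skip rather than abort the scan
            idle = now - last_touched
            if idle > STALE_AFTER:
                yield path, int(idle // 86400)

if __name__ == "__main__":
    for path, days in find_stale_files("/srv/shared"):  # hypothetical file share
        print(f"{path}: idle {days} days -> candidate for secondary storage")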
Two macro trends are influencing those business decisions. The first is the phenomenal growth in unstructured data, which some experts say could be as high as 50 percent a year. Clearly, throwing infrastructure at the problem no longer works. Agencies must instead create a strategy for managing their data.
The other big trend is the emergence of cloud technology. Today, data sources might reside not only in multiple on-premises locations but also in private and public clouds. At the simplest level, a cloud can be viewed as just another data source, and the purpose of putting content into any archive or records management tool is to add structure to unstructured data so that content can be searched and classified.
Several emerging cloud tools can help agencies move forward. For instance, a cloud-based archiving tool is a simple way to incorporate auto-classification. The agency sets the retention policy, and the technology then classifies the documents or email messages in accordance with that policy. It adds a content-based classification process to the federal government's role-based Capstone approach.
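As a rough illustration of that content-based step, the sketch below matches document text against a hypothetical retention policy. The rules, keywords, and retention periods are invented for the example and stand in for whatever policy an agency actually sets alongside its Capstone designations.

from dataclasses import dataclass

@dataclass
class RetentionRule:
    label: str           # classification to apply
    keywords: tuple      # content triggers for this rule
    retain_years: int    # how long matching records are kept

# Hypothetical policy: first matching rule wins, last rule is the fallback.
POLICY = [
    RetentionRule("contract", ("statement of work", "contract no."), 7),
    RetentionRule("personnel", ("performance review", "personnel action"), 30),
    RetentionRule("general", (), 3),
]

def classify(text):
    """Return the first rule whose keywords appear in the document text."""
    lowered = text.lower()
    for rule in POLICY:
        if any(keyword in lowered for keyword in rule.keywords):
            return rule
    return POLICY[-1]  # no keyword hit: apply the fallback retention

rule = classify("Attached is the signed statement of work for FY17.")
print(f"classified as {rule.label}; retain {rule.retain_years} years")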
Other tools identify where data resides and provide statistics based on a multitude of classifications. The goal is to give agencies the ability to make good decisions and get rid of data that has no business value. This approach also reduces risk by eliminating unnecessary data, while shrinking an agency's data footprint and the infrastructure and costs associated with it.
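The reporting such tools provide can be pictured as a simple roll-up: group scanned files by classification and total their counts and sizes. The inventory below is made-up data, but it shows the shape of the decision the statistics support.

from collections import defaultdict

# (path, classification, size in bytes) -- invented stand-ins for a real scan
INVENTORY = [
    ("/share/q1_report.docx", "business", 240_000),
    ("/share/old_backup.zip", "stale", 4_800_000_000),
    ("/share/tmp/cache.bin", "no_value", 912_000_000),
    ("/home/departed_user/notes.txt", "orphaned", 52_000),
]

def summarize(inventory):
    """Roll up file count and total bytes per classification."""
    totals = defaultdict(lambda: [0, 0])  # classification -> [count, bytes]
    for _path, classification, size in inventory:
        totals[classification][0] += 1
        totals[classification][1] += size
    return totals

for classification, (count, size) in sorted(summarize(INVENTORY).items()):
    print(f"{classification:10} {count:5} files {size / 1e9:8.2f} GB")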
Once an agency has a strong data management strategy, it can streamline many cross-functional workflows. For instance, a large federal agency was struggling with its process for preserving information that might be needed for litigation. At one point it had more than 600 active legal holds, and multiple employees were managing the process in a large spreadsheet, with all the risks inherent in that approach. But because the agency had developed a good information management strategy, it was able to add an automation tool and transform that risky, manual process into a tight, automated workflow.
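What "automated" means here can be as basic as replacing the spreadsheet with a registry that is consulted before anything is deleted. The sketch below is a hypothetical, bare-bones version of that idea, not the agency's actual tool; the field names are invented.

from dataclasses import dataclass, field

@dataclass
class LegalHold:
    matter_id: str
    custodians: set = field(default_factory=set)
    active: bool = True

class HoldRegistry:
    """Replaces the shared spreadsheet: one place to record and check holds."""

    def __init__(self):
        self.holds = []

    def add(self, hold):
        self.holds.append(hold)

    def can_delete(self, custodian):
        """Block deletion if any active hold names this custodian."""
        return not any(h.active and custodian in h.custodians for h in self.holds)

registry = HoldRegistry()
registry.add(LegalHold("matter-0042", {"jdoe", "asmith"}))
print(registry.can_delete("jdoe"))    # False: preserved under an active hold
print(registry.can_delete("bwayne"))  # True: no active hold applies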
Other agencies have improved their e-discovery processes and reduced costs by using tools that intelligently cull relevant data. Likewise, a good classification engine can quickly search for and find the files needed to respond to Freedom of Information Act requests, while automated editing and redaction tools can help agencies make sure they're not sharing inappropriate material.
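Both steps, culling and redaction, reduce to the same pattern: filter first, then mask. The toy sketch below culls a document set down to those matching a request's terms and masks Social Security numbers before release; the documents, search terms, and regex are illustrative only.

import re

SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # one example of a sensitive pattern

def cull(documents, terms):
    """Keep only documents containing at least one responsive term."""
    return {
        name: text
        for name, text in documents.items()
        if any(term.lower() in text.lower() for term in terms)
    }

def redact(text):
    """Mask Social Security numbers before the document leaves the agency."""
    return SSN.sub("[REDACTED]", text)

docs = {
    "memo1.txt": "Grant award 2016-17; applicant SSN 123-45-6789.",
    "memo2.txt": "Cafeteria menu for the week.",
}
for name, text in cull(docs, ["grant award"]).items():
    print(name, "->", redact(text))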
A data strategy cuts across the entire agency, and a combination of cloud technology and automation can go a long way toward empowering people to make the important cross-functional decisions that improve workflows and keep costs under control.
Tom Kennedy is vice president and general manager of Veritas Public Sector.