GCN, August/September 2017

DATA ANALYTICS
MAKE SENSE OF THE DATA DELUGE
Modernizing infrastructure reveals a whole new level of value from gathered data.
SPONSORED CONTENT
CAMERON CHEHREH
CHIEF OPERATING OFFICER, CHIEF TECHNOLOGY OFFICER & VP, DELL EMC FEDERAL
AGENCIES ROUTINELY COLLECT data from a variety of sources, ending up with a rich, often unmined repository that could help them make faster, more informed decisions. Data is generated by public safety organizations, social services, financial transactions, web logs from government services, and a growing number of sensors. The potential of this data to help agencies save money and improve citizen services is almost limitless, as long as agencies collect, store, protect, and analyze the data efficiently.
Take the simple example of street light performance. By analyzing sensor data from street lights along with traffic usage patterns, agencies can better manage electricity at the state and local level. Also consider 911 services: by analyzing logs from police and ambulance vehicles, agencies can better determine how many police vehicles should be assigned to specific city sectors.
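As a rough illustration of the street light example, the sketch below (in Python, using pandas) joins hypothetical street light sensor readings with traffic counts and flags lights that could be dimmed during low-traffic hours. The file names, column names, and threshold are assumptions for illustration, not an actual agency schema.

    # Hypothetical sketch: flag street lights that could be dimmed during low-traffic hours.
    # File names, columns, and the threshold are illustrative assumptions.
    import pandas as pd

    # Assumed inputs: hourly sensor readings per light and hourly traffic counts per sector.
    lights = pd.read_csv("streetlight_readings.csv")   # columns: light_id, sector, hour, kwh
    traffic = pd.read_csv("traffic_counts.csv")        # columns: sector, hour, vehicle_count

    # Join light usage to traffic volume for the same sector and hour.
    merged = lights.merge(traffic, on=["sector", "hour"], how="inner")

    # Flag hours where a light draws power but traffic is below a chosen threshold.
    LOW_TRAFFIC = 25  # vehicles per hour; a tunable assumption
    merged["dim_candidate"] = (merged["kwh"] > 0) & (merged["vehicle_count"] < LOW_TRAFFIC)

    # Summarize potential savings per sector for planners to review.
    summary = (merged[merged["dim_candidate"]]
               .groupby("sector")["kwh"]
               .sum()
               .sort_values(ascending=False))
    print(summary.head(10))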
Three Steps to Get the Most Value From Your Data
The potential is there, but many agencies rely on older, unconnected data repositories, which makes comprehensive and effective analytics difficult. Add fast-growing stores of both structured and unstructured data on top of traditional infrastructure, and the problem becomes easy to see.
1. Focus on the Outcome
To get the most value out of your data, focus on the desired mission or strategic outcome rather than a technology-based outcome. If you skip this step, you risk becoming misaligned with your mission. Revisit that mission statement frequently; it's not uncommon for missions to change over time as budgets and priorities shift. With your mission in mind, you are better prepared to select the most relevant data.
2. Modernize Your Infrastructure
Next, rethink and modernize your infrastructure, starting at the hardware level. The goal is to grow and scale while reducing your footprint, power, and cooling requirements, and simplifying your data architecture. Two good strategies for achieving these goals are moving to a converged or a hyper-converged architecture. Both consolidate the infrastructure into building-block-style units you can add as needed.
With this type of infrastructure, agencies can implement a data lake, which consolidates previously siloed repositories of structured and unstructured data into a scalable, cloud-ready, software-defined store. A data lake allows analysts to use modern data analytics tools to examine all relevant data. This incremental, modular approach also helps agencies scale out and up as needs change.
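As a minimal sketch of the consolidation idea, the Python below lands data from two hypothetical silos (a CSV export and a JSON feed) in a single columnar location that analytics tools can query. The paths, formats, and schemas are assumptions; a production data lake would typically sit on scalable object storage with a catalog and access controls in front of it.

    # Minimal data lake consolidation sketch. Paths and schemas are illustrative assumptions.
    # Requires pandas plus pyarrow for Parquet output.
    import pandas as pd

    LAKE_ROOT = "datalake"  # in practice this would be scalable, cloud-ready object storage

    # Silo 1: structured CSV export from a legacy system.
    permits = pd.read_csv("silo1/permits_export.csv")
    permits.to_parquet(f"{LAKE_ROOT}/permits.parquet", index=False)

    # Silo 2: semi-structured JSON records from a web service log.
    requests_log = pd.read_json("silo2/service_requests.json", lines=True)
    requests_log.to_parquet(f"{LAKE_ROOT}/service_requests.parquet", index=False)

    # Once landed in a common, columnar format, analysts can query across former silos.
    combined = pd.read_parquet(f"{LAKE_ROOT}/service_requests.parquet")
    print(combined.dtypes)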
3. Evaluate Your Application Layer
While the application layer may not seem relevant to data storage and analysis, it is critical: data is accessed through this layer, and in a way that is how the value of the data is realized. Older applications may use out-of-date processes or languages, or may no longer be mission-critical. Over time, it makes sense to modernize applications using Agile and DevOps strategies, creating lightweight, flexible, easy-to-use, cloud-native applications. Modernizing applications this way also means they can more easily communicate with data lakes and big data analytics solutions.
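To make the point about modernized applications concrete, here is a hedged sketch of a lightweight service that exposes analysis results from the data lake over a simple HTTP endpoint. The framework choice (Python with Flask), endpoint name, file path, and fields are all hypothetical, not a prescribed Dell EMC pattern.

    # Hypothetical lightweight service exposing data lake contents over HTTP.
    # Framework choice (Flask), paths, and fields are illustrative assumptions.
    import pandas as pd
    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/sectors/<sector>/dim-candidates")
    def dim_candidates(sector):
        # Read the analysis output written to the data lake (see the earlier sketch).
        df = pd.read_parquet("datalake/dim_candidates.parquet")
        rows = df[df["sector"] == sector].to_dict(orient="records")
        return jsonify(rows)

    if __name__ == "__main__":
        # A container platform would normally run this behind a production WSGI server.
        app.run(host="0.0.0.0", port=8080)

Because a service like this is small and stateless, it can be containerized and redeployed frequently, which is what makes the Agile and DevOps approach described above practical.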
When transforming your infrastructure or upgrading your applications, it’s important to
keep your eye on the prize. Don’t transform for transformation’s sake. Do it to make more intelligent decisions, improve citizen services, and reduce costs. And always do it with your mission in mind.
Cameron Chehreh is chief operating officer and chief technology officer at Dell EMC Federal.