
ADVANCED ANALYTICS
SPONSORED CONTENT
WRANGLING BIG DATA
In-memory computing offers agencies an efficient way to create value from data.
LAURA GRANT
HEAD OF DIGITAL INNOVATION,
PUBLIC SECTOR, SAP
ALL GOVERNMENT AGENCIES are seeing data volumes explode. Agencies are managing data from emails, transactional systems, videos, social media, sensors, and other sources that are creating treasure troves of information about customers, citizens, products, and services. Yet less than 0.5 percent of all this data is ever analyzed. If organizations leveraged the full power of big data, they could decrease their operating costs by as much as 30 percent.
Why aren't agencies leveraging these opportunities? They are trying, but as data volumes grow, data environments become increasingly complex. Agencies struggle to get the most out of their data as IT budgets shrink. How can government organizations efficiently bring disparate data sources together and analyze them to make decisions that positively affect the mission? These challenges require a new way of thinking.
In the past, analytical data was neatly structured and kept in relational databases and data warehouses. As data demands increased, organizations simply added hardware and built custom applications. These approaches, however, took longer and cost more than intended. Workarounds, such as complex spreadsheets that are unmanageable at scale, have proliferated across departments.
Agencies are dealing with this data explosion while also being expected to provide proactive, continuous intelligence and automated responses that are integrated into their operational model on any device. As a result, IT departments must think differently about their approach to garnering insight from data.
Instead of traditional data warehouses, organizations should adopt an agile platform approach that leverages the latest in-memory technology and can support geospatial, predictive, text, and sensor data.
In traditional analytical environments, because of past technology limitations, data scientists pulled a subset of data out of a database onto their local machines, built models, and then moved those models to a production environment to run them against more data. That is no longer necessary. Agencies can now keep data, modeling, and analysis within the same environment, which is far more efficient, and they can reduce or eliminate nightly, weekly, and monthly data extractions and batch jobs.
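To make the contrast concrete, here is a minimal sketch in Python of the difference between extracting raw records to a local machine and pushing the computation into the database itself. It uses Python's standard sqlite3 module as a stand-in for an in-memory analytics platform; the table, columns, and sample values are hypothetical placeholders, not data from any real agency system.

import sqlite3

# Stand-in for an agency data store; ":memory:" keeps the sketch self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (region TEXT, amount REAL, flagged INTEGER)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [("north", 120.0, 0), ("north", 980.0, 1), ("south", 310.0, 0)],
)

# Old pattern: pull every row to the analyst's machine, then summarize locally.
rows = conn.execute("SELECT region, amount, flagged FROM claims").fetchall()
local_average = sum(r[1] for r in rows) / len(rows)

# In-database pattern: let the engine do the aggregation and move only the results.
summary = conn.execute(
    "SELECT region, COUNT(*), AVG(amount), SUM(flagged) * 1.0 / COUNT(*) "
    "FROM claims GROUP BY region"
).fetchall()
for region, count, avg_amount, flag_rate in summary:
    print(region, count, round(avg_amount, 2), round(flag_rate, 2))

The same shape scales up: the query runs where the data lives, and only a small answer set travels to the analyst or the dashboard.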
By combining data analytics and in-memory computing, agencies are achieving impressive results. One organization is crunching a century's worth of weather data to provide more accurate climate information for quantitative, analysis-driven decision making. Another has reduced crime rates in targeted areas by 55 percent. And several federal agencies have improved their financial visibility and audit preparedness by reducing the time and effort required for data acquisition and validation.
The impact of analyzing data at the lowest levels can be seen in a state organization that was struggling to find the root cause of its high infant mortality rates in data residing in more than 800 individual systems. After a quick implementation, it has been able to analyze agency information in a factual, real-time, collaborative environment. This type of granular analysis allows agencies to tailor solutions to the people who need help most.
Organizations are starting to come to
grips with the growing complexity of their data environment and the limitations of
old technology. It's time to embrace an agile platform to store, access, and, most importantly, analyze data. After all, what good is data if you can't use it?
Laura Grant is the Head of Digital Innovation for Public Sector at SAP.