Page 20 - GCN, August/September 2017
DATA ANALYTICS
THE VALUE IS IN THE DETAILS
When agencies layer and share data at a granular level, they can uncover true insights.
SPONSORED CONTENT
LAURA GRANT
HEAD OF DIGITAL INNOVATION, PUBLIC SECTOR SAP
IN THE PAST, running queries and analysis at the most granular level was cumbersome and difficult. Agencies had to break data into chunks for processing, and even then, systems would sometimes lock up because the
queries were so large. To query large datasets, agencies therefore created summarized, aggregate copies of the data. Some of that data was drillable, but not always. Those extra steps took time and were costly.
That’s simply not good enough anymore. Decisions need to be made rapidly, and agencies need access to detailed data in the moment. They need to be able to find the nuggets of information that will drive program and policy decisions. Innovations such as predictive analytics and machine learning require granular-level data. Fortunately, technological advances in in-memory computing, cloud computing and related capabilities have made it affordable for agencies to consume data without the time-consuming traditional process of aggregation.
The More Data, the Better
Interesting discoveries and insights are uncovered when agencies can process and visualize detailed information at the lowest level and even add other layers of data. That’s when we can start to solve problems, update policies and programs, and make decisions in real time based on trusted data.
That’s also when agencies can start to move into automation and use machine learning to improve processes and outcomes. For example, machine learning can help agencies match
job candidates to requirements without the potential for unconscious and unintentional bias. This same technology can also remove the human error that inevitably occurs during invoice matching.
When we start merging Internet of Things sensor data with agencies’ transactional and historical data, we can predict and get ahead of
maintenance and inventory needs. In the very near future, artificial intelligence will drive digital assistants so employees can talk to a system at work the way many of us talk to Alexa or Siri at home.
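As an illustrative sketch (not from the article), layering granular IoT sensor readings on top of historical baselines to get ahead of maintenance needs might look like the following. The asset IDs, readings and the threshold rule are hypothetical assumptions for the example.

```python
# Hypothetical sketch: layer granular IoT sensor readings onto
# historical baselines to flag assets that may need maintenance
# before they fail. All values here are illustrative, not agency data.

from statistics import mean

# Granular sensor data: (asset_id, vibration reading)
sensor_readings = [
    ("pump-1", 0.42), ("pump-1", 0.47), ("pump-1", 0.61),
    ("pump-2", 0.18), ("pump-2", 0.21),
]

# Historical baseline vibration per asset, from past records
baseline = {"pump-1": 0.30, "pump-2": 0.20}

def flag_for_maintenance(readings, baseline, factor=1.5):
    """Flag assets whose average reading exceeds factor x their baseline."""
    by_asset = {}
    for asset, value in readings:
        by_asset.setdefault(asset, []).append(value)
    return sorted(
        asset for asset, values in by_asset.items()
        if mean(values) > factor * baseline[asset]
    )

print(flag_for_maintenance(sensor_readings, baseline))  # ['pump-1']
```

The point of the sketch is the layering: neither dataset alone reveals that pump-1 is drifting away from its historical norm, but joining the two makes the anomaly visible at the individual-asset level.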
However, those improvements only happen when the technology has access to all of the data at the granular level.
Need for Trust and Collaboration
The need to interface with older, legacy systems is keeping some agencies from fully embracing these innovations. Agencies are comfortable with the way those older systems operate, but those systems often make it difficult to translate and share data at granular levels.
As agencies start to retire their legacy systems, they will have the opportunity to consolidate and re-imagine business processes. Can the process be simplified? Could things be done a different way? Were those processes built around a policy that is no longer relevant?
Simplifying processes and consolidating systems will give agencies more transparent data. And as they shut down those legacy systems, they will free up money for innovation and create an architecture that can support it.
Another big challenge is data trust. Government employees often don’t want
to share data with other agencies, or sometimes even within their own agency, unless they are mandated to do so, because they lack trust. However, we are starting to see more collaboration across agencies, and those agencies are seeing the value in terms of outcomes and impact. That success is triggering more data sharing.
As agencies create trusted data sources and continue to collaborate, data analytics will have a far more significant impact.
Laura Grant is head of digital innovation for the public sector at SAP.