GCN, August/September 2017
SPONSORED CONTENT
DATA ANALYTICS
MODERNIZE THE DATABASE
Back-end technology is becoming more flexible to keep pace with ever-expanding data and the need to unlock new insights in real time.

MAT KEEP
DIRECTOR OF PRODUCT AND MARKET ANALYSIS, MONGODB

To accommodate the volume and variety of data government agencies are collecting every day, they need to modernize their data infrastructures. Fortunately, new technologies are making it easier for agencies to take data management and analytics to the next level.
Many agencies are hampered by legacy data infrastructures that won't scale, so if they want to add new users or support more data, they have to adapt applications and buy larger, more expensive hardware. Now we're moving toward a new generation of non-relational databases that can automatically distribute data across many servers, whether those are running in a government data center or the public cloud. That approach gives agencies almost infinite scalability using low-cost commodity systems.
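To make that distribution concrete, here is a brief sketch, in Python, of how an application might ask a MongoDB cluster to spread a collection across whatever servers it has. The cluster address, database, and shard key shown are hypothetical, not drawn from any agency deployment.

```python
# Illustrative only: enable sharding so documents are distributed
# across many commodity servers. All names below are hypothetical.
from pymongo import MongoClient

# Connect to the cluster's query router (mongos).
client = MongoClient("mongodb://mongos.example.gov:27017")

# Allow the "cityops" database to be spread across shards.
client.admin.command("enableSharding", "cityops")

# Hash the sensor_id field so incoming records are balanced evenly
# across whatever servers the cluster currently has.
client.admin.command(
    "shardCollection",
    "cityops.readings",
    key={"sensor_id": "hashed"},
)

# Applications keep writing to one logical collection; the cluster
# decides which server actually stores each document.
client.cityops.readings.insert_one({"sensor_id": 42, "level": 0.7})
```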
Being able to spread data across numerous servers also helps achieve performance gains in the face of rapidly growing data volumes. And because the databases are distributed, they can automatically create copies of the data to maintain service continuity in the event of a failure. MongoDB and other modern databases have been designed with distributed architectures and cloud computing in mind, which means high availability and low-cost scalability are built in.
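A minimal sketch of what that built-in continuity looks like from the application side, assuming a hypothetical three-member replica set; the host names and settings are illustrative.

```python
# Illustrative only: connect to a hypothetical three-member replica
# set. The driver discovers all members and fails over automatically
# if the primary server is lost.
from pymongo import MongoClient

client = MongoClient(
    "mongodb://db1.example.gov,db2.example.gov,db3.example.gov"
    "/?replicaSet=rs0",
    w="majority",       # wait until a majority of copies have the write
    retryWrites=True,   # retry once if a failover interrupts a write
)

# Reads and writes go to one logical database; copying the data to
# the other members happens behind the scenes.
client.permits.applications.insert_one({"permit_id": "A-1001", "status": "filed"})
```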
Faster, Better Application Development
Another driver for modernizing data infrastructures is improving how quickly organizations can deploy new applications. Microservices, containers, and DevOps are helping organizations build more modular applications so multiple teams can work on components in parallel and innovate more quickly.
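As a rough illustration of that modularity, the sketch below shows one small, self-contained service that a single team could own, containerize, and deploy on its own schedule. The framework, service name, and data model are illustrative choices, not anything prescribed by the article.

```python
# Illustrative microservice: one small team can build, test, and
# deploy this component independently of other services.
from flask import Flask, jsonify
from pymongo import MongoClient

app = Flask("permit-service")
db = MongoClient("mongodb://db.example.gov:27017").permits

@app.route("/permits/<permit_id>")
def get_permit(permit_id):
    # Each service talks only to the data it owns.
    doc = db.applications.find_one({"permit_id": permit_id}, {"_id": 0})
    return jsonify(doc or {"error": "not found"})

if __name__ == "__main__":
    app.run(port=8080)
```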
Some government agencies are also turning to open-source databases because they offer freedom from vendor lock-in and lower total cost of ownership (TCO). On the other hand, those benefits can also make projects difficult to justify. And developers gravitate to open-source technology because it lets them optimize applications by examining the underlying code and exert greater control over developing new features.
Anyone considering an open-source database should ensure a viable vendor or community stands behind the technology. Only a minority of the hundreds of open-source databases are running in serious production deployments, and only a small portion of those have an active community of users exchanging best practices or advancing the system. In other words, make sure you're choosing a well-funded, well-supported platform with a roadmap for the future.

Help Users Glean Insights

Officials in Chicago have built an intelligent operations platform called WindyGrid on top of a range of open-source technologies. It gathers data from the city's 15 most crucial departments, including police, transportation, and fire, and brings that data together into a map view. When officials started consolidating all that data, seven million new pieces every day, and visualizing it on a geospatial map, they found some unexpected connections.

For example, reports of failed garbage collection correlated with an increase in pest complaints. Now officials can more proactively manage city refuse collections. They have also found a number of opportunities to improve their response to health and weather emergencies, traffic congestion, and crime.

The resulting system makes it easier for non-technical users to glean insights from all that data. Chicago is using big data analytics to be proactive and smarter, and its users have no knowledge of the underlying database technology. Instead, the information is presented in a way that lets them concentrate on their business and mission without the need to become technology experts.
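The article does not describe WindyGrid's internals beyond its use of open-source technologies, but the sketch below illustrates the kind of geospatial lookup such a map view performs, using MongoDB from Python with hypothetical collection and field names.

```python
# Illustrative only: the kind of geospatial lookup a map view such as
# WindyGrid performs. Collection and field names are hypothetical.
from pymongo import MongoClient

events = MongoClient("mongodb://db.example.gov:27017").cityops.events

# Index event locations so map queries stay fast as millions of new
# records arrive each day.
events.create_index([("location", "2dsphere")])

# Find complaints within roughly 500 meters of a point on the map,
# e.g., to compare missed trash pickups with pest reports.
nearby = events.find({
    "type": {"$in": ["missed_pickup", "pest_complaint"]},
    "location": {
        "$near": {
            "$geometry": {"type": "Point", "coordinates": [-87.6298, 41.8781]},
            "$maxDistance": 500,  # meters
        }
    },
})
for doc in nearby:
    print(doc["type"], doc["location"]["coordinates"])
```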
Mat Keep is director of product and market analysis at MongoDB.