December 6, 2016

Federal agencies produce and collect a staggering amount of data that, when analyzed and put to work, can have a significant impact on public policy and on an agency’s ability to meet its mission. Most agencies are just starting their data-driven journey, and figuring out how to take stored data out of silos and put it to work can be overwhelming.

David Vennergrund, Director of CSRA’s Data and Analytics solution, sat down with Jenna Sindle, Managing Editor of Federal Technology Insider, to talk about how federal government agencies can analyze data and use the results to inform public policy. Today, with the right tools and the personnel to drive expert analysis, data can help federal agencies measure and project need, anticipate change, and highlight trends, all of which helps create effective public policy.

In this conversation, Vennergrund shares his insights on the origins of the data-driven revolution, best practices for moving agencies out of the data-storage business and into data application, and some of the policy initiatives and mission successes that agencies have achieved in the era of big data.

Keep reading below to find out more about how disruptive technologies are turning data into a goldmine for federal agencies.

Jenna Sindle (JS): David, we’re interested in learning how data is changing the ways agencies meet their missions, from how policy is informed to how agencies operate more efficiently and effectively in this time of tight budgets. Before we dive into the specifics, can you tell us about the origins of this data-driven revolution in the public sector?

David Vennergrund (DV): At its foundation, the data-driven revolution in government agencies grew out of an over-abundance of data and the need for more powerful tools to leverage that data and put it to work on real-world problems. Not so many years ago we could leverage only a limited amount of the data we generated. We collected structured data, stored it in databases and elaborate data warehouses, and analyzed a tiny portion of it, at great expense in time and money. We analyzed even less of the semi-structured data that came from texts, emails, logs, and other sources. All of this unexamined data had value, but it was not leveraged. Now, thanks to inexpensive data storage and distributed processing frameworks, we are able to leverage this ‘dark data’ as well, which makes the world a much more interesting place.
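To make the ‘dark data’ point concrete, here is a minimal sketch of the kind of distributed processing Vennergrund describes, using Apache Spark as one example framework. The file path, log format, and field names are hypothetical and chosen purely for illustration; this is not CSRA’s or any agency’s actual pipeline.

```python
# Minimal PySpark sketch: pulling structure out of semi-structured log text.
# The path and log format below are hypothetical, for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql.functions import regexp_extract, to_timestamp

spark = SparkSession.builder.appName("dark-data-sketch").getOrCreate()

# Read raw logs as plain text: one row per line, no schema imposed yet.
logs = spark.read.text("hdfs:///data/app_logs/*.log")

# Use regular expressions to lift timestamp, severity, and message out of
# each line, turning 'dark' unstructured text into queryable columns.
parsed = logs.select(
    to_timestamp(
        regexp_extract("value", r"^(\S+ \S+)", 1), "yyyy-MM-dd HH:mm:ss"
    ).alias("ts"),
    regexp_extract("value", r"\b(INFO|WARN|ERROR)\b", 1).alias("level"),
    regexp_extract("value", r"\b(?:INFO|WARN|ERROR)\b\s+(.*)$", 1).alias("msg"),
)

# Once structured, ordinary aggregation applies: e.g., errors per day.
parsed.filter(parsed.level == "ERROR") \
      .groupBy(parsed.ts.cast("date").alias("day")) \
      .count() \
      .show()
```

The workflow, not the particular regexes, is the point: store raw text cheaply, impose structure late, then aggregate at scale, which is what makes previously unexamined data usable.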

JS: How are federal agencies using data to meet the mission?

DV: There are hundreds of data science projects across federal agencies, in production and under development. I will share a few examples we have contributed to: National Institutes of Health (NIH) genomic data analytics is leading to precision medicine; the Centers for Medicare and Medicaid Services (CMS) uses data to improve healthcare delivery and to reduce fraud, waste, and abuse; the Federal Aviation Administration (FAA) integrates unstructured data, including weather information, flight data, and migratory bird patterns, to improve flight safety; and the Department of Veterans Affairs uses data analytics to ensure that veterans receive their benefits in a timely manner.
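As a purely illustrative aside, the fraud, waste, and abuse work Vennergrund mentions is often, at its core, an anomaly detection problem over claims records. The sketch below runs on synthetic data with invented field meanings and shows the general shape of one such approach using scikit-learn; it is not CMS’s actual method.

```python
# Illustrative only: a toy anomaly detector for claims-like records.
# Field meanings, values, and thresholds are hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Synthetic claims: [billed_amount, num_procedures]. Most are routine;
# a handful are inflated to stand in for suspicious claims.
routine = rng.normal(loc=[200.0, 2.0], scale=[50.0, 1.0], size=(1000, 2))
inflated = rng.normal(loc=[5000.0, 15.0], scale=[500.0, 3.0], size=(10, 2))
claims = np.vstack([routine, inflated])

# IsolationForest scores points by how easily they can be isolated;
# rare, extreme claims come back labeled -1 (outlier).
model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(claims)

flagged = claims[labels == -1]
print(f"Flagged {len(flagged)} of {len(claims)} claims for analyst review")
```

In practice a score like this would only prioritize claims for human review, not decide outcomes, which matches the mission-support framing of the examples above.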

Want to read more? You can read the full article on Federal Technology Insider here.