
October 30, 2017

The Medicare Advantage improper payment rate was 10 percent in 2016, according to testimony by the Government Accountability Office’s Healthcare Director James Cosgrove to the House Ways and Means Oversight Subcommittee. That’s $16.2 billion lost to improper payments to private Medicare Advantage health plans and to billing errors by insurers.

This news comes at a time when the Administration is focused on identifying ways to reduce waste, inefficiencies, and redundancies—including cutting or eliminating entire programs.  In other words, the focus is on running the federal government more like a business.

Big data and analytics can and must factor heavily into this new reality—the quest for a higher level of public-sector operational efficiency. The Medicare Advantage program is a high-profile example of where real-time insight—supported by next-generation big data, analytics, and even machine learning technologies—could prevent the improper expenditure of billions of taxpayer dollars.

Today, analytics are being put to use in both innovative and unexpected ways. For example, the Department of Justice (DoJ) is leveraging analytics to help optimize the use of federal grants that support public safety and law enforcement programs. DoJ currently has more than 100 project grants and cooperative agreements with state, tribal, and local agencies. These grants are wide-ranging and include support for dangerous-drug analysis, sexual assault victim services, domestic violence prevention and victim support, community violence analysis and prevention, and more.

Recipients of these grants and agreements apply to DoJ for assistance but administer the programs locally. DoJ requires regular reports on spending and an accounting of the outcomes achieved. However, not all grants achieve meaningful outcomes, and some recipients fall out of compliance with DoJ rules. DoJ has historical data on which grants remained compliant and which did not.

By applying predictive analytics techniques to this data, DoJ built a model that forecasts the risk of non-compliance for each program and can use that risk measure to help prevent compliance lapses before they occur. The result is vital insight that allows DoJ to make smarter decisions about funding, operating, and modifying these important programs.
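By way of illustration only, here is a minimal sketch of how such a compliance-risk model might be assembled with standard open-source tooling (Python and scikit-learn in this case). The file names, feature columns, and model choice are hypothetical assumptions; the source does not describe DoJ’s actual data or methodology.

```python
# Hypothetical sketch: scoring grant programs for non-compliance risk.
# File names, feature columns, and the model choice are illustrative only.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Historical awards with a known compliance outcome (hypothetical extract).
grants = pd.read_csv("historical_grants.csv")
features = ["award_amount", "days_late_reporting", "prior_findings",
            "recipient_tenure_years"]
X, y = grants[features], grants["became_noncompliant"]  # 1 = lapsed, 0 = stayed compliant

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=42)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)
print("Holdout AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Score active grants so program staff can prioritize outreach and monitoring.
active = pd.read_csv("active_grants.csv")  # hypothetical extract
active["noncompliance_risk"] = model.predict_proba(active[features])[:, 1]
print(active.sort_values("noncompliance_risk", ascending=False).head())
```

In a setup like this, the risk scores would typically feed grant managers’ monitoring workflows rather than drive funding decisions automatically.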

Big data and analytics can also play an important role in optimizing the federal government’s most important asset—its personnel. The possibilities are endless, spanning use cases as diverse as deploying and training military personnel, staging FEMA personnel and resources ahead of an approaching hurricane, and identifying which cybersecurity talent requirements might be most acute in three years and where those personnel might best be deployed.

Analytics can also be used to identify emerging training requirements and mine talent pools to improve resource deployment and development. Maximizing resource utilization and potential is fundamental to the premise of applying business principles to the federal government, and analytics illuminate the best path forward.

Analytics and efficiency intersect on other levels as well. The federal government has vast expanses of data at its disposal. Historically, however, this data has been siloed, which has limited its use. Data was often duplicated and gathered multiple times, slowing processes and multiplying costs. As the federal government commits to putting this information to work (gathering it once and using it often), it stands to unlock unprecedented opportunities for greater efficiency.

For example, the U.S. Environmental Protection Agency (EPA) is working to boost emissions compliance and engine manufacturer fraud detection using data sources already available through the EPA’s Central Data Exchange (CDX), automated laboratory data validation tools from other EPA programs, and new analytics. EPA is also focusing on reducing sampling costs and increasing the scope of data available for hazardous waste site assessments using multiple data sources and the “data lake” approach.
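As a purely illustrative example, the sketch below shows one way an unsupervised anomaly detector could screen emissions test submissions and route outliers to an analyst. The column names and contamination rate are assumptions; they are not the CDX schema or EPA’s actual method.

```python
# Hypothetical sketch: flagging unusual engine emissions test submissions.
# Column names and the contamination rate are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import IsolationForest

# Hypothetical extract pulled from a shared data lake.
tests = pd.read_csv("engine_emissions_tests.csv")
numeric_cols = ["nox_g_per_kwh", "co_g_per_kwh", "pm_g_per_kwh", "test_duration_s"]

detector = IsolationForest(contamination=0.01, random_state=0)
tests["flag"] = detector.fit_predict(tests[numeric_cols])  # -1 marks outliers

# Route flagged submissions to manual review instead of auto-accepting them.
flagged = tests[tests["flag"] == -1]
print(f"{len(flagged)} of {len(tests)} submissions flagged for analyst review")
```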

The key to improving government efficiency and effectiveness is focusing efforts on operationalizing big data and analytics. As with any discipline, best practices can help to speed the journey and ensure success. Consider the following: 

  • Be sure the analytics initiative is designed to meet the business need.  Is the initiative structured and engineered to deliver the insight that is needed to solve a particular business problem? 
  • Don’t underestimate the complexity of data.  Organizations undertaking analytics initiatives often misjudge the amount of time and effort needed to prepare data at the start of a project.  Good data is the single most important factor in the success of an analytics initiative.  A rule of thumb we share with clients is to expect to spend roughly 80 percent of the time preparing data and 20 percent modeling it (see the sketch after this list).  CSRA is helping agencies accelerate this process with advanced data prep strategies and tools that we’ve developed and refined through years of experience.
  • Match the tool to the data.  Not every analytics tool fits the bill for every project.  One area that is particularly challenging is selecting and fine-tuning the algorithms that underpin the models.  Our data scientists are immersed daily in the world of algorithms, continually comparing and evaluating nuances that can make or break the success of an analytics program.
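To put the data preparation point in concrete terms, here is a minimal sketch of the routine cleanup that typically consumes most of that 80 percent. The source file, column names, and cleaning rules are hypothetical and purely illustrative.

```python
# Hypothetical sketch of routine data preparation steps with pandas.
# The source file, column names, and rules are illustrative only.
import pandas as pd

raw = pd.read_csv("agency_records.csv")  # hypothetical source extract

# 1. Drop exact duplicates introduced by repeated collection.
clean = raw.drop_duplicates()

# 2. Normalize inconsistent text values before joining or grouping.
clean["state"] = clean["state"].str.strip().str.upper()

# 3. Coerce types, turning unparseable values into missing values.
clean["report_date"] = pd.to_datetime(clean["report_date"], errors="coerce")
clean["amount"] = pd.to_numeric(clean["amount"], errors="coerce")

# 4. Decide explicitly how to handle missing values rather than letting models guess.
clean = clean.dropna(subset=["report_date"])
clean["amount"] = clean["amount"].fillna(clean["amount"].median())

print(f"{len(raw)} raw rows -> {len(clean)} analysis-ready rows")
```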

To learn more about CSRA’s Data & Analytics expertise, visit csra.com/data-analytics