Studies predict there will be 40 zettabytes of data by 2020. In case you are wondering, that is a lot of data. As the volume of data grows, data analytics will become increasingly important in shaping the future of federal agency policies and services. Whether it's the Defense Department using cyberattack data to make its networks more secure or the Health and Human Services Department merging clinical, genomic and other data to improve health outcomes, big data will drive the mission.
To make these and other far-reaching agency initiatives possible, federal agencies need better analytics. Collecting data means little if analytics are not in place, or if the existing tools cannot extract actionable insights. Static reports, Excel spreadsheets and lightweight visualization tools no longer suffice when agencies are collecting billions of data records from multiple sources.
Given the importance of robust data analytics solutions, data analytics company Qlik commissioned a PulsePoll™ from Market Connections to assess how federal agencies are using analytics, and to what extent. Key findings include:
- The majority of respondents (70 percent) believe that as the amount of available data grows, their limited ability to analyze it is the primary barrier to progress.
- Only 15 percent of respondents have the appropriate tools in place to analyze all of their data.
- Nearly two-thirds of respondents (65 percent) do not have a data analytics tool in place to consolidate data from multiple sources into one interface — a problem when almost all (97 percent) of agencies report that they must compile data from more than one data source.
To help agencies and the contractors who serve them, Qlik built a custom Qlik Sense web app for exploring the full survey results. Check it out at http://federalsurvey.qlik.com/survey/index.html.