How agencies can start putting government’s treasure trove of data to work


This content is provided by Alteryx.

The federal government has more data at its fingertips than almost any other organization in the world. The problem agencies tend to run into, however, is what to do with it. While some agencies have already begun leveraging artificial intelligence and machine learning to perform analytics and begin putting that data to use, many others are still exploring use cases and only just becoming aware of the possibilities.

Two major challenges keep federal agencies from taking advantage of this data. The first is lack of expertise in data science. Much like cybersecurity, data science expertise is highly valued in both the public and the private sectors, and federal agencies have a hard time competing with industry for this talent. That’s why it’s frequently the subject of upskilling initiatives across government, and why agencies are often looking for low- or no-code solutions for data analytics, AI and ML. By democratizing data and upskilling analytic capability in this way, agencies are looking to foster data skills across their workforces and build a culture of evidence-based decision making.

The second challenge agencies face is insufficient tools and resources to properly analyze data at the kind of scale they need. Agencies and analytics teams that still rely on spreadsheets and manual processes simply can’t keep up with the massive amounts of data flowing into agencies or the demands for insights derived from that data.

That’s where agencies can turn to Analytic Process Automation (APA) to help pick up the slack. APA platforms enable the democratization of data by allowing users to create their own analytics and automate business processes without any specialized data science expertise. And the use cases are myriad.

For example, every federal agency has some sort of procurement function, and in many cases those functions rely heavily on manual processes that are inefficient at best. To help streamline them, one federal revenue agency applied the natural language processing tools within the Alteryx APA platform to identify issues with procurement awards, such as unclear or incomplete information, and automatically send them back to be updated. The workflow did this by connecting automatically to the Federal Procurement Data System and USASpending.gov. Alteryx estimated that if all federal agencies used this application, the government as a whole would save 866,000 labor hours each year.
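The article doesn't detail how the agency's workflow screens awards, but the underlying idea can be illustrated with a minimal rule-based sketch in Python. The field names, vague-term list, and thresholds below are hypothetical stand-ins, not the actual FPDS or USASpending.gov schema or Alteryx's implementation:

```python
# Hypothetical rule-based screen for incomplete procurement records --
# a simplified stand-in for the kind of check an NLP workflow automates.
REQUIRED_FIELDS = ["award_id", "vendor_name", "description", "amount"]
VAGUE_TERMS = {"misc", "various", "tbd", "n/a", "other"}

def flag_award(record: dict) -> list[str]:
    """Return a list of issues found in a procurement record."""
    issues = []
    for field in REQUIRED_FIELDS:
        if not str(record.get(field, "")).strip():
            issues.append(f"missing field: {field}")
    desc = str(record.get("description", "")).lower()
    words = desc.split()
    if words and len(words) < 5:
        issues.append("description too short to be meaningful")
    if any(term in words for term in VAGUE_TERMS):
        issues.append("description uses vague terms")
    return issues

record = {"award_id": "A-123", "vendor_name": "",
          "description": "misc services", "amount": 5000}
print(flag_award(record))
```

Records that come back with a non-empty issue list would be routed back to the submitter for correction, which is the "send them back to be updated" step described above.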

Agencies looking to jump into the AI field can also use the Alteryx APA platform to build an ethical foundation. Automating workflows gives data experts more time to address and avoid issues with AI and ML models, like data bias. And Alteryx’s APA platform is specifically designed to be transparent about its processes so that humans can always audit and evaluate the decision-making processes, avoiding “black box” situations.

One of the most prevalent types of data government collects is geospatial. That data is used in situations ranging from military and intelligence operations to weather forecasting, and comes from satellites, drones, mobile phones, and numerous other kinds of sensors. With that many varied types of data, sources and applications, agencies looking to leverage geospatial data need the ability to aggregate diverse types of data in order to use it effectively.

For example, when two hurricanes hit Puerto Rico in 2017, Alteryx’s APA platform provided predictive analytics tools that were used to create damage assessments of more than 150,000 buildings, and prioritize the ones that were the worst affected. That involved aggregating diverse data such as building quality, structures and designs, flood and elevation levels, and wind speed, from sources including FEMA, the U.S. Army Corps of Engineers, the National Weather Service, NOAA and the European Union.
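Prioritizing the worst-affected buildings amounts to combining those diverse factors into a single comparable score and sorting on it. The weights and field names below are purely illustrative, assumed for the sketch rather than drawn from the actual Alteryx model:

```python
# Hypothetical composite damage score: normalized hazard factors
# (flood depth, wind exposure) plus vulnerability (inverse build
# quality), each on a 0-1 scale. Weights are illustrative only.
def damage_score(bldg: dict) -> float:
    return (0.4 * bldg["flood_depth_norm"]
            + 0.4 * bldg["wind_exposure_norm"]
            + 0.2 * (1.0 - bldg["build_quality_norm"]))  # weaker builds score higher

buildings = [
    {"id": "B1", "flood_depth_norm": 0.9, "wind_exposure_norm": 0.8, "build_quality_norm": 0.3},
    {"id": "B2", "flood_depth_norm": 0.2, "wind_exposure_norm": 0.4, "build_quality_norm": 0.9},
]
ranked = sorted(buildings, key=damage_score, reverse=True)
print([b["id"] for b in ranked])  # worst-affected first: ['B1', 'B2']
```

A real assessment would replace the hand-set weights with a trained predictive model, but the aggregation-then-ranking structure is the same.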

The U.S. Construction Indicator is a product of the U.S. Census Bureau, an agency that required an end-to-end solution with a high degree of analytic automation at its core. Working with Reveal Global Consulting and Alteryx, Census created an advanced analytics and visualization capability that reduced reliance on the manual collection and analysis of surveys and uses AI to analyze satellite imagery showing construction activity over time.

The solution collects, filters, formats and aggregates unstructured data from numerous Census and third-party sources, then ingests it into the Alteryx platform. Alteryx provides a unified analytic capability for creating and scheduling workflows that collect satellite imagery and automate its analysis, using geospatial and AI/ML capabilities to validate, compile and produce more accurate and timely construction indicator insights.
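The collect/filter/aggregate pipeline described above can be sketched as a few composed functions. The data, field names, and helper functions here are hypothetical stand-ins for illustration, not the actual Census sources or Alteryx workflow engine:

```python
# Illustrative pipeline mirroring the collect -> filter -> aggregate
# stages. Records and fields are invented for the sketch.
from collections import defaultdict

def collect():
    # Stand-in for pulling detections from Census and third-party sources.
    return [
        {"county": "101", "month": "2023-01", "new_structures": 14, "valid": True},
        {"county": "101", "month": "2023-01", "new_structures": 3, "valid": False},
        {"county": "102", "month": "2023-01", "new_structures": 9, "valid": True},
    ]

def filter_valid(records):
    # Drop detections that failed validation.
    return [r for r in records if r["valid"]]

def aggregate(records):
    # Roll detections up into a per-county monthly indicator.
    totals = defaultdict(int)
    for r in records:
        totals[(r["county"], r["month"])] += r["new_structures"]
    return dict(totals)

indicator = aggregate(filter_valid(collect()))
print(indicator)  # {('101', '2023-01'): 14, ('102', '2023-01'): 9}
```

In a production workflow each stage would be a scheduled step, so the indicator refreshes automatically as new imagery and records arrive.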

These are just a few of the use cases that agencies can get out of a unified analytics platform. Agencies looking to leverage the data they collect can upskill their workforces, gain insights into and automate business processes, and build a foundation for artificial intelligence.
