Data Initiatives at DHS Helped Deliver Vaccines to Frontline Workers
Data sharing and interoperability are key for data quality and governance.
The Department of Homeland Security is prioritizing artificial intelligence, machine learning and data sharing to support data analytics for mission delivery — an effort that enabled the department to roll out COVID-19 vaccines rapidly to frontline workers earlier this year.
Mike Horton, the newly appointed chief data officer at DHS, said the department’s contribution to the Vaccinate Our Workforce (VOW) program was possible because of the department’s data analytics and interoperability initiatives.
“When vaccinations were a big priority for the administration, especially frontline workforce, we found ourselves in a situation where we could impact one portion of that: provide data on the people who got vaccinated,” Horton said during an ATARC webinar earlier this month. “We sorted all that data and found it came from a lot of different places. It helped us understand how disparate the data is in different components. … That coalition of data and managing that data on a daily basis to update those lists and get our employees vaccinated as quickly and efficiently as we could, working with multiple agencies across government, really highlighted how good data, (when) shared, can serve the people.”
DHS’ immigration data domain is another success story for data analytics and interoperability.
“That immigration data domain for us at DHS is the most secure and productive of the domains we have,” Horton said. “Getting that story told, and helping people understand that a culture of sharing data and pushing data up to provide product and decision-making for leadership, helps the mission.”
Damian Kostiuk, chief of the data analytics division at U.S. Citizenship and Immigration Services, said that while the agency is deploying AI and machine learning for data analytics, internal culture is the top priority for the quality and governance of that data.
“Trying to get trust and culture change inside your organization really bolsters all of that,” he said during the ATARC webinar. “One of the key things when we get to automation, you’re really talking about going from a group of people who are producing widgets tens per hour, to thousands per hour (which creates a greater likelihood of mistakes). In order to buffer that, you need to have extremely fast recognition of data-quality errors. It goes to the public trust component of what you’re doing.”
Data sharing and strong interoperability standards are also critical for improved data quality and governance, he added.
“If you’ve got a thousand mistakes that can be made very quickly because your data quality goes really bad, if you have different siloed data sets all over the place, how hard is that going to be to correct the data quickly to go back to your operations? It’s really difficult,” he said. “You want to have a single space, and that’s part of that culture change.”
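The pattern Kostiuk describes, catching data-quality errors at the same speed the automation produces records, can be sketched in a few lines. The example below is purely illustrative: the field names, status values, and validation rules are hypothetical and are not drawn from any DHS or USCIS system.

```python
# Illustrative sketch only: a minimal data-quality gate of the kind
# Kostiuk describes, where automated throughput demands automated
# error detection. All field names and rules are hypothetical.
from dataclasses import dataclass, field
from datetime import date


@dataclass
class QualityReport:
    checked: int = 0
    errors: list = field(default_factory=list)


def validate_record(record: dict) -> list:
    """Return a list of human-readable problems found in one record."""
    problems = []
    if not record.get("employee_id"):
        problems.append("missing employee_id")
    status = record.get("vaccination_status")
    if status not in {"vaccinated", "unvaccinated", "unknown"}:
        problems.append(f"unrecognized vaccination_status: {status!r}")
    reported = record.get("reported_date")
    if reported and reported > date.today():
        problems.append("reported_date is in the future")
    return problems


def quality_gate(records) -> QualityReport:
    """Validate every record before it enters the shared data set,
    so errors surface immediately instead of spreading across silos."""
    report = QualityReport()
    for rec in records:
        report.checked += 1
        for problem in validate_record(rec):
            report.errors.append((rec.get("employee_id"), problem))
    return report


# Example: two records, one clean and one with a bad status value.
batch = [
    {"employee_id": "A100", "vaccination_status": "vaccinated",
     "reported_date": date(2021, 3, 1)},
    {"employee_id": "A101", "vaccination_status": "done",
     "reported_date": date(2021, 3, 2)},
]
report = quality_gate(batch)
print(f"checked {report.checked}, found {len(report.errors)} error(s)")
for emp, problem in report.errors:
    print(f"  {emp}: {problem}")
```

The design choice mirrors the "single space" Kostiuk calls for: every record passes through one checkpoint on its way into the shared store, so a bad value is flagged once, at intake, rather than reconciled later across siloed copies.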