Data Initiatives at DHS Helped Deliver Vaccines to Frontline Workers
Data sharing and interoperability are key to data quality and governance.
The Department of Homeland Security is prioritizing artificial intelligence, machine learning and data sharing to support data analytics for mission delivery — an effort that enabled the department to roll out COVID-19 vaccines rapidly to frontline workers earlier this year.
Mike Horton, the newly appointed chief data officer at DHS, said the department's contribution to the Vaccinate Our Workforce (VOW) program was possible because of its data analytics and interoperability initiatives.
“When vaccinations were a big priority for the administration, especially frontline workforce, we found ourselves in a situation where we could impact one portion of that: provide data on the people who got vaccinated,” Horton said during an ATARC webinar earlier this month. “We sorted all that data and found it came from a lot of different places. It helped us understand how disparate the data is in different components. … That coalition of data and managing that data on a daily basis to update those lists and get our employees vaccinated as quickly and efficiently as we could, working with multiple agencies across government, really highlighted how good data, (when) shared, can serve the people.”
DHS’ immigration data domain is another success story for data analytics and interoperability.
“That immigration data domain for us at DHS is the most secure and productive of the domains we have,” Horton said. “Getting that story told, and helping people understand that a culture of sharing data and pushing data up to provide product and decision-making for leadership, helps the mission.”
Damian Kostiuk, chief of the data analytics division at U.S. Citizenship and Immigration Services, said that while the agency is deploying AI and machine learning for data analytics, internal culture is the top priority for quality and governance of that data.
“Trying to get trust and culture change inside your organization really bolsters all of that,” he said during the ATARC webinar. “One of the key things when we get to automation, you’re really talking about going from a group of people who are producing tens of widgets per hour, to thousands per hour (which creates a bigger likelihood of making a mistake). In order to buffer that, you need to have extremely fast recognition of data-quality errors. It goes to the public trust component of what you’re doing.”
Data sharing and strong interoperability standards are also critical for improved data quality and governance, he added.
“If you’ve got a thousand mistakes that can be made very quickly because your data quality goes really bad, if you have different siloed data sets all over the place, how hard is that going to be to correct the data quickly to go back to your operations? It’s really difficult,” he said. “You want to have a single space, and that’s part of that culture change.”