Data Partnerships Essential for AI Development
Federal agencies are focusing on data interoperability and knowledge-sharing partnerships to support increasingly advanced research initiatives.

Federal agencies are building knowledge- and data-sharing partnerships as the foundation for advancing applied artificial intelligence capabilities.
Speaking at the GovernmentCIO Media & Research AI Gov: Mastering Data event, representatives from the National Institutes of Health (NIH), the Department of Energy (DOE) and the Department of Veterans Affairs (VA) discussed how their agencies are advancing research projects by refining both their data-processing capabilities and their cross-organizational collaboration.
These agencies have taken a leading role in applying AI to their central missions, with NIH using expanded data-processing capacity to deepen research into the pathology and treatment of various conditions, including how to better manage severe COVID-19 infections.
“It has broadened the dimensionality of our data analysis capability, enabling a more comprehensive approach to understanding and managing diseases and conditions instead of the traditional siloed, organ-specific way. And, for example, because COVID-19 affects many organ systems, the Medical Imaging and Data Resource Center could enable clinical decision support tools to diagnose and monitor both acute COVID and long COVID cases in a systematic way,” said Qi Duan, program director at the NIH Division of Health Informatics Technologies.
While advanced data projects have supported diagnostic tools, partnerships within and across agencies to analyze large quantities of data have also been essential for understanding the pathologies and potential complications of a novel virus like COVID-19. These have included clinical data-sharing partnerships as well as the sharing of expertise and high-performance computing resources with partner agencies.
“Part of what is very exciting about the VA is that it’s a learning health system with research nested within it. That includes investigators who have expertise in statistical analysis and AI, a great deal of expertise in clinical epidemiology and clinical research, as well as in how that data is recorded. Bringing together people who had worked on questions before enabled us to much more rapidly address these issues in the present,” said Dr. Amy Justice, a professor of public health at Yale University and a leading Veterans Affairs clinical epidemiologist. “We had already reached out and begun a very close collaboration with Department of Energy investigators using high-performance computing and their in-house expertise on AI, and began to develop that language so that when COVID hit, we already had a collaboration that was well in place.”
This focus on data- and knowledge-sharing partnerships extends across VA as a whole. Much of it centers on tech sprints overseen by VA’s National Artificial Intelligence Institute (NAII), including an upcoming initiative to explore the application of AI in the diagnosis and management of lung cancer.
“In the next tech sprint, we will pursue an imaging technology for accelerating or improving the accuracy of lung cancer scans that are routinely performed for either screening or diagnosing lung cancer in the veteran population. For a number of reasons, lung cancer is such a heavy burden. It is a high-mortality cancer that disproportionately affects our veteran population, for instance, and there are a number of mature technologies behind it. Our intent is that we would like to provision a safe environment computationally within the VA where we can invite outside collaborators to pitch solutions,” said Rafael Fricks, lead AI tech sprints coordinator at VA.
Beyond public health and health care, federal agencies have begun applying AI capabilities to infrastructure resilience programs. This effort has been spearheaded in large part by the Department of Energy, which is using high-performance computing and AI models developed in partnership with its Oak Ridge National Laboratory to gauge the strain that natural disasters and extreme weather events might place on America’s vital infrastructure. These initiatives require large quantities of data to be shared among localities and organizations, and they have shown promise for addressing issues like supply chain disruptions.
“The other thing that AI really opens up is the ability to look at what we call the system scale. A lot of companies, organizations, cities and governments have the ability to look very effectively at local optimization. I can manage my own fleet, I can control my own traffic, and things like that. But at the scale of AI and with the computational power that we have today, we can really look at the whole system. We can look at the impact of the Port of Long Beach on the manufacturing supply chain for some of the largest companies in the United States in ways that no single entity has the ability to do, and that will require a lot of data sharing,” said David Womble, AI program director at the Oak Ridge National Laboratory.