HHS Officials Push Past Legacy IT to Strengthen Care Delivery

HHS leaders said modern EHRs, AI security tools and interoperable data will overcome legacy challenges and support better care nationwide.

4m read

IHS CIO Mitchell Thornbrugh speaks at the GovCIO Media & Research Health IT Summit on Sep. 23, 2025 in Bethesda, Maryland. Photo Credit: Invision Events

Department of Health and Human Services (HHS) leaders are phasing out decades-old technology and adopting modern systems — from AI-powered cybersecurity to next-generation electronic health records — to improve patient care and strengthen data-driven health research, officials said Tuesday at GovCIO Media & Research’s Health IT Summit in Bethesda, Maryland.

“Ultimately, what we really want is a system that matches what’s available to the American public, so that we’re not lagging behind and using tools that are decades old,” Indian Health Service (IHS) CIO and Office of Information Technology (OIT) Director Mitchell Thornbrugh said during the summit.

Thornbrugh noted that IHS’s EHR modernization is central to improving care for millions of patients nationwide, many of them in rural regions.

“[When] we looked at our technology and what we need to operate these health care systems, it became apparent not only that we need the tools, but we need this to work for the folks in the field, because they’re very disconnected, they’re very far away,” said Thornbrugh. “We focused on governance first. That’s an interesting way to talk about technology, is to talk about governance.”

The new EHR – known as PATH, for “patients at the heart” – aims to unify 200 separate databases into a sustainable, scalable enterprise solution for providers and patients, he added. The previous EHR is a fragile, decades-old system whose dispersed databases create significant technology debt, Thornbrugh said: developers spend more time fixing things than implementing new features, and thousands of IT professionals in remote areas must repeat the same work.

“We need a platform that can grow with us and be sustainable into the future. And that comes from technology, but it’s also the way we govern it, the way it can scale up and down with the changes in health care,” he said.

Hacking Security at ARPA-H

Andrew Carney, a program manager for resilient systems at ARPA-H, said that his background “hacking into systems of interest to the U.S. government” informs his work to bolster the way health care secures IT infrastructure.

“Most of what hacking really is, I’ll let you in on a secret, is just understanding a system really, really, really well, validating the assumptions of every developer and designer and customer and user of that system,” Carney said Tuesday. “At ARPA-H, my approach to changing the way that we secure medical devices, specifically legacy devices, in hospitals, in health care environments and software has been informed by nearly 20 years of breaking systems in this very sort of detail-oriented, focused way.”

Carney said that ARPA-H is working on two key initiatives to better secure health IT systems. The first involves leveraging large language models in a “trustworthy, consistent way” to analyze software and hardware.

“We use program analysis, formal methods and traditional, deterministic techniques to validate [the] probabilistic AI assertions and hypotheses,” Carney said.
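Carney’s description suggests a two-stage pattern: a language model proposes candidate findings, and deterministic analysis confirms or rejects each one before it is reported. The sketch below illustrates that pattern in Python under stated assumptions; the `llm_propose_hypotheses` stub and the finding format are invented for illustration and are not ARPA-H’s actual tooling.

```python
# A minimal sketch of the hybrid pattern Carney describes: a probabilistic
# model proposes vulnerability hypotheses, and a deterministic static-analysis
# pass either confirms or rejects each one. The LLM call is stubbed out; the
# function name and finding format are illustrative assumptions.
import ast

def llm_propose_hypotheses(source: str) -> list[dict]:
    """Stand-in for an LLM that flags suspicious call sites."""
    # A real system would query a model here; we return a canned hypothesis.
    return [{"function": "eval", "reason": "possible code injection"}]

def confirm_with_static_analysis(source: str, hypothesis: dict) -> bool:
    """Deterministically verify the flagged call actually appears in the AST."""
    tree = ast.parse(source)
    return any(
        isinstance(node, ast.Call)
        and isinstance(node.func, ast.Name)
        and node.func.id == hypothesis["function"]
        for node in ast.walk(tree)
    )

code = "user_input = input()\nresult = eval(user_input)\n"
for hyp in llm_propose_hypotheses(code):
    if confirm_with_static_analysis(code, hyp):
        print(f"confirmed: {hyp['function']} ({hyp['reason']})")
    else:
        print(f"rejected: {hyp['function']} (model hallucination?)")
```

The deterministic check acts as a gate: only hypotheses the traditional analysis can reproduce survive, which is one way to use probabilistic models in the “trustworthy, consistent way” Carney describes.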

He added that his team showcased AI’s ability to secure systems efficiently at DEF CON in Las Vegas earlier this year, analyzing “over 54 million lines of code” in the process.

“[We did this] using this approach, for a few hundred dollars per zero-day vulnerability that was discovered,” Carney said. “This was incredible. Folks were very skeptical that we could do this, especially sort of at that price point and at that level of complexity.”

ARPA-H’s second initiative aims to create digital twins for clinical IT environments. Carney said that ARPA-H is on the cusp of launching its UPGRADE program, which aims to build a vendor-agnostic, high-fidelity simulation that goes down to the level of a device’s hardware and software.

“[UPGRADE] is a vendor-agnostic, facility-agnostic approach to characterizing [an environment] and then lifting it into a high-fidelity simulation that goes down into the hardware and software so you can actually test new products, maybe agentic AI-based,” Carney said. “You can test all that software in a high-fidelity environment that matches your needs and your specific sort of deployments.”
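As a rough illustration of the digital-twin concept, the toy Python sketch below models each device’s hardware and software state and tests a candidate change in simulation rather than on a live clinical network. Every class, field and check here is a hypothetical stand-in, not the UPGRADE program’s design.

```python
# Toy digital-twin sketch: clone a facility's device inventory into objects,
# apply a candidate update in simulation, and run a safety check before
# anything touches the real network. All names and checks are invented.
from dataclasses import dataclass, field

@dataclass
class DeviceTwin:
    name: str
    firmware: str
    open_ports: set[int] = field(default_factory=set)

@dataclass
class FacilityTwin:
    devices: list[DeviceTwin]

    def test_patch(self, device_name: str, new_firmware: str) -> bool:
        """Apply a candidate update in simulation and run a safety check."""
        for dev in self.devices:
            if dev.name == device_name:
                dev.firmware = new_firmware
                # Stand-in check: a real twin would replay traffic, fuzz
                # interfaces and validate clinical workflows end to end.
                return 23 not in dev.open_ports  # e.g., telnet must be closed
        return False

twin = FacilityTwin([DeviceTwin("infusion-pump-01", "v1.2", {80, 23})])
print(twin.test_patch("infusion-pump-01", "v1.3"))  # False: unsafe port open
```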

Breaking Down Data Silos to Accelerate Discovery

Sweta Ladwa, chief of the Scientific Solutions Delivery Branch at the National Heart, Lung, and Blood Institute (NHLBI) at the NIH, said that making siloed data interoperable is critical to advancing health research.

“For years, researchers have struggled with fragmented health data,” Ladwa said. “We can’t easily compare or combine them, and scientists lose precious time cleaning data instead of making discoveries.”

She explained that potentially life-saving discoveries are often locked away in silos because data from different studies is fragmented and cannot be easily compared.

Ladwa said that her team started a pilot project that selected approximately 150 variables across nine longitudinal cohort studies involving more than 200,000 participants and various data types, from clinical and genomic to multimodal imaging like ECGs and MRIs. Her team, she said, built a “BioData Catalyst harmonized model” to serve as a blueprint for making data interoperable. As a result, she said, researchers could finally ask multiple studies about a variable — like which participants are taking calcium channel blockers — and get a single, definitive answer.

“We created a tool chain powered by LinkML, a standards ontology-based converter box to transform messy inputs into clean, comparable outputs, and we tested it, for example, unifying all those blood pressure medication responses into one harmonized variable,” she said.
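The core move Ladwa describes, mapping each study’s local variable names onto one shared vocabulary so a single query spans all cohorts, can be sketched in a few lines of Python. The study abbreviations, raw field names and mapping table below are invented for illustration; the actual NHLBI pipeline is schema-driven via LinkML.

```python
# Illustrative harmonization sketch: per-study variable names and codings are
# mapped onto one shared variable so one question works across cohorts.
# The studies, fields and mapping here are hypothetical stand-ins.
HARMONIZATION_MAP = {
    # (study, raw_field): harmonized field
    ("FHS",  "bp_med_ccb"):    "calcium_channel_blocker_use",
    ("MESA", "htnmed_type_4"): "calcium_channel_blocker_use",
    ("ARIC", "med_class_C08"): "calcium_channel_blocker_use",
}

def harmonize_record(study: str, record: dict) -> dict:
    """Rename study-specific fields to the shared, harmonized vocabulary."""
    out = {}
    for raw_field, value in record.items():
        out[HARMONIZATION_MAP.get((study, raw_field), raw_field)] = value
    return out

# One question, one answer, across formerly incompatible studies:
records = [
    ("FHS",  {"bp_med_ccb": 1}),
    ("MESA", {"htnmed_type_4": 0}),
]
takers = [s for s, r in records
          if harmonize_record(s, r).get("calcium_channel_blocker_use") == 1]
print(takers)  # ['FHS']
```

In the real toolchain, a LinkML schema plays the role of this mapping table, so the crosswalk lives in a validated, shareable model rather than in ad hoc code.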

“Phase two is about scaling,” she added, with a focus on making harmonized data available as soon as it is ingested. Ladwa said that AI and automation are critical to this next phase.

“Harmonization happens faster with less manual work, but always with a human in the loop,” Ladwa said. “Think of it as moving from a handcrafted operation to a safe and secure, AI-driven scaling system capable of adapting to new studies, new variables and new research needs.”

She added that she looks forward to a third phase, in which harmonization becomes “the invisible backbone of biomedical discovery,” making breakthroughs easier and patient care more effective.

“Communities of scientists can extend the model themselves, customizing it for their own research needs, and longitudinal data tracking people across decades [becomes] searchable and comparable at the click of a button,” she said. “This isn’t just convenience. It accelerates science. It means faster answers about heart disease, COVID recovery, chronic illness, obesity, sleep disorders and more.”

Technology, Ladwa noted, is critical to health care not because of the tech itself, but because of the innovation that pushes patient care forward.

“Every hour we save in data wrangling is an hour gained for science and for patients waiting on the answers,” said Ladwa.
