
FDA Eyes Global AI Partnerships to Safeguard Patient Data

The Center for Devices and Radiological Health director warns Congress of national security implications if the U.S. restricts AI development in medical settings.

Securing medical devices starts with ensuring security in the design phase. Photo Credit: PeopleImages.com - Yuri A/Shutterstock

The Food and Drug Administration (FDA) is embracing artificial intelligence and working with international allies and partners to safeguard patient data and ensure the integrity of medical devices, agency experts told lawmakers at last week's Health Subcommittee hearing on FDA Regulation of Drugs, Biologics and Devices.

Center for Devices and Radiological Health (CDRH) Director Jeff Shuren said FDA has addressed cybersecurity vulnerabilities by working with industry partners to update cybersecurity requirements for medical devices in compliance with the Consolidated Appropriations Act of 2022.

Despite this progress, CDRH has found that laboratory-developed tests — in vitro diagnostic products that are designed, manufactured and used within a single clinical laboratory — remain vulnerable to cybersecurity threats.

Greater oversight of these tests is critical to ensuring the continued development of safe and effective medical devices, Shuren added.

“We monitor several [vulnerabilities] at any given time, but there is still a weakness in laboratory developed tests,” Shuren said. “We have put out communications where we found vulnerabilities in platforms being used by non-labs and labs, but we only found out about it because it was used by non-labs. We made the manufacturer tell the labs, otherwise they would’ve never known.”

CDRH is enhancing efforts to ensure that patches reach hospitals' biomedical departments and their service suppliers. Shuren said that working closely with industry partners is key to implementing necessary security measures and patching vulnerabilities in a timely manner.

“This begins with designing devices in a way that allows them to be patchable,” Shuren said. “That’s what we work on with companies to assure that they’ve got the right measures in place. Then as we learn about problems and patches, we help yield that out.”

Center for Drug Evaluation and Research (CDER) Director Patrizia Cavazzoni emphasized the need for greater transparency within the supply chain to hold stakeholders accountable and address impending shortages in a timely manner.

“We would welcome having more authorities that would allow us to have greater transparency on the supply chain,” Cavazzoni said. “Greater transparency on the supply chain is certainly a tool that would really add to our limited toolbelt, so far.”

Shuren also noted that there are national security implications if the U.S. restricts AI development, emphasizing the need to embrace emerging technologies to remain the global leader in innovation. He said it’s important for FDA to facilitate AI development for medical systems in ways that are both safe and effective for patients.

He added that FDA is working with international partner governments to avoid duplicative requirements for companies marketing AI-enabled devices.

“Much of our work for international harmonization on AI occurs in a group called the International Medical Device Regulators Forum,” Shuren said. “Typically in the AI space and digital health, when there are needs for changes in policies, we not only start here in the U.S., but we also take it to this group because all the countries are struggling with the same issues.”

Through these international collaboration efforts, the group is now working on a globally harmonized policy on the lifecycle management approach for AI medical devices, Shuren said.

“We are trying to move [innovation] through as rapidly as we can while maintaining our gold standard,” added Center for Biologics Evaluation and Research Director Peter Marks. “We are always trying to do better, and that’s a commitment we have.”

Beyond collaboration with industry partners and international allies, Shuren also emphasized government's own role in safely harnessing AI technologies and identifying cybersecurity vulnerabilities.

“At the end of the day, if we want industry to be innovative and get innovations to people who need them, government has to be innovative too,” Shuren said.
