5 Takeaways from the Federal IT Efficiency Summit
Government and industry leaders explored strategies for optimizing IT operations, reducing costs and enhancing mission effectiveness. Attendees gained insights into cutting-edge technologies, process improvements, acquisition models and policy frameworks that are driving efficiency across federal IT. The summit was a must-attend for agency decision-makers looking to streamline IT modernization and save taxpayers money.
Smarter data sharing practices can prevent improper payments before they happen.
Preventing fraud and improving payment accuracy across federal programs requires proactive data strategies, not reactive fixes, said Rep. Pete Sessions, co-chair of the Congressional Department of Government Efficiency (DOGE) Caucus. Sessions emphasized that greater cross-agency collaboration and information sharing can stop waste before it occurs.
“We want to catch it before it goes out, rather than sending it out and trying to get back the money,” he said. “If we use the knowledge that has been gathered together and share it across government before these payments are made, we will be able to know that … it works.”
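In practice, the pre-payment control Sessions describes amounts to screening each outgoing payment against records other agencies already hold before any money moves. The Python sketch below is a minimal illustration of that idea; the shared datasets, identifiers and hold rules are hypothetical, not a depiction of any actual federal system.

```python
from dataclasses import dataclass

@dataclass
class Payment:
    recipient_id: str  # hypothetical cross-agency identifier
    amount: float
    program: str

# Hypothetical shared datasets another agency might expose pre-payment:
# deceased individuals, debarred recipients and duplicate claims.
DECEASED = {"P-1001"}
DEBARRED = {"P-2002"}
ALREADY_PAID = {("P-3003", "disaster-relief")}

def screen(payment: Payment) -> list[str]:
    """Return reasons to hold a payment; an empty list means release it."""
    holds = []
    if payment.recipient_id in DECEASED:
        holds.append("recipient listed as deceased in shared records")
    if payment.recipient_id in DEBARRED:
        holds.append("recipient appears on a shared debarment list")
    if (payment.recipient_id, payment.program) in ALREADY_PAID:
        holds.append("duplicate claim for the same program")
    return holds

for p in [Payment("P-1001", 1200.0, "disaster-relief"),
          Payment("P-9999", 800.0, "disaster-relief")]:
    reasons = screen(p)
    print(p.recipient_id, "HOLD:" if reasons else "RELEASE", "; ".join(reasons))
```

The point of the design is the one Sessions makes: the check runs before disbursement, so recovery after the fact is never needed.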
Sessions underscored that government efficiency is a bipartisan priority and advocated for dismantling organizational silos that hinder communication and reform. He also praised efforts to inject private-sector expertise into public operations through the DOGE Caucus, calling it a “team effort” that includes federal and industry leaders alike.
“These are very savvy, bright people … We are challenging them, and they are challenging us,” he said. “We cannot become siloed. We have to talk to each other. We have to learn from each other.”
Strengthening the federal digital backbone is key to scaling AI and emerging tech.
As federal agencies adopt more data-intensive and AI-driven solutions, the underlying network infrastructure must be agile, secure and built for scale, said Michael Adams, vice president of public sector federal, civilian and national markets at Verizon.
“Alone we can do so little, together we can do so much,” he said, quoting Helen Keller to emphasize the importance of public-private collaboration in strengthening federal IT infrastructure. “Change is constant. The technologies we see emerging today are the technologies we will all be immersed with tomorrow.”
Verizon is prioritizing cloud-native platforms and AI-optimized network services tailored to evolving government needs. Offerings like Verizon Cloud and AI Connect are designed to securely support multi-cloud environments and high-demand AI workloads with dedicated infrastructure. SD-WAN services also allow agencies to manage high traffic volumes dynamically and securely, reducing reliance on legacy systems.
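The core SD-WAN technique behind that kind of dynamic traffic management is steering each class of traffic over the healthiest available link based on live metrics. Below is a toy Python sketch of that idea; the link names, metrics and thresholds are invented and do not represent Verizon's products or APIs.

```python
# Toy version of SD-WAN path selection: pick a transport per application
# class from current link health. All values here are invented.
LINKS = {
    "mpls":      {"latency_ms": 18, "loss_pct": 0.1, "utilization_pct": 85},
    "broadband": {"latency_ms": 35, "loss_pct": 0.5, "utilization_pct": 40},
    "lte":       {"latency_ms": 60, "loss_pct": 1.2, "utilization_pct": 20},
}

def pick_link(app_class: str) -> str:
    """Steer traffic per application class using current link metrics."""
    if app_class == "voice":
        # Latency-sensitive traffic: best latency among low-loss links.
        candidates = {n: m for n, m in LINKS.items() if m["loss_pct"] < 1.0}
        return min(candidates, key=lambda n: candidates[n]["latency_ms"])
    # Bulk traffic: prefer the least-utilized link.
    return min(LINKS, key=lambda n: LINKS[n]["utilization_pct"])

print(pick_link("voice"))  # mpls: lowest latency among low-loss links
print(pick_link("bulk"))   # lte: least utilized link
```

Because the policy reads live metrics rather than static routes, the same logic keeps working as traffic shifts, which is what reduces the reliance on fixed legacy circuits.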
“Every agency is in a different position,” he said. “But if we can work together to strengthen the digital backbone that supports each government agency … we’ll all be in a better place.”
The SHARE IT Act promises major savings by ending duplicative software development.
The SHARE IT Act is set to take effect this month. The law requires agencies to share custom-developed source code and associated metadata, reducing duplication and enabling massive cost savings across government.
“Reusing that code … has high potential for saving a ton of money,” said Remy DeCausemaker, open source lead at the Centers for Medicare & Medicaid Services (CMS). He noted that reusing just 1% of existing code could save $5 million — and scaling that across agencies could result in hundreds of millions in savings.
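The arithmetic behind those figures is easy to reproduce. A back-of-the-envelope sketch, where the agency count is an assumption added purely for illustration:

```python
# Back-of-the-envelope version of DeCausemaker's figure: if reusing 1% of
# existing code saves $5M at one agency, the implied value of the full
# codebase is $500M, and reuse across agencies compounds quickly.
savings_per_percent = 5_000_000  # $5M per 1% of code reused (from the talk)
implied_codebase_value = savings_per_percent * 100
print(f"Implied codebase value: ${implied_codebase_value:,}")  # $500,000,000

# Assumption for illustration: 24 CFO Act agencies each reusing 1%.
agencies = 24
print(f"Government-wide at 1% reuse: ${savings_per_percent * agencies:,}")
# $120,000,000, so a few percentage points of reuse reaches the
# "hundreds of millions" DeCausemaker describes.
```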
CMS is leading by example, aligning its acquisition policies with the new law and adopting industry best practices from CISA, NIST and the Open Source Security Foundation to securely reuse and manage source code.
“CMS is the first federal agency to have that private-sector style approach to open source,” DeCausemaker said, adding that 96% of commercial software already contains open-source components. “That represents about $8.8 trillion of value if we had to rewrite that from scratch.”
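What the shared “associated metadata” might look like is easiest to show concretely. The record below is a minimal sketch loosely modeled on the code.json inventories agencies already publish to code.gov under the Federal Source Code Policy; the fields and values are illustrative, not the schema the SHARE IT Act prescribes.

```python
import json

# Illustrative metadata record for one shared custom-code repository,
# loosely modeled on the code.json inventory convention. The project
# name, URL and contact are hypothetical.
release = {
    "name": "eligibility-rules-engine",
    "description": "Rules engine for benefits eligibility checks.",
    "repositoryURL": "https://git.example.gov/agency/eligibility-rules-engine",
    "vcs": "git",
    "permissions": {
        "usageType": "openSource",
        "licenses": [{"name": "Apache-2.0"}],
    },
    "tags": ["eligibility", "reuse"],
    "contact": {"email": "opensource@agency.example.gov"},
}

# Serialize to the machine-readable form other agencies can index.
print(json.dumps(release, indent=2))
```

Metadata like this is what makes reuse practical: another agency can discover the code, confirm its license and contact the maintainers before writing anything from scratch.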
Modernizing procurement requires interoperability, cultural agility and scalable success.
Federal agencies are rethinking how they approach acquisition, modernizing data practices and simplifying procurement processes to drive greater efficiency. As the new administration pushes to evolve federal acquisition frameworks, federal leaders are moving away from one-off purchases toward mission-wide, scalable investments.
“One of the biggest mistakes … is they think of it individually, like they’re buying something for them,” said DOD Principal Deputy CIO Leslie Beavers. “We need to look holistically.”
A key challenge is the lack of interoperability across acquisition systems. Agencies are tackling it by piloting data dictionaries and improving data quality across platforms, efforts leaders said could accelerate procurement timelines and reduce duplication.
“Data is one of the unsung heroes … the deliberate effort that is required to make data interoperable [is huge],” said Beavers.
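A data dictionary pilot of the kind Beavers describes is, at bottom, an agreed-upon schema that every acquisition system validates records against before exchanging them. A minimal Python sketch, with field names and codes invented for illustration:

```python
# Minimal sketch of a shared data dictionary: systems agree on field
# names, types and allowed values, then validate records before exchange.
# Field names and codes here are invented for illustration.
DATA_DICTIONARY = {
    "contract_id": {"type": str, "required": True},
    "vendor_uei": {"type": str, "required": True},  # unique entity ID
    "obligated_amount": {"type": float, "required": True},
    "vehicle_type": {"type": str, "required": False,
                     "allowed": {"IDIQ", "BPA", "standalone"}},
}

def validate(record: dict) -> list[str]:
    """Return interoperability errors; an empty list means the record conforms."""
    errors = []
    for field, rule in DATA_DICTIONARY.items():
        if field not in record:
            if rule["required"]:
                errors.append(f"missing required field: {field}")
            continue
        value = record[field]
        if not isinstance(value, rule["type"]):
            errors.append(f"{field}: expected {rule['type'].__name__}")
        elif "allowed" in rule and value not in rule["allowed"]:
            errors.append(f"{field}: {value!r} not in {rule['allowed']}")
    return errors

print(validate({"contract_id": "C-123", "vendor_uei": "ABC123DEF456",
                "obligated_amount": 250000.0, "vehicle_type": "IDIQ"}))  # []
```

The deliberate effort Beavers describes lives in agreeing on the dictionary itself; once systems share it, records can move between platforms without manual reconciliation.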
AI risk management requires sector-specific assessments to protect against new vulnerabilities.
Agencies are turning their attention to risk, resilience and responsible scaling amid AI development. IT officials emphasized the need for stronger data protections and cross-sector vulnerability assessments to prepare for emerging threats.
“With AI being embedded in so many of our tools and technologies, we are already finding that agencies are not able to really provide the additional protections for their data and our personally identifiable information,” said Jennifer Franks, director of GAO’s Center for Enhanced Cybersecurity.
GAO is conducting a comprehensive tech assessment to evaluate risks across 16 critical infrastructure sectors. Each sector, Franks said, will require distinct policy and security metrics, but their data will inevitably intersect, raising complex challenges around privacy and AI model transparency.
“A transportation sector is going to have different metrics than an energy sector,” she said. “However, they are going to intersect … based on the customers and the personally identifiable information that is built into those large language models.”
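Franks' point can be made concrete with a small sketch: give each sector its own metric set, then compute where two sectors' assessments overlap. The sector names below match real critical infrastructure sectors, but the metrics and weights are invented for illustration and are not GAO's assessment framework.

```python
# Toy illustration of sector-specific AI risk metrics: each sector gets
# its own weighted metric set, but assessments must also flag the data
# categories sectors share, such as PII absorbed into the same models.
# Metric names and weights are invented for illustration.
SECTOR_METRICS = {
    "transportation": {"model_drift": 0.3, "sensor_spoofing": 0.4,
                       "pii_exposure": 0.3},
    "energy":         {"model_drift": 0.2, "grid_misforecast": 0.5,
                       "pii_exposure": 0.3},
}

def shared_categories(a: str, b: str) -> set[str]:
    """Metric categories two sectors have in common, where assessments intersect."""
    return set(SECTOR_METRICS[a]) & set(SECTOR_METRICS[b])

print(shared_categories("transportation", "energy"))
# {'model_drift', 'pii_exposure'}: overlapping exposure that a purely
# sector-by-sector assessment would miss.
```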