Federal AI Infrastructure Requires a Smarter Foundation
Federal AI depends on a smarter infrastructure, from managing environmental impacts to improving data quality and workforce readiness.
As federal agencies accelerate AI adoption in mission-critical systems, a successful AI infrastructure requires far more than the right algorithms, top IT officials explained during GovCIO Media & Research’s AI FedLab in Reston, Virginia, on Tuesday.
From energy consumption to data readiness and workforce adoption, building an effective and sustainable AI ecosystem means addressing foundational issues holistically.
Environmental and Infrastructure Costs of Generative AI
The rise of large language models and generative AI introduces significant environmental considerations. The Government Accountability Office (GAO) released a new report in April highlighting the human and environmental costs of AI.
Kevin Walsh, director of Information Technology and Cybersecurity at GAO, emphasized the rapid pace at which generative AI systems consume and wear down hardware infrastructure.
“These large language models and generative AI chew through the hardware faster than I think people are realizing,” said Walsh.
The report features several recommendations as government continues to leverage generative AI, including disclosing energy and water usage, developing mechanisms to assess ongoing impacts and establishing guidelines for the responsible development and deployment of generative AI.
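To make the disclosure recommendation concrete, a rough per-query estimate can be assembled from a deployment’s power draw and the facility’s efficiency ratings. The sketch below is a back-of-the-envelope illustration only; every figure in it (GPU power, seconds per query, PUE, WUE) is an assumed placeholder, not a GAO-reported value.

```python
# Back-of-the-envelope estimate of energy and water per generative AI query.
# All figures below are illustrative assumptions, not GAO-reported numbers.

GPU_POWER_KW = 0.7        # assumed average draw of one accelerator, in kW
SECONDS_PER_QUERY = 2.0   # assumed GPU-seconds to serve one query
PUE = 1.3                 # assumed data center Power Usage Effectiveness
WUE_L_PER_KWH = 1.8       # assumed Water Usage Effectiveness, liters per kWh

def per_query_footprint(queries: int) -> tuple[float, float]:
    """Return (energy in kWh, water in liters) for a batch of queries."""
    gpu_kwh = GPU_POWER_KW * SECONDS_PER_QUERY / 3600 * queries
    facility_kwh = gpu_kwh * PUE              # include cooling/overhead via PUE
    water_liters = facility_kwh * WUE_L_PER_KWH
    return facility_kwh, water_liters

if __name__ == "__main__":
    kwh, liters = per_query_footprint(1_000_000)  # one million queries
    print(f"Energy: {kwh:,.0f} kWh, Water: {liters:,.0f} L")
```

Even with different assumed figures, the same simple accounting (hardware draw, scaled by facility overhead and water intensity) is what an agency would need to publish to satisfy a usage-disclosure requirement.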
Data Is the ‘Core’ of AI Readiness
IT leaders stressed the importance of data quality, accessibility and governance as critical pillars of AI success. Jonathan Alboum, federal CTO at ServiceNow, said that simply implementing AI is not enough; agencies must have strong data foundations.
“If we don’t have strong data at the core of our systems, we don’t understand it, we don’t use the right models, we don’t have that sort of ethos around data management, we’re never going to get the results that we expected,” said Alboum.
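One concrete form that “strong data at the core” can take is an automated quality gate that audits records before they ever reach a model. The sketch below is a minimal, hypothetical illustration of that idea; the field names, key column and sample records are invented for the example.

```python
# Minimal data-quality gate of the kind an agency might run before
# feeding records to a model. Field names and records are hypothetical.
from dataclasses import dataclass

@dataclass
class QualityReport:
    total: int
    missing_fields: int
    duplicates: int

    @property
    def usable_fraction(self) -> float:
        return 1 - (self.missing_fields + self.duplicates) / max(self.total, 1)

def audit(records: list[dict], required: tuple[str, ...], key: str) -> QualityReport:
    """Count records with missing required fields or duplicate keys."""
    missing = sum(1 for r in records if any(not r.get(f) for f in required))
    seen: set = set()
    dupes = 0
    for r in records:
        k = r.get(key)
        dupes += k in seen
        seen.add(k)
    return QualityReport(len(records), missing, dupes)

records = [
    {"id": "A1", "status": "open", "owner": "ops"},
    {"id": "A1", "status": "open", "owner": "ops"},    # duplicate key
    {"id": "A2", "status": "", "owner": "logistics"},  # missing field
]
report = audit(records, required=("status", "owner"), key="id")
print(f"usable fraction: {report.usable_fraction:.0%}")
```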
The federal government has rigid requirements for tech adoption and data management, especially in mission-critical areas such as the Defense Department. Leaders are working to “untangle the [data] web” to adopt tech solutions faster, said Jacob Glassman, senior technical advisor for the Assistant Secretary of the Navy for Research, Development and Acquisition.
“In terms of the benefits, and why we in the acquisition community are pushing so heavy for AI, is our current acquisition system is painfully slow. … [data management and AI adoption] has now become a combat capability,” said Glassman.
Walsh added that as data discussions continue, they should be grounded in the management requirements needed to train AI models as effectively as possible.
“[Data management is] a requirement for everybody in an agency that interacts with a program and interacts with systems to treat this data as the fuel that powers the agency and its ability to serve customers and employees inside the agency,” said Walsh.
“We need to circle our wagons around those technical people, that institutional knowledge. That is a national defense imperative,” Glassman added.
How Agencies Are Using AI
The Department of the Navy (DON) has a “very good infrastructure to be able to host AI,” Glassman explained. Moving forward, he is focused on overcoming intellectual property challenges when leveraging AI tools.
DON is running DONGPT, hosted in the Flank Speed enterprise environment. To get to implementation, DON conducted alpha trials, a 45-day test period in which users measured where the tool was beneficial in key functional areas like acquisitions and logistics.
“We’re going to start small and scale rapidly,” Glassman said. “[DONGPT] is now in our enterprise system. So we can scale to every single person in the Navy and Marine Corps, if we needed to.”
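A trial like DON’s could, for example, tally user feedback by functional area to see where a tool helps most before scaling it. The sketch below is a hypothetical illustration of that kind of measurement; the categories and scores are assumptions, not the Navy’s actual evaluation instrument.

```python
# Hypothetical tally of alpha-trial feedback by functional area, in the
# spirit of a 45-day evaluation. Categories and scores are illustrative.
from collections import defaultdict

def summarize(feedback: list[tuple[str, int]]) -> dict[str, float]:
    """Average a 1-5 usefulness score per functional area."""
    totals: dict[str, list[int]] = defaultdict(list)
    for area, score in feedback:
        totals[area].append(score)
    return {area: sum(s) / len(s) for area, s in totals.items()}

trial_feedback = [("acquisitions", 4), ("acquisitions", 5),
                  ("logistics", 3), ("logistics", 4)]
for area, avg in summarize(trial_feedback).items():
    print(f"{area}: mean usefulness {avg:.1f}/5")
```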
NIH and the Department of Energy released a new AI use case on May 9 that could lead to early cancer detection. The agencies worked with the supercomputing centers at Oak Ridge National Lab to publish their work on AI extraction of electronic pathology (ePath) reports.
“If you’re doing a diagnosis of cancer, it’s the ePath reports that are the key finding that give you that early diagnosis,” said Susan Gregurick, NIH associate director for data science. “Well, your ePath reports are not connected to your EHR system, so being able to pull those out and link those to your EHR system and send them off to the cancer surveillance programs is really key for that very early diagnosis of cancer.”
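The linkage step Gregurick describes, joining extracted ePath report fields to EHR records so they can be forwarded to surveillance programs, amounts to a keyed join on a patient identifier. The sketch below is a hypothetical illustration, not the published NIH/ORNL pipeline; all identifiers and field names are invented.

```python
# Hypothetical sketch of the linkage step: match extracted pathology
# (ePath) report fields to EHR records by patient ID, then queue matches
# for a surveillance registry. Not the NIH/ORNL pipeline; fields invented.
from dataclasses import dataclass

@dataclass
class EPathReport:
    patient_id: str
    diagnosis: str      # e.g., an extracted histology/site code

@dataclass
class EHRRecord:
    patient_id: str
    mrn: str

def link_reports(reports: list[EPathReport],
                 ehr: dict[str, EHRRecord]) -> list[tuple[EPathReport, EHRRecord]]:
    """Join ePath reports to EHR records on patient_id; unmatched are skipped."""
    return [(r, ehr[r.patient_id]) for r in reports if r.patient_id in ehr]

ehr_index = {"P001": EHRRecord("P001", "MRN-4471")}
reports = [EPathReport("P001", "C50.9 malignant neoplasm of breast"),
           EPathReport("P999", "C34.1 upper lobe, lung")]  # no EHR match

for report, record in link_reports(reports, ehr_index):
    print(f"send {record.mrn}: {report.diagnosis} to surveillance registry")
```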
The Bridge to Artificial Intelligence (Bridge2AI) program is also helping NIH generate large-scale, gold-standard data sets in critical, disease-focused areas. NIH is collaborating with industry partners to provide researchers with cloud computing capabilities for AI.
“It’s scaling out our partnership with DOE, with NSF and with our cloud providers, to really think about how we can enable the next generation of compute for AI,” said Gregurick. “And how we can fund researchers or incentivize in other ways for them to adopt AI and to now think about it in ways that are compatible with our goals in environment and energy.”