
Government Agencies Leverage AI Tools and Large Language Models

Generative AI tools are powerful, officials note, but come with a series of challenges.


Artificial intelligence (AI) and machine-learning technologies are expanding into government applications as agencies begin to use the powerful resources to fulfill their missions.

While large language models (LLMs) such as OpenAI’s ChatGPT and Meta’s Llama can be valuable, agency officials said they are unable to use the models at scale due to security issues.

“The models are so large that bringing those models inside of our secure environment is probably cost prohibitive and also the company has not given us access to the models themselves,” said Lt. Col. Joseph Chapa, U.S. Department of the Air Force Chief Data and AI Officer, during ATARC’s AI Summit Wednesday.

Chapa said that large language models are not currently certified for the agency’s level of data, so there are several restrictions in place.

“Everything we do with the systems has to be [at a] completely unclassified and non-sensitive level,” Chapa said.

But Chapa said the agency is hopeful that smaller open-source models will provide tools that are secure for classified data.

In the meantime, agencies are also working internally to address security issues and create secure environments for their data. Last year, for instance, the U.S. Coast Guard created a cloud and data branch that focuses on building new infrastructure to run AI and LLM applications.

“What I find most exciting right now is we are a little late to the game. But we also didn’t have to learn all the lessons that the early adopters have learned,” said Jonathan White, the Cloud and Data Branch Chief at the Coast Guard, during the summit.

White said large language models are only a slice of the AI pie, and the agency is extremely “AI-curious” because it understands how valuable a resource the technology will be.

“With a limited workforce, it is extremely hard to place people in the right spots all the time; then we have a lot of policy and regulations and rules and everything else that goes along with those missions. Wouldn’t it be great if we could have a tool that can elicit an answer to your question that relates to those policies and regulations?” White explained. “If they’re in the middle of an inspection or middle of warning [and] they need to ask a question, wouldn’t it be great to have a digital assistant to surface the information that they need to conduct that inspection?”

However, experts said they can’t rely entirely on the technology because there must be a human in the loop to avoid errors.

“This is no different than how we conduct business in general. When we create a new project, we have to have reviewers; when you do any kind of research, you have research reviewers,” said David Na, NASA Cloud AI/ML Foundations Lead, during the summit. “We need reviewers for the content that gets generated by these large language models.”

As government agencies find value in large language models, Chapa said that they are aware that challenges may persist. “I know people are using these models on their own — at some point there will be some slips or some leaks,” he noted.

But Chapa said the best use of text generation models is when the basic information is already known, so it is easy to validate the facts.

“It’s going to be tempting for analysts to want to use the tool to gain insights about the world. And I’m not saying we shouldn’t do that. I’m saying we should recognize that if we’re going to use these tools in that capacity, we have to build institutional fact-checking layers on top of that,” Chapa said.

Despite the challenges that large language models may create, agency officials said they are excited about the enhanced search experience, greater accessibility of data and other possibilities AI tools will provide.
