Culture Remains a Hurdle in DOD AI Race
The Defense Department faces an uphill battle as it continues to tackle cultural and trust challenges while moving toward AI adoption and implementation.
Culture remains one of the biggest hurdles to successful implementation of artificial intelligence (AI) technologies at the Defense Department, according to DOD leaders at multiple events this week.
Some DOD officials believe people are becoming more comfortable using the term AI, but understanding of what AI really is, and of its practical applications, still lags behind.
According to Col. Mike Teter, Chief Data Officer and Deputy Director of Digital Superiority and J6 C4/Cyber Directorate at U.S. Space Command, building cultural trust comes through education and training, but AI also changes the paradigm and timelines associated with traditional reporting up the chain.
“When everyone in the chain, from the lowest unit all the way to the president, can see the same thing at the same time and at scale, then it’s really a cultural shift on both sides,” Teter said during ATARC’s Ethical Uses of AI with the Federal Government event.
“You’re not going to have all the answers right away when you see something in the data,” he said. “You don’t get to be subjective in it. So, that’s really where the trust within the AI, within the models and within the data, is. Understanding that piece is really the hardest part of the implementation across Space Command.”
Dr. Nelson Colon, Digital Service Expert at the Defense Digital Service within DOD’s Office of the Chief Digital and Artificial Intelligence Officer (CDAO), said one of the biggest challenges he’s encountered with AI is communicating what his team is building to the general public.
“The way we evaluate performance is often hard for people who aren’t in the field to understand, which creates trust issues, and it takes more effort to educate them,” Dr. Colon said during ATARC’s event. “So how do I explain things to people without compromising the results? One of the things I try is building that trust through knowledge.”
The U.S. Air Force’s Air University is also facing culture challenges when deploying AI.
When there are many disparate data sources, whether structured or unstructured data, it becomes difficult to mesh all of it together, said Air University CIO Jeff Lush during FedInsider’s Military AI: Foundation for Digital Transformation webinar.
“For us, one of the biggest challenges is cultural. With the staff themselves on what does that mean from a data input perspective and how do we get that data clean?” Lush said. “The first step is asking, what is real? We have to draw a line and say so much data is active and so much has been put in archives. We need to focus on active data and do it efficiently.”
Air University recently completed a two-year effort to bring all the college’s data into a centralized structure. Lush said organizations should start with the basic pieces, understand what the data should look like, and then use automation to map the data.
“We need to shave off just a few hours a week to say, this is what we’re going to do to start cleaning up and normalizing our data. This is something you can do today,” Lush said. “Begin to look at where you want to go, have a group everyone buys into, and then start to normalize data slowly but surely.”
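For readers who want a concrete sense of what that kind of incremental mapping and normalizing can look like at a small scale, here is a minimal Python sketch. The source systems, field names and mapping rules are illustrative assumptions, not a description of Air University’s actual data or tooling.

```python
# Minimal sketch: map records from two hypothetical source systems into one
# common schema, the kind of small, repeatable cleanup step described above.
# All field names and mapping rules here are illustrative assumptions.

from datetime import datetime

def normalize_registrar(record: dict) -> dict:
    """Map a record from a hypothetical registrar export to the common schema."""
    return {
        "student_id": record["StudentID"].strip(),
        "course_code": record["Course"].upper(),
        "enrolled_on": datetime.strptime(record["EnrollDate"], "%m/%d/%Y").date(),
    }

def normalize_lms(record: dict) -> dict:
    """Map a record from a hypothetical learning-management export."""
    return {
        "student_id": record["user_id"].strip(),
        "course_code": record["course"].upper(),
        "enrolled_on": datetime.fromisoformat(record["enrolled_at"]).date(),
    }

def merge(sources):
    """Run each source through its mapper and return one clean, uniform list."""
    clean = []
    for mapper, records in sources:
        for rec in records:
            clean.append(mapper(rec))
    return clean

if __name__ == "__main__":
    registrar = [{"StudentID": " A1001 ", "Course": "cyb101", "EnrollDate": "01/15/2024"}]
    lms = [{"user_id": "A1002", "course": "AI200", "enrolled_at": "2024-02-01"}]
    for row in merge([(normalize_registrar, registrar), (normalize_lms, lms)]):
        print(row)
```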
The U.S. Government Accountability Office (GAO) has evaluated AI in a variety of ways over the past few years. GAO’s Defense Capability Management Team recently examined DOD’s strategy for developing AI and the department’s inventory of AI programs.
Jon Ludwigson, Director of the Contracting and National Security Acquisitions Team at GAO, said the team identified nearly 700 AI efforts within DOD and offered suggestions on steps the department should take to improve them.
“One of the things we found is that AI within the department isn’t captured by the traditional sort of program metrics; it’s often not its own thing, it’s often a part of another weapon system’s program, so that was an important finding,” Ludwigson said during the FedInsider webinar. “We think there are some ways they can improve their strategy as it relates to operationalizing AI and getting everyone lined up so that they can do a good job as they pursue AI across the department.”
During the FedInsider webinar, Ludwigson likened AI to having a child: you need to train the child, and data is how an AI system gets trained.
Training AI to look for an artillery piece, for example, is a challenge.
“At some point you have so much data to where you become paralyzed because you can’t do anything with it,” Ludwigson said. “So, AI offers the possibility to offload some of the work from the people to machines that can at least do that first-tier screening function and allow humans to focus on the things that they’re really good at.”
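Ludwigson’s “first-tier screening” idea can be illustrated with a minimal sketch: a model scores each incoming item, the clearly positive and clearly negative cases are handled automatically, and only the ambiguous middle band is routed to a human analyst. The thresholds and scoring function below are hypothetical, not drawn from any DOD system.

```python
# Minimal sketch of first-tier screening: a model's confidence score decides
# what a person actually needs to look at. Thresholds are illustrative only.

def screen(items, score_fn, auto_accept=0.95, auto_reject=0.05):
    """Split items into auto-accepted, auto-rejected, and needs-human-review."""
    accepted, rejected, review = [], [], []
    for item in items:
        score = score_fn(item)  # e.g., probability an artillery piece is present
        if score >= auto_accept:
            accepted.append(item)
        elif score <= auto_reject:
            rejected.append(item)
        else:
            review.append(item)  # humans focus only on the ambiguous cases
    return accepted, rejected, review

if __name__ == "__main__":
    # Stand-in scores for demonstration; a real system would call a trained model.
    fake_scores = {"img_a": 0.98, "img_b": 0.50, "img_c": 0.02}
    acc, rej, rev = screen(fake_scores, fake_scores.get)
    print("auto-accepted:", acc)
    print("auto-rejected:", rej)
    print("needs human review:", rev)
```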