An AI took on a human pilot in a DARPA-sponsored dogfight

A fighter jet flown by artificial intelligence took on a human pilot in a dogfight for the first time last year over California, a major development in the Pentagon’s effort to safely integrate AI into its platforms.

Who came out on top? Officials wouldn’t say. But the AI agents “performed well” in a variety of scenarios throughout the tests, said Lt. Col. Ryan Hefron, the program manager for DARPA’s Air Combat Evolution program, called ACE. 

“We had lots of test objectives that we were trying to achieve in that first round of tests. So asking the question of, I’ll say, who won? It doesn’t necessarily capture the nuance of the testing that we accomplished. But what I will say is that the purpose of the test was really to establish a pathway to demonstrate that we can safely test these AI agents in a safety critical air combat environment,” Hefron told reporters Friday.

DARPA revealed this week that an X-62A VISTA aircraft, an F-16 fighter jet modified to test and train AI software, engaged in a dogfight against a human pilot in another F-16 during a September test at Edwards Air Force Base. Through the ACE program, the agency is developing trusted “combat autonomy” using human-machine collaborative dogfighting. The team flew 21 test flights from December 2022 through September 2023.

An AI agent controlled the F-16 in various offensive and defensive combat sets during the September dogfighting test, but two human pilots remained in the autonomous fighter jet for safety. 

While officials wouldn’t give win-loss ratios due to “national security reasons,” Hefron did detail some of the lessons learned, such as the gaps that still exist between simulated and real-world tests.

Another lesson from the tests is how quickly the AI agents can be developed and updated, said Col. James Valpiani, commandant of the Air Force’s test pilot school.

“We were able to upload the software changes to the aircraft while it was holding short, ready to take off, and even airborne, we’re able to transition between multiple versions of the same AI agent airborne between combat sets, and this is a paradigm change to how software development happens in industry and certainly in the aviation industry today,” Valpiani said. 
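
The quote describes switching between versions of the same AI agent without grounding the jet. As a rough, purely hypothetical sketch of that kind of runtime version swapping (the class and method names below are invented for illustration and are not drawn from the ACE program or the X-62A software stack), one common pattern is to keep several policy versions registered and change which one the control loop calls:

```python
# Hypothetical illustration only: a minimal registry that lets a running control
# loop switch between versions of an "agent" without restarting the process.
# None of these names come from DARPA's ACE program or the X-62A software.

from dataclasses import dataclass
from typing import Callable, Dict, Optional


@dataclass
class AgentVersion:
    """One selectable version of a control policy."""
    name: str
    policy: Callable[[dict], dict]  # maps an observation to a command


class AgentRegistry:
    """Holds multiple agent versions and tracks which one is active."""

    def __init__(self) -> None:
        self._versions: Dict[str, AgentVersion] = {}
        self._active: Optional[str] = None

    def register(self, version: AgentVersion) -> None:
        self._versions[version.name] = version

    def activate(self, name: str) -> None:
        # Switching versions is a single reference swap, so the control
        # loop below never has to stop while the change takes effect.
        self._active = name

    def step(self, observation: dict) -> dict:
        if self._active is None:
            raise RuntimeError("no agent version activated")
        return self._versions[self._active].policy(observation)


if __name__ == "__main__":
    registry = AgentRegistry()
    registry.register(AgentVersion("v1", lambda obs: {"bank_deg": 10}))
    registry.register(AgentVersion("v2", lambda obs: {"bank_deg": 30}))

    registry.activate("v1")
    print(registry.step({"range_m": 900}))  # -> {'bank_deg': 10}

    registry.activate("v2")                 # swap versions mid-run
    print(registry.step({"range_m": 900}))  # -> {'bank_deg': 30}
```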

Employing and trusting AI in dogfighting scenarios is challenging because the environment is inherently dangerous, Valpiani said, and human pilots are taught proven safety and ethical norms for flying.

“Ultimately, when we talk about trust in the ACE program, what we’re talking about is compliance with these well-codified norms. Now we know that the AI agents are going to perform in some ways differently than humans are going to perform. They’re not humans, and we’ve seen that there are some behaviors that are different than what we expect to see. What we ultimately mean in trust, is…does it ultimately achieve the objectives of the dogfighting set while complying with those norms?” Valpiani said. 

This work will feed into the Air Force’s effort to build autonomous drones that will fly alongside manned fighters, called collaborative combat aircraft, or CCAs.

Edwards will test CCAs “in the near term,” Valpiani said, and the same people working on the ACE program will lead those testing efforts. 

Air Force Secretary Frank Kendall recently said he plans to fly in the autonomously piloted plane later this year to personally experience the AI algorithms in action.

Kendall flying in the X-62 is a “strong indication that the work that we’re doing here directly feeds into and helps advance the collaborative combat aircraft effort,” Valpiani said. 

The program plans more demonstrations of autonomous combat maneuvers throughout the year.

Defense One
