An AI-piloted F-16 flew the Air Force’s top leader in a dogfight

Secretary of the Air Force Frank Kendall rode along as the AI-flown F-16 pulled 5G maneuvers during a simulated air-to-air dogfight over Edwards Air Force Base.
Secretary of the Air Force Frank Kendall flies in the X-62 VISTA in the skies above Edwards Air Force Base, California, May 2. The experimental plane was flown by an AI-powered system without input from Kendall or a safety pilot in the rear seat. (Air Force photo by Richard Gonzales)


An Air Force F-16 flown entirely by an AI-powered brain took the service’s top boss — Secretary of the Air Force Frank Kendall — through an air-to-air dogfight against a human-flown jet.

Kendall, 75 and 25 years removed from a career as an active-duty Army and Army Reserve officer, sat in the front seat of the experimental F-16 as it twisted and snaked through maneuvers of up to 5Gs of force during a simulated air-to-air fight over Edwards Air Force Base in California.

The experimental F-16 — dubbed the X-62A VISTA by the Air Force — uses what the Air Force calls “machine learning and live agent integration” to fly and fight. During the hour-long flight, the Air Force said, neither Kendall nor a safety pilot in the X-62’s rear seat touched the plane’s controls. The X-62 and its AI-powered brain are a collaboration between the research division of the U.S. Air Force Test Pilot School and the Defense Advanced Research Projects Agency’s Air Combat Evolution program.

“The potential for autonomous air-to-air combat has been imaginable for decades, but the reality has remained a distant dream up until now,” Kendall said in an Air Force release.

The VISTA project, according to the Air Force, began four years ago with a basic intent to get the F-16 to simulate the flying characteristics of other aircraft. But in the years since, the project has morphed into what the service says is the military’s first AI-flown fighter jet.

Kendall flew with a pilot from the

AI to fly and fight?

There have been other examples of pilotless aircraft, like a joint DARPA and Army program that flew an H-60 helicopter without a pilot aboard in 2022. But the tactics and maneuvers the plane chose to perform in the dogfight, Kendall said, were all decisions made by its AI system without live human inputs.

“AI is really taking the most capable technology you have, putting it together, and using it on problems that previously had to be solved through human decision-making. It’s automation of those decisions and it’s very specific,” Kendall said.

But the arrival of advanced AI in cockpits and in other military systems points to a growing debate in military and policy circles: can AI be trusted to pull a trigger?

Israel has come under fire for incorporating AI into the decision-making and targeting of air strikes in Gaza, using a system reportedly known as Lavender to identify as many as 37,000 Palestinians as targets in the early weeks of that war.

The UN and the International Committee of the Red Cross (ICRC) have released joint statements calling on political leaders to establish international rules on autonomous weapon systems.

“Autonomous weapon systems – generally understood as weapon systems that select targets and apply force without human intervention – pose serious humanitarian, legal, ethical and security concerns,” UN Secretary-General António Guterres and ICRC President Mirjana Spoljaric said in a statement released last October. “Human control must be retained in life and death decisions.” They added that “in the current security landscape, setting clear international red lines will benefit all States.”
