Russia's Air Force Wants To Use Robots As 'Automated Forward Air Controllers'
Today’s smart bombs and artillery are astonishingly lethal. But for all their smartness, they still need a pair of human eyes to tell them where to hit. That means those eyes—whether the forward air controllers (FACs) of Vietnam or the Joint Terminal Attack Controllers (JTACs) of today—have to get close enough to see the enemy. Which means the enemy can see them.
Hence Russia has an idea: have robots do the work and run the risks.
The Russian air force plans to develop automated forward air controllers, according to the newspaper Izvestia. They will include automated controllers on tracked ground robots and airborne controllers on drones.
“During battle, the air controller-robots will operate independently—more exactly, with minimal intervention by operators,” said the Izvestia article, translated by the U.S. Army’s Foreign Military Studies Office in its October OE Watch magazine. “It will be required only when difficult irregular situations arise. Airborne and ground robots will get reconnaissance and guidance equipment. The standard apparatus will be able to detect a target, define its parameters, and relay the coordinates to a command post or to Aerospace Forces airplanes. The equipment will include a laser rangefinder, a high-resolution video camera, a thermal imager, and a navigation system.”
“The robot will determine what kind of target is in front of it—a tank, a machine gun position, an artillery system, or a surface-to-air missile complex,” Izvestia explained. “The artificial intelligence will select the type of weapon with which it is best to destroy the adversary’s equipment and personnel, and will also distinguish our own personnel from theirs.”
The robot will then lase the target with a laser designator for the smart bombs to home in on. A Russian air force general also told Izvestia that the robo-FAC will need to be in direct contact with aircraft, rather than routing target data through a command post.
Russian controllers had a bounty placed on their heads by the mujahideen during the Soviet-Afghan War. In Syria in 2016, a Russian controller on the ground was surrounded by ISIS; he died after he called in an airstrike on his own position to avoid capture.
Historically, the Russian military has never had a particularly strong reputation for preserving the lives of its soldiers. Interestingly, an article in the Russian army journal Armeyskiy Sbornik suggests a concern about human life—and human performance. Putting FACs in harm’s way “threatens the life of the FAC both from the enemy as well as from possible friendly fire, and on the other hand such factors as stress, fatigue, fear, malaise, and others are inherent to a person.”
But can robots really be trusted to call in airstrikes and artillery fire? To some extent, they already do. The Pentagon likes to say that there is always a human in the loop to fire a missile, but the truth is that the degree of automation in missile defense, sensors, and missile-guidance systems is such that humans are already dependent on machines to make the right call.
But the job of a forward air controller, who may be calling in airstrikes within a couple of hundred feet of friendly troops, is particularly difficult. Even in World War II, as many as 21 percent of casualties were from friendly fire, according to one estimate.
Charlie Heidal, a former Air Force Tactical Air Control Party specialist, is dubious about robot forward air controllers. “I think a robot is a waste of effort,” he told the National Interest. “The JTAC is an expert in the employment of airpower and needs to be there at the table when the ground commander is building his plan, be there when the fires officer is building out the fires plan, and needs to be there during the execution of the mission to work past the fog of war. I pretty much doubt that any robot built today could keep up with a Ranger company that is advancing on a target across various terrain, to include going up and over fences, walls, buildings, loose brick, and so on.”
But there is another problem with robot FACs. The mass warfare of World War II is probably over, replaced by insurgencies such as those in Afghanistan and Syria, or even a limited conventional war in a region like the South China Sea or Eastern Europe. Civilians will likely be present, perhaps with guerrillas hiding among them, and the twentieth-century slaughter of civilians as “collateral damage” won't be acceptable in the twenty-first century.
Perhaps a robot can distinguish between a tank and a school bus, or between an enemy tank and a friendly one. But if it can’t, the first time a Russian robot controller calls in an airstrike on friendly troops will be the last time Russian soldiers trust the robots. And as always with people versus machines, there is an element of intuition that tells a human that maybe that group of people over there is a party of farmers on their way to market, and not guerrillas.
We’ll have to see whether machines can make that distinction.
This article originally appeared on The National Interest.