The Rise of A.I. Fighter Pilots
Artificial intelligence is being taught to fly warplanes. Can the technology be trusted?
On a cloudless morning last May, a pilot took off from the Niagara Falls International Airport, heading for restricted military airspace over Lake Ontario. The plane, which bore the insignia of the United States Air Force, was a repurposed Czechoslovak jet, an L-39 Albatros, purchased by a private defense contractor. The bay in front of the cockpit was filled with sensors and computer processors that recorded the aircraft’s performance. For two hours, the pilot flew counterclockwise around the lake. Engineers on the ground, under contract with darpa, the Defense Department’s research agency, had choreographed every turn, every pitch and roll, in an attempt to do something unprecedented: design a plane that can fly and engage in aerial combat—dogfighting—without a human pilot operating it.
The exercise was an early step in the agency’s Air Combat Evolution program, known as ace, one of more than six hundred Department of Defense projects that are incorporating artificial intelligence into war-fighting. This year, the Pentagon plans to spend close to a billion dollars on A.I.-related technology. The Navy is building unmanned vessels that can stay at sea for months; the Army is developing a fleet of robotic combat vehicles. Artificial intelligence is being designed to improve supply logistics, intelligence gathering, and a category of wearable technology, sensors, and auxiliary robots that the military calls the Internet of Battlefield Things.
Algorithms are already good at flying planes. The first autopilot system, which involved connecting a gyroscope to the wings and tail of a plane, débuted in 1914, about a decade after the Wright brothers took flight. And a number of current military technologies, such as underwater mine detectors and laser-guided bombs, are autonomous once they are launched by humans. But few aspects of warfare are as complex as aerial combat. Paul Schifferle, the vice-president of flight research at Calspan, the company that’s modifying the L-39 for darpa, said, “The dogfight is probably the most dynamic flight program in aviation, period.”
A fighter plane equipped with artificial intelligence could eventually execute tighter turns, take greater risks, and get off better shots than human pilots. But the objective of the ace program is to transform a pilot’s role, not to remove it entirely. As darpa envisions it, the A.I. will fly the plane in partnership with the pilot, who will remain “in the loop,” monitoring what the A.I. is doing and intervening when necessary. According to the agency’s Strategic Technology Office, a fighter jet with autonomous features will allow pilots to become “battle managers,” directing squads of unmanned aircraft “like a football coach who chooses team members and then positions them on the field to run plays.”
Stacie Pettyjohn, the director of the Defense Program at the Center for a New American Security, told me that the ace program is part of a wider effort to “decompose our forces” into smaller, less expensive units. In other words, fewer humans and more expendable machines. darpa calls this “mosaic warfare.” In the case of aerial combat, Pettyjohn said, “these much smaller autonomous aircraft can be combined in unexpected ways to overwhelm adversaries with the complexity of it. If any one of them gets shot down, it’s not as big of a deal.”
All told, the L-39 was taken up above Lake Ontario twenty times, each sortie giving the engineers and computer scientists the information they needed to build a model of its flight dynamics under various conditions. Like self-driving cars, autonomous planes use sensors to identify discrepancies between the outside world and the information encoded in their maps. But a dogfighting algorithm will have to take into account both the environment and the aircraft. A plane flies differently at varying altitudes and angles, on hot days versus cold ones, or if it’s carrying an extra fuel tank or missiles.
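The idea of turning recorded sorties into a flight-dynamics model can be sketched in miniature. This is an illustration, not darpa's actual method, and every number in it is invented: each record pairs a flight condition and a stick input with an observed response, and a simple nearest-neighbour lookup stands in for the far more sophisticated physics-based or learned models the program would use.

```python
import math

# Hypothetical telemetry from instrumented sorties: each record maps
# (altitude in meters, speed in m/s, stick deflection) to an observed
# roll rate in degrees per second. All values are invented.
RECORDS = [
    ((3000.0, 150.0, 0.5), 22.0),
    ((3000.0, 250.0, 0.5), 31.0),
    ((6000.0, 150.0, 0.5), 18.5),
    ((6000.0, 250.0, 0.5), 27.0),
]

def predict_roll_rate(altitude_m: float, speed_ms: float, stick: float) -> float:
    """Estimate the roll rate for an unseen flight condition by finding
    the closest recorded condition. A real model would interpolate or
    fit a function; this lookup just shows how recorded sorties become
    a model of 'how does the plane respond here?'."""
    def distance(key):
        alt, spd, defl = key
        # Crude normalization so each axis contributes comparably.
        return math.hypot((alt - altitude_m) / 1000.0,
                          (spd - speed_ms) / 50.0) + abs(defl - stick)

    _, rate = min(RECORDS, key=lambda record: distance(record[0]))
    return rate

# The same stick input is predicted to produce different responses
# at different altitudes and speeds:
print(predict_roll_rate(3200, 160, 0.5))  # nearest record: 22.0 deg/s
print(predict_roll_rate(5800, 240, 0.5))  # nearest record: 27.0 deg/s
```

The point of the sketch is only that the model's answer depends on the whole flight state, which is why every turn, pitch, and roll over Lake Ontario had to be flown and recorded.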
“Most of the time, a plane flies straight and level,” Phil Chu, an electrical engineer who serves as a science adviser to the ace program, explained. “But when it’s dogfighting you have to figure out, O.K., if I’m in a thirty-degree bank angle, ascending at twenty degrees, how much do I have to pull the stick to get to a forty-degree bank angle, rising at ten degrees?” And, because flight is three-dimensional, speed matters even more. “If it’s flying slowly and you move the stick one way, you get a certain amount of turn out of it. If it’s flying really fast and you move the stick the same way, you’ll get a very different response.”
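Chu's point about speed can be made concrete with the standard relation for a coordinated level turn: the turn rate is g·tan(bank)/v, so the same bank angle brings a fast-moving plane around far more slowly than a slow one. The snippet below is a textbook-physics illustration, not anything from the ace program's software.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def turn_rate(bank_deg: float, speed_ms: float) -> float:
    """Turn rate (deg/s) of a coordinated level turn.

    omega = g * tan(bank) / v: for a fixed bank angle, the faster
    the plane flies, the more slowly it comes around.
    """
    omega_rad = G * math.tan(math.radians(bank_deg)) / speed_ms
    return math.degrees(omega_rad)

# The same thirty-degree bank yields very different turn rates:
print(turn_rate(30.0, 80.0))   # roughly 4 deg/s at 80 m/s
print(turn_rate(30.0, 250.0))  # roughly 1.3 deg/s at 250 m/s
```

A dogfighting algorithm has to solve the inverse of this at every instant: given the current bank, pitch, and speed, how much stick produces the desired change.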
In 2024, if the ace program goes according to plan, four A.I.-enabled L-39s will participate in a live dogfight in the skies above Lake Ontario. To achieve that goal, darpa has enlisted three dozen academic research centers and private companies, each working on one of two problem areas: how to get the plane to fly and fight on its own, and how to get pilots to trust the A.I. enough to use it. Robert Work, who was the Deputy Secretary of Defense during the Obama Administration, and pushed the Pentagon to pursue next-generation technologies, told me, “If you don’t have trust, the human will always be watching the A.I. and saying, ‘Oh, I’ve got to take over.’ ”
There is no guarantee that ace will succeed. darpa projects are time-limited experiments, typically lasting between three and five years. Schifferle, at Calspan, told me, “We’re at the ‘walk’ stage of a typical ‘crawl, walk, run’ technology maturation process.” Still, it seems increasingly likely that young pilots will one day wonder how their fighter jet acquired the skills of a Chuck Yeager. When they do, they will be told about a refurbished Soviet-era warplane that was flown high above Lake Ontario by old-school pilots who were, in a way, writing their own obituaries.
As part of the effort to devise an algorithm that can dogfight, darpa selected eight software-development companies to participate in the AlphaDogfight Trials, an A.I. competition that culminated with three days of public scrimmages in August, 2020. The prize was a flight helmet worn by Colonel Dan (Animal) Javorsek, who was in charge of the program until he returned to the Air Force last year. The contest was supposed to be held in front of a live audience near Nellis Air Force Base, in Nevada, but the pandemic relegated the action to an online event, hosted by the Applied Physics Lab at Johns Hopkins, and broadcast via a YouTube channel called darpatv. Justin (Glock) Mock, an F-16 pilot, offered play-by-play commentary. At one point, he told the five thousand or so viewers that the objective was simple: “Kill and survive.”