Col. John Boyd, an Air Force fighter pilot and military theorist, put forward the key concept of aerial combat known as the Observation, Orientation, Decision, Action (OODA) loop. In this theory, the key adversary is not the opposing pilot's skill or his aircraft's capabilities, but time.
He proposed that a pilot gains the advantage in aerial combat by executing the four stages of the loop more quickly than his opponent:
- Observation: collecting situational data
- Orientation: analyzing and synthesizing that data to form an accurate perception of the engagement
- Decision: determining a course of action based on that perception
- Action: physically executing the determined course of action
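The four stages above amount to a timed control loop. The following Python sketch is purely illustrative — the function names and toy "pilots" are invented for the example, not drawn from Boyd's work:

```python
import time

def ooda_cycle(observe, orient, decide, act):
    """Run one pass through the OODA loop and return the elapsed time.

    observe() gathers raw situational data; orient() turns it into a
    picture of the engagement; decide() picks a course of action from
    that picture; act() executes it. Boyd's insight is that the side
    with the shorter cycle time forces the other to react to a picture
    that is already stale.
    """
    start = time.monotonic()
    data = observe()            # Observation: collect situational data
    picture = orient(data)      # Orientation: analyze and synthesize
    action = decide(picture)    # Decision: choose a course of action
    act(action)                 # Action: execute it
    return time.monotonic() - start

# Toy usage: two "pilots" whose loops differ only in observation speed.
fast = ooda_cycle(lambda: "radar", lambda d: d, lambda p: "turn", lambda a: None)
slow = ooda_cycle(lambda: (time.sleep(0.01), "radar")[1],
                  lambda d: d, lambda p: "turn", lambda a: None)
assert fast < slow  # the quicker loop "gets inside" the slower one
```

The point of measuring the whole cycle, rather than any one stage, is that an advantage anywhere in the loop — sensing, analysis, decision or execution — shortens the total.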
“Time is the dominant parameter. The pilot who goes through the OODA cycle in the shortest time prevails because his opponent is caught responding to situations that have already changed,” Boyd said.
His analysis of aerial combat between the F-86 Sabre and the MiG-15 during the Korean War concluded that the Sabre’s 10-to-1 kill ratio was primarily achieved by two factors: the combat experience U.S. pilots gained during WWII allowed them to complete the first three components of the OODA loop more quickly and accurately, and the hydraulically assisted controls of the F-86 allowed the pilot to initiate the decided-upon maneuver more quickly than his adversary in the MiG.
OODA loop optimization can be seen in the later Lightweight Fighter Program, which culminated in the F-16, whose computer-assisted control surfaces allow quicker initiation of maneuvers, and in the integrated data systems of the F-22 and F-35, which speed up the first part of the loop and share that information across a joint force in multiple domains.
The command and control (C2C) challenge for the Air Force of the future is how to utilize, across multiple domains, the extremely large and ever-increasing data sets produced by the exponential growth of ISR data-acquisition technology in air and space imaging, cyberspace activity and signals intelligence.
It is a concept termed “Fusion Warfare” by Lt. Gen. VeraLinn “Dash” Jamieson, now the Deputy Chief of Staff for Intelligence, Surveillance and Reconnaissance, Headquarters U.S. Air Force, Washington, D.C., who introduced the term when she spoke before the Air Force Association in 2015 as the senior intelligence officer at Air Combat Command.
Jamieson described Fusion Warfare as “an asymmetric decision advantage, integrating and synchronizing multi-source, multi-domain information in a specific time and space,” to benefit the command and control decision-making loop of tactical, operational and strategic leaders.
“Fifth-generation technology is going to increase the amount of data available – it’s going to produce a hub of data that’s going to flood us, known as ‘Big Data.’ Fusion warfare provides an opportunity for a much faster pace via air, space and cyber multi-domain operations,” Jamieson said.
Fusion Warfare is a concept that Chief of Staff of the Air Force Gen. David Goldfein embraces as a way for Airmen to quickly adapt and respond to evolving operations as outlined in the Air Force Future Operating Concept for 2030.
“We create volumes of data,” said Goldfein. “The question is how do we take the data and turn it into decision speed, and create effects? We do multi-domain operations, but we don’t achieve the speed I think we need.
“When we take a look at future conflict and the attributes we have to prepare for, I would offer that speed of warfare, speed of decision making is going to be key to success for the future … As a joint force and team, the military needs to focus on collecting enough information so commanders can make decisions and move forces at a pace the enemy can never keep up with.”
Information is the next great “contested domain,” according to Morley Stone, chief technology officer of the Air Force Research Laboratory (AFRL).
“Data is going to be the coin of the realm of the future. And he who knows how to control the data, exploit the data the fastest, is going to win,” Stone said. “Looking out into the future, you can almost guarantee that warfare will be completely different, not only because of the speed at which it will be taking place, but the platforms on which he’ll be executing it will look vastly different.
“The fighter being the primary, pointy end of the spear … may end up being a secondary or tertiary end of that spear; meaning that the battlefield of that future is going to be so dominated by non-kinetic effects, so dominated by things like informational warfare. The very nature of warfare, I think, is going to be almost unrecognizable to us 70 years from now.”
The amount of data made available by advances in ISR operations and technology is getting larger and faster as illustrated by the history, trends and future of imaging.
The earliest examples of aerial photoreconnaissance were drawings made of enemy positions by observers lifted above the battlefield in hot air balloons. By WWI, the invention of airplanes and improvements in film cameras enabled the daily imaging of the Western Front, supplying commanders with more timely and accurate information on enemy positions and movement.
Imaging recorded on film by advanced aircraft, such as the U-2 and the SR-71, dominated the Vietnam War era, but beginning in the 1970s, electro-optical camera technology and data communications advanced to the point where imagery is now collected by air and space platforms and the transfer of that raw visual imagery is nearly instantaneous.
Daniel LeMaster, technical advisor for the Electro-optical Target Detection and Surveillance Branch at AFRL, points out that the advance in digital camera technology from 1957 (the first digitally scanned image) until today follows an exponential trend.
“All the change that we’ve seen from 1957 until now, where we’ve gone from something that was just barely recognizable as modern technology to some really amazing capabilities, we can expect that much change again by the time my career ends,” he said.
LeMaster’s exponential trend predicts that the total accumulation of advances in imaging technology will double 13 more times by 2087. As an example of technology’s pace, he produces a consumer-grade infrared camera no bigger than a quarter, which can be purchased online for about $300. It includes technology that even the U.S. military did not possess several decades ago.
“The technology is going to keep getting better faster, and it’s going to keep getting cheaper faster,” LeMaster said.
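LeMaster's projection is easy to sanity-check with a few lines of arithmetic. Assuming the 13 remaining doublings span roughly 2017 to 2087 (the baseline year is an assumption for this sketch), the implied pace is:

```python
# LeMaster's projection: 13 more doublings in imaging technology by 2087.
# Assumed baseline year of 2017 (roughly when the projection was made).
span_years = 2087 - 2017
doublings = 13

doubling_period = span_years / doublings   # years per doubling
cumulative_factor = 2 ** doublings         # total improvement factor

print(round(doubling_period, 1))           # → 5.4
print(cumulative_factor)                   # → 8192
```

A doubling roughly every five and a half years, compounding to an 8,192-fold improvement — which is why a quarter-sized $300 infrared camera today outperforms military hardware of a few decades ago.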
The coming explosion of imagery data will be supplied by AFRL’s research into hyperspectral imaging, laser radar, polarization imaging, laser vibrometry and hypertemporal imaging.
“In multispectral or hyperspectral imaging, what you’re fundamentally looking at is not the shape of the object, but rather what is the content of the light in terms of color,” said LeMaster. “From this color fingerprint, you can identify the target material, whether that’s a tank or concrete or canopy.
“Laser radar brings not only a conventional two-dimensional image that we’re used to seeing, but actually three-dimensional details of what you’re looking at and even possibly the ability to look through some things,” he added. “In polarization imaging, it’s not shape or color, but the orientation of the electromagnetic fields that make up the light as well.”
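The “color fingerprint” LeMaster describes can be illustrated with a common hyperspectral technique, spectral-angle matching: each pixel is a vector of reflectance values across many bands, and it is classified by the reference material whose spectrum points in the most similar direction. The band count and reference spectra below are invented for the sketch, not real material signatures:

```python
import math

def spectral_angle(a, b):
    """Angle (radians) between two spectra; smaller means more alike."""
    dot = sum(x * y for x, y in zip(a, b))
    mag = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return math.acos(max(-1.0, min(1.0, dot / mag)))

def classify(pixel, library):
    """Return the library material whose fingerprint best matches the pixel."""
    return min(library, key=lambda name: spectral_angle(pixel, library[name]))

# Hypothetical 5-band reflectance fingerprints (made up for illustration).
library = {
    "tank paint": [0.10, 0.12, 0.15, 0.40, 0.45],
    "concrete":   [0.30, 0.32, 0.33, 0.35, 0.36],
    "canopy":     [0.05, 0.20, 0.08, 0.55, 0.30],
}

pixel = [0.11, 0.13, 0.16, 0.38, 0.44]   # noisy observation of one pixel
print(classify(pixel, library))          # → tank paint
```

Because the angle compares the shape of the spectrum rather than its brightness, the match survives changes in illumination — the reason spectral methods can separate a tank from concrete or canopy even when all three look alike in a conventional image.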
Hypertemporal imaging and laser vibrometry are frontier technologies that will use electro-optical sensors to measure small fluctuations in light, including those caused by sound. A microphone works by converting the vibrations of an internal diaphragm, caused by sound waves, into an electrical signal, which can be recorded and then reconstituted into audible sound. Laser vibrometry accomplishes a similar result by recording the slight movements of an object caused by vibrations, even movements as small as thousandths of a pixel.
“Laser vibrometry and hypertemporal imaging are two separate things but related,” LeMaster said. “The idea is that by capturing images at really high frame rates, much faster than any slow motion video you’ve ever seen, you can detect small changes in the fluctuation of light levels.”
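The principle LeMaster describes — recovering a vibration from tiny frame-to-frame light fluctuations — can be sketched in a few lines. The frame rate, tone and flicker amplitude below are invented for the illustration; real systems must dig a far weaker signal out of noise:

```python
import numpy as np

# Simulate a high-frame-rate sensor watching an object that vibrates at
# 440 Hz under a sound wave, modulating scene brightness very slightly.
fps = 4000                       # frames per second (hypertemporal rate)
t = np.arange(fps) / fps         # one second of frames
tone_hz = 440
base = 100.0                     # steady scene brightness
frames = base + 0.01 * np.sin(2 * np.pi * tone_hz * t)  # tiny flicker

# "Vibrometry" step: remove the steady component, then locate the
# dominant frequency in the residual fluctuations with an FFT.
signal = frames - frames.mean()
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fps)
recovered = freqs[spectrum.argmax()]
print(recovered)                 # → 440.0
```

The flicker here is one part in ten thousand of the scene brightness, yet the frequency falls straight out of the spectrum — the same idea, at far greater sensitivity, behind recovering speech from the surface of an object.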
It is a phenomenon demonstrated by an MIT graduate student during a TED talk, where silent video of a bag of potato chips, recorded through soundproof glass while the student spoke nearby, was used to extract the sound of him reciting “Mary Had a Little Lamb.”
These are just some of the technologies that, combined with an exponential increase in data gathered from space, cyberspace and signals monitoring, will require a corresponding increase in the processing and storage of “Big Data” to make it all an effective component of the C2C loop.
“Analysis of Big Data is fundamentally what Google does when you put in a search query. Not only does it analyze what you’re asking for, but it has curated a bunch of unstructured information, made it ready and available and searchable in a way that it can find the patterns you need,” LeMaster said.
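The pattern LeMaster alludes to — curating unstructured data ahead of time so queries are fast — is the idea behind an inverted index, the simplest building block of search. This is a generic sketch with an invented toy corpus, not anything specific to Google or the Air Force:

```python
from collections import defaultdict

def build_index(docs):
    """Map each word to the set of document ids that contain it.

    The expensive curation happens once, up front; queries then touch
    only the index, never the raw documents.
    """
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

def search(index, query):
    """Return ids of documents containing every word in the query."""
    sets = [index.get(w, set()) for w in query.lower().split()]
    return set.intersection(*sets) if sets else set()

# Toy corpus of "reports" (contents invented for illustration).
docs = {
    1: "convoy observed moving north",
    2: "convoy halted at bridge",
    3: "bridge damaged overnight",
}
index = build_index(docs)
print(search(index, "convoy bridge"))   # → {2}
```

The design choice is the one LeMaster names: pay the organization cost before the question is asked, so that at query time finding the pattern is an intersection of precomputed sets rather than a scan of everything collected.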
LeMaster believes machine learning will advance to the point where algorithms are not only trained to perform a task, but, as they run, learn from their mistakes and their successes and improve.
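The behavior LeMaster anticipates — an algorithm that keeps correcting itself as it runs — is the core of online learning. A minimal, generic example is the perceptron update rule, which changes its weights only when it makes a mistake; the data stream here is invented for the sketch:

```python
def perceptron_step(weights, bias, x, label):
    """One online-learning step: predict, then update only on a mistake.

    label is +1 or -1. Weights move toward examples the model got
    wrong and are left alone when the prediction was already correct.
    """
    score = sum(w * xi for w, xi in zip(weights, x)) + bias
    pred = 1 if score >= 0 else -1
    if pred != label:                      # learn from the mistake
        weights = [w + label * xi for w, xi in zip(weights, x)]
        bias = bias + label
    return weights, bias, pred

# Stream of examples: points with positive x-coordinate are labeled +1.
stream = [([2.0, 1.0], 1), ([-1.5, 0.5], -1), ([3.0, -1.0], 1),
          ([-2.0, -1.0], -1), ([1.0, 2.0], 1)]

weights, bias = [0.0, 0.0], 0.0
mistakes = 0
for x, label in stream:
    weights, bias, pred = perceptron_step(weights, bias, x, label)
    mistakes += (pred != label)
print(mistakes, weights)
```

Each error nudges the decision boundary, so later examples that resemble past mistakes are handled correctly — learning woven into operation rather than confined to a training phase.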
“So processes like hyperspectral imaging and hypertemporal imaging generate big data,” said Dr. Wellesley Pereira, a senior research physical scientist with the Air Force Research Lab’s Space Vehicles Directorate at Kirtland AFB, New Mexico. “What we want is to invest into our ability to handle big data and display this big data for the benefit of the Airmen.”
Pereira believes there is little doubt that the process of making these extremely large data sets actionable by Airmen will need advanced computers and algorithms to reveal patterns, trends, and associations, especially relating to human behavior and interactions.
Developing Big Data processing and establishing a uniform methodology and technology across domains and military branches is a key priority for building a multi-domain command and control structure that will make the U.S. military and its partners so omnipresent that an adversary’s OODA loop becomes impossible to execute.
Goldfein’s priority of multi-domain C2C, supported by advanced Big Data acquisition, analysis, synthesis and near-instantaneous utilization by the integrated total force, is a keystone of the future of warfighting.
“We have to be able to create so many dilemmas for the adversary that in and of itself it becomes deterrence in the 21st century,” he said.