by Eric Schafer, Simulation Architect at Iris Automation
Although the term Unmanned Aerial Vehicle (UAV) is used almost interchangeably with “drone,” it doesn’t necessarily mean there are no people on board; it just means there is no pilot on board. Autonomous UAVs, which don’t need to be remotely controlled from the ground either, need to be able to Sense and Avoid other flying things. Some aircraft are cooperative and broadcast helpful information like their location, but most airborne objects offer no such help. Whether it’s a news helicopter or a hot air balloon, a passenger jet or a recreational drone, Sense and Avoid (SAA) technology helps an autonomous UAV get out of the way.
Sense and Avoid in Human Pilots
The primary sense a human pilot uses to Sense other aircraft is sight. Since clear vision is limited to the narrow fovea at the center of the human eye, looking for aircraft means systematically scanning the horizon whenever eyes are not busy inside the cockpit. Because eyes are so limited, other sensors and systems have been developed over the years (RADAR, ADS-B, TCAS, ACAS, UTM, Remote ID) to help cooperative aircraft avoid each other. Most aircraft don’t have any of these life-saving options: too expensive, too power-hungry, too heavy, too new. Whatever the reason an aircraft is non-cooperative, the old joke that “the great thing about standards is that there are so many to choose from” has a serious punchline: eyes are always required equipment.
Visual Flight Rules (VFR) are the regulations human pilots use to See and Avoid other aircraft without relying on instruments. The weather needs to be clear enough to see far away. The pilot needs to look out the window for other aircraft at least 75% of the time, scanning the horizon in 10° increments. Under VFR, the pilot, rather than Air Traffic Control (ATC), is responsible for maintaining separation from other aircraft.
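As a rough sketch of what that scanning discipline implies, the snippet below (a hypothetical Python illustration, not code from any actual avionics system) divides a 180° field of regard into the 10° sectors recommended for a human pilot's visual scan:

```python
# Hypothetical illustration: divide the horizon into the ~10-degree
# sectors recommended for a human pilot's systematic visual scan.
def scan_sectors(field_of_regard_deg=180, sector_deg=10):
    """Return the center bearing (degrees off the nose) of each scan
    sector, sweeping left to right across the field of regard."""
    start = -field_of_regard_deg / 2
    count = int(field_of_regard_deg // sector_deg)
    return [start + sector_deg * (i + 0.5) for i in range(count)]

# An 18-sector sweep from -85 degrees to +85 degrees off the nose.
sectors = scan_sectors()
```

A camera-based system sidesteps this serial sweep entirely: with enough cameras, every "sector" is watched simultaneously.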
Sense and Avoid in UAVs
Do Visual Flight Rules apply to autonomous UAVs? Clear enough weather makes sense, but scanning the horizon 75% of the time (ish) in 10° (ish) increments? We can do better than that.
A symptom of how limited human eyes are as sensors is that, according to NTSB statistics, mid-air collisions are more likely to happen between pilots flying VFR, even in clear weather and broad daylight. Excellent visibility doesn’t cancel out the fact that human pilots are busy with other tasks and have eyes with a narrow field of view, looking out a window at only a slice of the world around them.
In contrast, a visual Sense and Avoid system, such as Iris’ Casia 360, allows a UAV to look in all directions all the time. Visual coverage is limited only by the number of cameras and computing power that the UAV can accommodate. Unlike small recreational UAVs, large commercial fixed-wing drones have many more options for increasing size, weight, and power (SWaP). A wide field of view, no distractions, no fatigue, fast reaction time—what could be better?
More sensors could be better. Although the specs of a visual Sense and Avoid system have advantages over human eyes, the human brain is still better at discriminating between real aircraft and “false positives,” especially in novel situations the computer vision model may not have seen before: is that a flock of geese or the Blue Angels? Additional sensors in a Sense and Avoid system help narrow down what is or isn’t a real threat, and additional risk mitigation is always a good thing, when practical.
Automatic Dependent Surveillance-Broadcast (ADS-B) is a radio signal an aircraft can transmit, identifying itself and reporting its location, heading, and speed. Anyone with an ADS-B receiver can see those cooperative aircraft on a map, and an autonomous UAV can make an immediate decision about whether an avoidance maneuver is necessary. Iris’ Casia includes ADS-B for extra safety, but ADS-B is insufficient as a standalone Sense and Avoid solution: although the FAA mandated ADS-B Out in much of controlled airspace by 2020, many aircraft still fly without transmitters. But thank you to those who have installed it for making the skies safer.
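That "immediate decision" usually boils down to projecting both trajectories forward and checking the closest point of approach (CPA). The sketch below is a simplified flat-earth illustration in Python; the 500 m miss-distance threshold is an arbitrary placeholder, not a real regulatory minimum:

```python
import math

def time_and_distance_of_cpa(own_pos, own_vel, tgt_pos, tgt_vel):
    """Closest point of approach for two constant-velocity 2D tracks.
    Positions in meters, velocities in m/s, in a local flat-earth frame."""
    rx, ry = tgt_pos[0] - own_pos[0], tgt_pos[1] - own_pos[1]  # relative position
    vx, vy = tgt_vel[0] - own_vel[0], tgt_vel[1] - own_vel[1]  # relative velocity
    v2 = vx * vx + vy * vy
    # Time at which separation is minimized (clamped: the past doesn't count).
    t = 0.0 if v2 == 0 else max(0.0, -(rx * vx + ry * vy) / v2)
    dx, dy = rx + vx * t, ry + vy * t
    return t, math.hypot(dx, dy)

# Head-on example: intruder 2 km ahead, both aircraft doing 50 m/s.
t, d = time_and_distance_of_cpa((0, 0), (50, 0), (2000, 0), (-50, 0))
# Maneuver if miss distance falls below a (hypothetical) 500 m threshold.
should_avoid = d < 500.0
```

In the head-on example the tracks meet after 20 seconds with zero miss distance, so an avoidance maneuver is clearly warranted.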
Active vs Passive Sense and Avoid
Computer Vision and ADS-B receivers are examples of passive sensors; they observe the light or signals already in the world and use that information. Active sensors, on the other hand, send out energy in hopes of sensing something in return.
RADAR is an active sensor that sends radio pulses and listens for reflections. RADAR on large aircraft allows them to sense more distant aircraft (or birds) even through fog and clouds, which is especially important for giving fast aircraft more time to react. Long-range RADAR uses too much power for small to mid-sized UAVs, but on planes (autonomous or not), RADAR can provide a valuable impression of what’s ahead in the distance.
More sophisticated Sense and Avoid systems, also called Detect, Sense, and Avoid (DSA) systems, have been adopted over the years. The Traffic Alert and Collision Avoidance System (TCAS) alerts pilots of transponder-equipped aircraft to other transponder-equipped aircraft nearby, particularly those flying near the same altitude. TCAS, an implementation of the Airborne Collision Avoidance System (ACAS) concept, verbally instructs the pilot to climb or descend when a near mid-air collision (NMAC) is imminent. Commercial airliners use TCAS as a last resort in case ATC-directed separation fails.
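The essence of a vertical resolution advisory can be sketched in a few lines. This toy Python function is only loosely inspired by TCAS II (real TCAS logic is far more elaborate, including coordinated advisories between aircraft); the 600 ft threshold and function name are illustrative assumptions:

```python
# Toy vertical-advisory logic, loosely inspired by TCAS II.
# Thresholds and names are illustrative assumptions, not real TCAS values.
def resolution_advisory(own_alt_ft, own_vs_fpm, tgt_alt_ft, tgt_vs_fpm,
                        tau_s, threshold_ft=600):
    """Advise CLIMB, DESCEND, or CLEAR given both aircraft's altitudes
    (ft), vertical speeds (ft/min), and time to closest approach (s)."""
    # Project both altitudes forward to the encounter time.
    own_proj = own_alt_ft + own_vs_fpm * tau_s / 60.0
    tgt_proj = tgt_alt_ft + tgt_vs_fpm * tau_s / 60.0
    sep = own_proj - tgt_proj
    if abs(sep) >= threshold_ft:
        return "CLEAR"  # projected vertical separation is adequate
    # Otherwise, strengthen whichever side of the intruder we are on.
    return "CLIMB" if sep >= 0 else "DESCEND"

# Intruder 100 ft above us but descending at 500 ft/min, 30 s to CPA:
advisory = resolution_advisory(10000, 0, 10100, -500, 30)  # -> "CLIMB"
```

Note the key idea: the advisory is chosen from *projected* separation at the encounter, not current separation, which is why the descending intruder above triggers a climb even though it is currently higher.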
UAVs don’t use TCAS, or verbal Air Traffic Control (ATC) for that matter. But an Air Traffic Management (ATM) ecosystem that is gaining acceptance for UAVs is Unmanned Aircraft System Traffic Management (UTM). UTM is a collaboration between NASA, the FAA, and the growing UAV industry to autonomously coordinate UAV flights and avoid potential collisions. One goal is to take the burden of deconfliction off human Air Traffic Controllers while still giving them situational awareness of UAVs sharing the National Airspace System (NAS) with the manned aircraft they advise. This is accomplished through an automated network of data services that relay flight plans and airspace constraints between UAVs and their operators.
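Strategic deconfliction of shared flight plans, the heart of those UTM data services, can be illustrated with a simple check: sample two planned trajectories over their overlapping time window and flag any moment they come within a separation minimum. Everything here (the 100 m minimum, the plan format) is a hypothetical simplification in Python, not an actual UTM interface:

```python
# Hypothetical strategic-deconfliction check in the spirit of UTM:
# operators share planned trajectories, and a service flags any pair
# that comes too close at the same time.
def position_at(plan, t):
    """Linearly interpolate (x, y) meters along a [(t, x, y), ...] plan."""
    for (t0, x0, y0), (t1, x1, y1) in zip(plan, plan[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))
    return None  # aircraft not airborne at time t

def conflicts(plan_a, plan_b, sep_m=100.0, step_s=1.0):
    """Return the sampled times at which two plans violate the
    (hypothetical) 100 m separation minimum."""
    t0 = max(plan_a[0][0], plan_b[0][0])   # start of overlapping window
    t1 = min(plan_a[-1][0], plan_b[-1][0]) # end of overlapping window
    bad, t = [], t0
    while t <= t1:
        (ax, ay) = position_at(plan_a, t)
        (bx, by) = position_at(plan_b, t)
        if ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 < sep_m:
            bad.append(t)
        t += step_s
    return bad

# Two drones crossing the same point at the same time -> conflict at t ~ 50 s.
a = [(0, 0, 0), (100, 1000, 0)]        # west-to-east at 10 m/s
b = [(0, 500, -500), (100, 500, 500)]  # south-to-north at 10 m/s
```

In a real UTM exchange the conflicting operator would be asked to amend the plan before takeoff, which is exactly the deconfliction burden being lifted from human controllers.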
Many other active Sense and Avoid systems exist or are under development, similar in concept to those described above but differing in the details. They all share the goal of reducing the risk of collisions.
The primary benefit of a Sense and Avoid system onboard an autonomous UAV is that it can be trusted to fly Beyond Visual Line of Sight (BVLOS). More than one type of sensor may be necessary to gain sufficient confidence in BVLOS operation, such as cameras plus ADS-B on a fixed-wing drone, or UTM plus RADAR on an air taxi. But the fact that these sophisticated systems are now possible on autonomous UAVs means we are about to witness a surge of pent-up demand from industries wanting to fly long distances autonomously and cost-effectively:
- Urban Air Mobility
- Package delivery
- Emergency medical services
- Search and rescue
- Power line inspection
- Railroad and pipeline inspection
- Wildlife management
as well as new industries we haven’t yet imagined. Exciting times.
References:
1. 2018 NTSB US Civil Aviation Accident Statistics (data updated September 2, 2020). https://www.ntsb.gov/investigations/data/Pages/aviation_stats.aspx
2. Visual Scanning & Collision Avoidance. https://www.cfinotebook.net/notebook/aircraft-operations/visual-scanning-and-collision-avoidance
3. DSA Lit Review, DOT/FAA/AR-08/41. http://www.tc.faa.gov/its/worldpac/techrpt/ar0841.pdf
4. Unmanned Aircraft System Traffic Management (UTM), FAA. https://www.faa.gov/uas/research_development/traffic_management/
Figure 1. Envision the Future: Getting From A to B