Drone Sensors: The Different Types You Should Know About

Jul 24, 2020

Fundamentally, a sensor requires information to pass from the thing you're trying to detect to your detector. When you choose a sensor, you're choosing the physical phenomenon you're attempting to capture; in essence, you're choosing the modality of the information transported. Sensors fall into two main types, active and passive, each with significant advantages and drawbacks. More exotic sensing modes exist, but the modalities below are the main ones relevant to Detect and Avoid (DAA) on drones.

Active vs passive sensors

Active sensing modalities spend energy projecting some form of wave into the environment and measuring its reflection. That energy expenditure comes at a cost: the devices consume more power and are generally larger, because they need hardware both for sensing and for projecting. In exchange, they can operate where passive information is absent, sparse, or noisy, such as at night or underwater.

Passive sensors capture information already present in the environment. For example, they can use light from the sun, sound waves reflecting off of objects, or sound generated by airborne objects, i.e. engine noise. Because they don't need the hardware or energy expenditure required for projecting, they are typically lighter-weight solutions.


Electro-optical (cameras)

A camera captures information from the aircraft to be detected in the form of light on its image sensor. It uses information freely available in the environment; that information just needs to be captured to detect or sense the object. Even airborne objects that generate no noise, such as parachutists and hot air balloons, can be seen with an image sensor, since everything reflects light.


Pros:

  • Extremely low Cost, Size, Weight, and Power (C-SWaP)
  • Leverages economies of scale: as camera technology improves, so does the sensor
  • Ubiquitous UAS sensor (most UAS are already equipped with a camera)
  • Ability to classify intruders and thus determine their likely speed and direction
  • Mimics the human eye, the current standard for aviation: private pilots without Instrument Flight Rules (IFR) training are the largest proportion of pilots in the US


Cons:

  • Visual Flight Rules (VFR) ability only
  • Shorter detection ranges
  • A lot of data is captured at once, and sophisticated processing is required to extract the useful information
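To make the "shorter detection ranges" point concrete, here is a rough pinhole-camera sketch of how few pixels an intruder aircraft occupies at distance. All numbers are illustrative assumptions, not the specification of any particular camera or product:

```python
def intruder_pixels(wingspan_m: float, distance_m: float, focal_px: float) -> float:
    """Approximate on-sensor width, in pixels, of an intruder of a given
    wingspan at a given distance, for a camera whose focal length is
    expressed in pixels (small-angle pinhole-camera approximation)."""
    return focal_px * wingspan_m / distance_m

# Illustrative example: an ~11 m wingspan light aircraft at 2 km,
# with an assumed focal length of 3000 px.
px = intruder_pixels(11.0, 2000.0, 3000.0)
print(round(px, 1))  # 16.5 pixels wide
```

A target spanning only a handful of pixels is why sophisticated processing is needed to pull detections out of the image.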


Acoustic

Acoustic sensing solutions use an array of microphones to detect the sound from other aircraft and determine its direction and distance. Unlike an image sensor, an acoustic system can't detect anything without an engine, such as parachutists and hot air balloons.


Pros:

  • Low C-SWaP


Cons:

  • Directional accuracy only
  • Ownship noise (the noise of the drone the system is mounted on can interfere with its ability to detect) and other interference and clutter issues
  • Short detection range
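As a sketch of how a microphone array turns sound into direction, here is a minimal far-field time-difference-of-arrival (TDOA) calculation for a single two-microphone pair. The values and the simple two-mic geometry are illustrative assumptions, not any particular product's method:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C (assumed)

def bearing_from_tdoa(delay_s: float, mic_spacing_m: float,
                      c: float = SPEED_OF_SOUND) -> float:
    """Estimate the bearing (radians from broadside) of a distant sound
    source from the arrival-time difference between two microphones,
    under the far-field plane-wave assumption."""
    # Path-length difference between the mics is c * delay;
    # sin(theta) = path difference / spacing.
    s = max(-1.0, min(1.0, c * delay_s / mic_spacing_m))
    return math.asin(s)

# Illustrative example: 0.5 m spacing, 0.7 ms delay.
theta = bearing_from_tdoa(0.0007, 0.5)
print(round(math.degrees(theta), 1))  # ~28.7 degrees off broadside
```

Real arrays use many microphone pairs to resolve direction in 3D, which is also why "directional accuracy only" appears above: a single array gives angle far more readily than range.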



Sonar

Sonar is used to detect objects underwater by projecting sound into the environment and sensing the energy reflected when those sound waves come into contact with an object. For example, submarines emit a "ping" and use an array of microphones to determine the size and location of nearby objects.
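The round-trip timing behind an active ping can be sketched in a few lines. This is the textbook time-of-flight calculation, with an assumed nominal sound speed in seawater:

```python
SPEED_OF_SOUND_WATER = 1500.0  # m/s, a typical nominal value for seawater

def range_from_echo(round_trip_s: float, c: float = SPEED_OF_SOUND_WATER) -> float:
    """Range to a target from the round-trip time of an active ping.
    Divide by two because the sound travels out to the target and back."""
    return c * round_trip_s / 2.0

print(range_from_echo(2.0))  # 1500.0 m: a 2 s echo puts the target 1.5 km away
```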


Lidar

Lidar uses light by projecting lasers into the environment. The beam travels out of the system, reflects off an object, and is captured by the sensor. Self-driving vehicles typically use lidar for navigation.


Pros:

  • High resolution at close range


Cons:

  • Requires high power to detect at long ranges, such as being able to avoid Near Mid Air Collision (NMAC) through DAA
  • The lasers have the potential to hit the eyes of pilots in manned aircraft
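The "high power at long range" limitation follows from range-equation intuition: for a small target, the returned signal falls off roughly as the fourth power of range, so doubling the detection range costs about sixteen times the power. A tiny sketch of that scaling (an approximation for point-like targets, not a full lidar link budget):

```python
def relative_return_power(range_m: float, ref_range_m: float = 100.0) -> float:
    """Returned signal power from a small target, relative to the return at a
    reference range. For point-like targets both lidar and radar returns
    scale roughly as 1/R^4 (energy spreads on the way out and on the way back)."""
    return (ref_range_m / range_m) ** 4

print(relative_return_power(200.0))  # 0.0625: doubling range leaves 1/16 of the signal
```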


Radar

Radar projects radio waves into the environment and then listens with a receiver for the reflection of that energy from the object. With an array of receivers, you can determine the position and velocity of what you're sensing.


Pros:

  • Long range
  • Weather resistant
  • The legacy technology of aerospace


Cons:

  • High C-SWaP
  • Low resolution
  • Hard to classify intruders because the information captured is very limited: main information is usually location, velocity, and cross-section (very rough size)
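As a sketch of how radar recovers velocity, here is the classic Doppler relation for radial (closing) speed. The wavelength and shift values are illustrative; real systems also handle ambiguities and geometry:

```python
def doppler_radial_velocity(doppler_shift_hz: float, wavelength_m: float) -> float:
    """Radial velocity of a target from the measured Doppler shift.
    The factor of two arises because the wave is shifted on both the
    outbound leg and the return leg of its round trip."""
    return doppler_shift_hz * wavelength_m / 2.0

# Illustrative example: ~3 cm wavelength (X-band), 2 kHz measured shift.
print(doppler_radial_velocity(2000.0, 0.03))  # 30.0 m/s closing speed
```

Note that this yields speed directly but says nothing about what the target is, which is why classification from radar alone is hard.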

Casia, the electro-optical Detect and Avoid solution 

At Iris, we chose a passive sensing modality because of its ability to be small, lightweight, and low cost, all of which are very important considerations for integrating with drones. Casia can also differentiate between different types of flying objects. Ours is the only available sensor that is useful for small UAS.

The benefit of low Cost, Size, Weight, and Power (C-SWaP) has been the main driving force in the DAA drone race. Radar and lidar haven't been able to meet the requirements for integration on UAS, where payload and available battery power are limited.

A limitation of image sensors, which detect visible wavelengths of light, is that they can't detect in low-illumination conditions, such as at night. One way to overcome the challenges of a standard image sensor is to combine sensors from across the frequency spectrum, such as adding a thermal camera. In this way, you can overcome the weaknesses of any one sensor and handle any environment, while maintaining the advantages of passive sensors over active ones.