Vegetation Measurement and Monitoring

13.2 Sensor Types and Platforms

Learning Guide

In this module we will look at some different types of remote sensors and platforms and introduce the concepts of resolution and scale. I want to stress again that this is a necessarily brief overview. If you would like more information on any of these topics, please see the additional learning resources module.

Sensors

First, we need to define what a sensor is. Simply put, in remote sensing, a sensor is a device that records an image or other data. In this regard, a sensor is much like a camera, a document scanner, or a photocopier.

Image credit: https://crisp.nus.edu.sg/~research/tutorial/optical.htm

There are many different types of sensors that can be used in remote sensing, but some of the most common ones are: panchromatic, RGB or true-color, multispectral, hyperspectral, and LiDAR. We’ll look briefly at each one of these. But first, we need to understand the electromagnetic spectrum and how sensors capture (or sample) light.

The Electromagnetic Spectrum (EMS)

In basic terms, electromagnetic radiation is energy that travels through space as waves. Depending on their wavelength, these waves can be radio waves, x-rays, heat, or light. The Electromagnetic Spectrum (or EMS) is the range of all possible wavelengths of electromagnetic radiation – from gamma rays to radio waves. The portion of the EMS that we can normally see with our eyes – visible light – is only a small slice of the whole spectrum. Cameras and other remote sensors, however, can record information from other portions of the EMS. How much light or other EM radiation a surface reflects tells us something about that surface.

Illustration of the Electromagnetic Spectrum. Image credit: NASA’s Imagine the Universe.

Plants, soil, rocks, and water all reflect different amounts of EM radiation in different portions of the EMS. We can use that information to help us figure out what different objects or surfaces are, or to tell us about the condition of vegetation. In particular, plants reflect green light and absorb red wavelengths (that's why vegetation looks green). However, vegetation also reflects much of the near-infrared light that hits it, so if we could see into the near-infrared wavelengths, plants would appear that color instead of green! The amounts of light reflected by different types of vegetation, or by different growth stages of a plant, can be characterized, and we can use that knowledge to help identify vegetation from remotely sensed data.

Plants reflect characteristic amounts of light in different regions of the EMS
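
For example, we can turn the red/near-infrared contrast just described into a number. The Normalized Difference Vegetation Index (NDVI) is one widely used way to do this; here is a minimal sketch in Python with numpy, using made-up reflectance values for illustration.

    import numpy as np

    # Hypothetical red and near-infrared reflectance values for three
    # pixels: dense vegetation, sparse vegetation, and bare soil.
    red = np.array([0.08, 0.12, 0.30])
    nir = np.array([0.45, 0.40, 0.32])

    # NDVI contrasts NIR and red reflectance. Healthy green vegetation
    # (high NIR, low red) pushes the index toward +1; bare surfaces
    # fall near 0.
    ndvi = (nir - red) / (nir + red)
    print(np.round(ndvi, 2))  # [0.7  0.54 0.03]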

With that introduction of the EMS under our belt, we can now talk about different kinds of sensors.

Main Sensor Types

Panchromatic Sensors

The first type of sensor to consider is a panchromatic sensor. A panchromatic sensor captures reflected light from across the entire visible light spectrum. Think of black-and-white photographs. In some areas, panchromatic aerial photographs are available all the way back to the 1930s. Why would we want black-and-white imagery of an area today? Well, you can see a lot of detail in panchromatic photos. If you were solely interested in mapping or counting something that is easy to detect – like buildings, trees in a rangeland, or water bodies – you may only need panchromatic imagery. It is also cheaper to develop a single panchromatic sensor than to develop multiple sensors for different colors. And because it is more expensive to develop higher-resolution sensors, many satellite companies will pair a high-resolution panchromatic sensor with lower-resolution multispectral sensors and fuse the image products together (a process often called pan-sharpening).

Examples of panchromatic imagery.
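
For the curious, one classic fusion approach is the Brovey transform. The sketch below uses made-up arrays and assumes the color bands have already been resampled onto the panchromatic pixel grid; real pan-sharpening workflows involve more steps.

    import numpy as np

    # Hypothetical co-registered arrays: a high-resolution panchromatic
    # band plus red, green, and blue bands resampled to the same grid.
    rng = np.random.default_rng(0)
    pan, red, green, blue = rng.random((4, 100, 100))

    # Brovey transform: rescale each color band by the ratio of the
    # panchromatic value to the sum of the color bands, injecting the
    # pan band's spatial detail while keeping the relative colors.
    total = red + green + blue
    red_s, green_s, blue_s = (band * pan / total for band in (red, green, blue))
    print(red_s.shape)  # (100, 100)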

RGB or True Color Sensors

The kinds of images that we're most used to seeing come from RGB or True Color cameras. These sensors sample the visible portion of the EMS in three color channels (or bands): red, green, and blue. The camera in your cellphone is an RGB camera, and so are the cameras on most drones. RGB imagery is useful for many remote sensing applications where color or texture is important. With the increasing popularity of drones, it's now possible to get extremely high-resolution RGB imagery very easily. However, because it samples only the visible light portion of the EMS, RGB imagery does not contain much spectral information for discriminating different types of vegetation or for assessing plant health.

True color images of the same area on Rock Creek Ranch, Idaho taken from different sensors at different altitudes.

Multispectral Sensors

Multispectral sensors are an extension of the concept behind RGB sensors – they simultaneously record information from multiple regions of the EMS. These regions are called bands. The difference with a multispectral sensor is that the bands are specifically defined (in terms of their location and width on the EMS). Most multispectral cameras also sample into the infrared regions of the EMS to provide more information on vegetation.

Multispectral sensors collect image data from multiple, specific portions of the EMS
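
To make "specifically defined" concrete: each band of a multispectral sensor has a documented position and width on the EMS. The values below are approximate wavelength ranges for the Landsat 8 OLI sensor, in micrometers; treat the exact numbers as approximate.

    # Approximate Landsat 8 OLI band definitions, in micrometers.
    # Each band is a specific, named slice of the EMS.
    landsat8_bands = {
        "blue":  (0.45, 0.51),
        "green": (0.53, 0.59),
        "red":   (0.64, 0.67),
        "nir":   (0.85, 0.88),  # near-infrared: key for vegetation
        "swir1": (1.57, 1.65),  # shortwave infrared
    }

    for name, (lo, hi) in landsat8_bands.items():
        print(f"{name}: {lo:.2f}-{hi:.2f} um (width {hi - lo:.2f} um)")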

Hyperspectral Sensors

Hyperspectral sensors just take the idea behind multispectral sensors to the extreme. A hyperspectral sensor collects image information continuously over a large portion of the EMS using a set of very narrow, adjacent bands. Many hyperspectral sensors have over 200 bands. This provides near-continuous spectral measurements across the EMS that can be sensitive to subtle variations in reflected light or radiation. In turn, hyperspectral data offer greater potential to detect differences in vegetation types, plant species, or plant health.
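
Hyperspectral data are commonly handled as a three-dimensional cube of (bands, rows, columns). The sketch below, with invented numbers, shows how pulling all the band values for one pixel yields that pixel's near-continuous spectrum.

    import numpy as np

    # Hypothetical hyperspectral cube: 220 narrow, adjacent bands
    # spanning 0.4-2.5 micrometers over a 100 x 100 pixel image.
    n_bands = 220
    wavelengths = np.linspace(0.4, 2.5, n_bands)
    cube = np.random.default_rng(0).random((n_bands, 100, 100))

    # One pixel's values across all 220 bands form its spectrum;
    # subtle dips and peaks in such curves are what let analysts
    # separate vegetation types or detect plant stress.
    spectrum = cube[:, 50, 50]
    print(spectrum.shape)  # (220,)
    print(f"band 100 samples ~{wavelengths[100]:.2f} um")  # ~1.36 um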

LiDAR Sensors

The last sensor type that we'll consider here is LiDAR. LiDAR stands for Light Detection and Ranging; a LiDAR sensor measures the reflections of laser pulses off a surface to determine the distance from the sensor to that surface. LiDAR sensors are used to measure elevation or height and are especially useful because multiple returns from a single laser pulse can be detected, which lets us determine the height of vegetation above the ground surface. LiDAR can be flown in a piloted aircraft, mounted on a drone, or even collected from the ground. LiDAR is also different from the other sensor types we've considered in that it is an active sensor – meaning the sensor creates its own EM radiation and measures its reflection. The previous sensors we considered were all passive sensors – they record EM radiation that originates from another source (typically the sun!).

LiDAR data are used to detect or measure surface elevation or vegetation height. LiDAR can be acquired from piloted aircraft, drones, or on the ground.
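
The "ranging" part is simple timing: distance equals the speed of light times the round-trip travel time of the pulse, divided by two. With multiple returns per pulse, vegetation height falls out as the difference between the first (canopy) and last (ground) returns. A minimal sketch with invented numbers:

    # Speed of light in meters per second.
    C = 299_792_458.0

    def pulse_range_m(two_way_time_s: float) -> float:
        """Distance from sensor to surface: the pulse travels out and
        back, so halve the round-trip travel time."""
        return C * two_way_time_s / 2.0

    # Hypothetical returns from one pulse over a shrub: the first
    # return bounces off the canopy, the last off the ground below.
    first_return = pulse_range_m(6.6666e-6)  # ~999.3 m (canopy)
    last_return = pulse_range_m(6.6800e-6)   # ~1001.3 m (ground)
    print(f"vegetation height: {last_return - first_return:.2f} m")  # ~2.01 m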

Platforms

Now that we have sensors, we need something to carry those sensors so we can acquire remote imagery. A platform is whatever is carrying the sensor. Platforms are usually satellites, piloted aircraft, or drones, but lots of other things can be used as sensor platforms. Historically, balloons and kites (and even birds!) have been used as sensor platforms. Each platform has its own advantages, costs, and limitations that need to be considered. The key is to recognize that the combination of a sensor and a platform defines the scale of a remote sensing image or product.


Spatial Scale in Remote Sensing

Scale is one of those topics that can quickly get very complicated. For the purposes of this module, we will restrict our definition to spatial scale and define it simply as a characteristic of a set of observations (an image in this case) that defines the minimum and maximum sizes of objects that can be observed. Think of making measurements with a ruler – you can't really measure anything bigger than the ruler without pulling some funny dance moves, and you can't get precise measurements of anything smaller than the finest graduations on your ruler.


Scale in remote sensing is defined by the resolution and extent of your image.


Resolution is the size of the smallest discernible element of an image (i.e., the size of a pixel in ground units). Resolution is also sometimes called grain or ground-sampling distance.


Extent, on the other hand, is the maximum dimensions of an image (sometimes also called a scene). This can be for a single image, like the Landsat scene shown below, or for several images that are stitched together.
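
The relationship between resolution and extent is simple arithmetic: extent is roughly the pixel size multiplied by the number of pixels. A quick comparison using approximate figures (the Landsat values are nominal; the drone values are purely illustrative):

    # Extent ~= resolution x number of pixels across the image.
    landsat_extent_m = 185_000   # a Landsat scene is ~185 km across
    landsat_res_m = 30
    print(landsat_extent_m / landsat_res_m)  # ~6167 pixels across

    # An illustrative drone survey: tiny pixels, small extent.
    drone_extent_m = 1_000       # 1 km flight area
    drone_res_m = 0.03           # 3 cm ground-sampling distance
    print(drone_extent_m / drone_res_m)      # ~33333 pixels across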


Given the number of different sensors and platforms that are available, some consideration should be given to what scale is best for a particular application.


Examples of extent and resolution (grain) from a Landsat 8 scene.

When selecting imagery for a remote-sensing project, it is useful to consider how big the objects are that you are trying to detect or measure and whether a low-resolution image is sufficient or a high-resolution image is necessary. The relationship between the size of an object on the ground and the resolution of an image determines whether the image can be considered high or low resolution. For example, if you are interested in mapping the occurrence of forest stands or sagebrush habitat, a 30m resolution Landsat image may be considered sufficiently high resolution. However, if your project focuses on mapping bare ground patches within sagebrush or quantifying site-scale Sage Grouse seasonal habitats, the Landsat image may be too coarse. Matching the image resolution to the features being studied generally yields the best results in remote sensing. Keep in mind, though, that higher resolution is not always better. As resolution increases (i.e., as pixel sizes get smaller), the variability of image pixels increases, the effects of shadows become more pronounced, and file sizes get very large. All of these effects make image analysis more challenging.
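
As a rough aid to that decision, one rule of thumb is to have at least a few pixels across the smallest object you need to detect. The sketch below uses a factor of four, which is an illustrative assumption rather than a fixed standard:

    def min_resolution_m(object_size_m: float, pixels_across: int = 4) -> float:
        """Coarsest usable pixel size: put a chosen number of pixels
        across the smallest object of interest (the factor is a rule
        of thumb, not a standard)."""
        return object_size_m / pixels_across

    # A 120 m forest stand is fine at Landsat's 30 m pixels, but a
    # 2 m bare-ground patch calls for sub-meter imagery.
    print(min_resolution_m(120))  # 30.0
    print(min_resolution_m(2))    # 0.5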

Whether an image is high- or low-resolution depends entirely on how you plan to use it. You should select image resolutions to match your intended application.