Intro to Multispectral Imaging



Originally developed for military target identification and reconnaissance, multispectral imaging uses specific wavelengths of the electromagnetic spectrum to document environmental features of interest. Early NASA imaging of Earth from space incorporated multispectral technology to map details of oceans, landforms, and vegetation, and modern weather satellites now produce diagnostic imagery using multispectral sensors.

Today, the U.S. military uses multispectral imaging to detect landmines and underground missile sites. Disturbed soil exhibits different physical and chemical properties that multispectral imaging can detect. Likewise, multispectral imaging of the invisible radiation emitted during intercontinental ballistic missile launches can be used to track their trajectories.

Multispectral imaging is also used to interpret papyri and other ancient documents by imaging them in the infrared range. The writing on these documents typically appears to the naked eye as black ink on dark paper, but when viewed with a multispectral camera, the difference between ink and paper becomes more distinct because of the way each reflects infrared light.

Natural resource managers are starting to use drones with multispectral sensors to monitor sensitive lands and preserves, including vegetated areas, wetlands, and forests. These data provide unique identification characteristics that can be measured and studied over time. Farmers are using multispectral sensors on drones to gather data that help manage crops, soil, fertilizing, and irrigation, part of a process called “precision agriculture.” Color-infrared (CIR) imagery renders green vegetation as red due to its high reflectance in the near-infrared (NIR) spectrum. While NIR energy is invisible to the human eye, it is highly diagnostic of plant vigor and health.
January/February 2020 · Photo by Zapp2Photo/Shutterstock

HOW IT WORKS

Multispectral imaging captures light from narrow ranges of wavelengths across the electromagnetic spectrum. Multispectral images are captured with special cameras that separate these wavelengths using filters, or with sensors that detect specific wavelengths, including light from frequencies invisible to the human eye, such as infrared and ultraviolet. A traditional digital camera uses a filter to block invisible light and captures only the visible light that falls onto the sensor; the sensor uses a Bayer filter mosaic to arrange red, green, and blue photosites on a square grid. A multispectral camera instead captures information that may be either in the visible part of the spectrum or invisible to the human eye.

For example, at certain wavelengths soil reflects more energy than green vegetation, while at other wavelengths it absorbs more energy. Multispectral cameras can distinguish objects from one another by these differences in reflectance, and when more than two wavelengths are used to collect data, the resulting images tend to show more separation among the objects. A rough analogy is viewing different objects through only a red, blue, or green lens.
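The soil-versus-vegetation separation described above can be sketched as a simple band-ratio classifier. This is a minimal illustration, not any camera maker's algorithm; the reflectance values and the 2.0 threshold are assumed for the example.

```python
# Hypothetical per-pixel reflectance values (0-1) in two narrow bands.
# Green vegetation reflects strongly in near-infrared (NIR) and weakly
# in red; bare soil reflects moderately in both.
pixels = [
    {"red": 0.08, "nir": 0.55},  # vegetation-like signature
    {"red": 0.30, "nir": 0.35},  # soil-like signature
    {"red": 0.07, "nir": 0.60},
    {"red": 0.28, "nir": 0.33},
]

def classify(pixel, threshold=2.0):
    """Label a pixel by its NIR/red band ratio: green vegetation shows a
    much higher ratio than bare soil. The threshold is illustrative."""
    return "vegetation" if pixel["nir"] / pixel["red"] > threshold else "soil"

print([classify(p) for p in pixels])
# → ['vegetation', 'soil', 'vegetation', 'soil']
```

With only one band, the two surfaces can look identical; comparing two (or more) bands is what makes the materials separable, which is the core idea behind multispectral sensing.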

 

PRECISION AGRICULTURE

Multispectral cameras mounted on drones capture light in green, red, and near-infrared wavelengths to produce color and color-infrared images of crops and vegetation. Multispectral imaging helps farmers minimize the use of pesticides, fertilizers, and irrigation while increasing the yield from their fields, a benefit to both the farmer and the environment. Changes in reflectance can indicate that crops have become stressed, prompting field teams to investigate and intervene before a small-scale problem becomes widespread.

The multispectral image data are processed with specialized software into useful information such as canopy cover, greenness, and disturbance maps. These soil and crop data allow the grower to monitor, plan, and manage the farm more effectively, saving time and money while reducing the costs associated with excess water, pesticides, and fertilizer. The Normalized Difference Vegetation Index (NDVI), a mathematical transformation of red and near-infrared (NIR) reflectance, is used to assess the density of green vegetation cover.

Computer simulation of the relative health of the crops based on multispectral data collected with a drone.

Multispectral images are a very effective tool for evaluating soil productivity and analyzing plant health. In large agricultural operations, assessing the health of soil and crops and locating problem areas is a challenging task for the naked eye. Multispectral sensor technology allows farmers to see invisible issues before they become critical problems. The information derived from multispectral image data can identify pests, disease, and weeds. It can also help with land management decisions: when to take agricultural land in or out of production, convert it to arable use, or rotate crops.
Analysis of multispectral data can be used to count plants and identify population or spacing issues, estimate crop yield, and assess irrigation. Multispectral imaging can also detect crops damaged by farm machinery, helping determine which equipment needs to be repaired or replaced.

 

SPECTRAL BANDS

The typical human eye is sensitive only to wavelengths between 380 and 740 nm, known as the visible spectrum, within which humans perceive a range of colors from violet to red. In this visible portion of the vegetation spectrum, the reflectance curve of a healthy plant exhibits its greatest reflectance in a green waveband (around 550 nm). This is why plants appear green to us: chlorophyll, a chemical compound in leaves, strongly absorbs radiation in the red and blue wavelengths but reflects green wavelengths. Leaves appear greenest in the summer, when chlorophyll content is at its maximum. In autumn, less chlorophyll in the leaves means less absorption and more reflection of red wavelengths, which is why leaves appear red or yellow (yellow is a combination of red and green wavelengths) in the fall.

Healthy vegetation absorbs blue and red light energy to fuel photosynthesis and create chlorophyll. A plant with more chlorophyll will reflect more green and near-infrared energy than an unhealthy plant. Thus, analyzing a plant’s absorption and reflection across visible and infrared wavelengths can provide information about its health and productivity.

A multispectral image sensor captures image data at specific frequencies across the electromagnetic spectrum. The wavelengths can be separated by filters or with instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible spectrum, such as infrared. Along with visible imaging at red, green, and blue wavelengths, the camera records data in the near-infrared and red edge regions. The red edge spectral band is particularly important for detecting changes in the reflectance of vegetation, since chlorophyll in the leaves absorbs visible light but becomes almost transparent at wavelengths greater than 700 nm.
The red edge is a very narrow band (700 to 730 nm) that marks the entry point of the near-infrared: the point of sudden change in reflectance, from strong absorption of red to substantial reflection of near-infrared. This band is very sensitive to plant stress and provides information on chlorophyll content. Near-infrared spectral bands correspond to wavelengths in the 700 to 1300 nm range and show the strongest reflectance of the bands studied. There is a very strong correlation between this reflectance and the level of chlorophyll in the plant, and a highly significant variation in reflectance in this band is produced when a plant is under stress. Along with the red spectral band, near-infrared is used extensively for compiling most of the vegetation indices in agriculture. Near-infrared spectral bands are also useful for analyzing soil properties and moisture content.
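The band ranges quoted in this section can be collected into a small lookup table. A sketch, assuming approximate boundaries for the blue and green sub-ranges (the article gives only the 380–740 nm visible span and the 550 nm green peak); the red edge, red, and NIR edges follow the figures above.

```python
# Spectral bands as (low, high) wavelength ranges in nanometers.
# Red edge (700-730) and NIR (700-1300) follow the text; blue and
# green sub-ranges are assumed approximations within 380-740 nm.
BANDS = {
    "blue":     (450, 495),
    "green":    (495, 570),   # healthy plants reflect most around 550 nm
    "red":      (620, 740),
    "red edge": (700, 730),   # sharp rise from red absorption to NIR reflection
    "nir":      (700, 1300),  # near-infrared
}

def bands_for(wavelength_nm):
    """Return every named band that covers the given wavelength."""
    return [name for name, (lo, hi) in BANDS.items() if lo <= wavelength_nm <= hi]

print(bands_for(710))  # → ['red', 'red edge', 'nir']
print(bands_for(550))  # → ['green']
```

Note that the bands overlap: a 710 nm photon sits in the red tail, the red edge, and the NIR simultaneously, which is exactly why that narrow window is so diagnostic of the red-to-NIR reflectance jump.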

 

VEGETATION INDICES

Vegetation indices are constructed from reflectance measurements in two or more wavelengths. These mathematical calculations are used to analyze specific characteristics of vegetation, such as total leaf area and water content. Vegetation indices are based on the observation that different surfaces reflect different types of light differently. Healthy vegetation absorbs most of the red light that hits it while reflecting much of the near-infrared light; vegetation that is dead or stressed reflects more red light and less near-infrared light. By combining the red and near-infrared bands of a remotely sensed image, an index of vegetation “greenness” can be defined. The most popular such index is the normalized difference vegetation index (NDVI), a measure of plant greenness, or photosynthetic activity. Vegetation indices like NDVI make it possible to compare images over time in order to detect significant agricultural and ecological changes.
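The standard NDVI formula is the normalized difference of the near-infrared and red bands, (NIR − Red) / (NIR + Red). A minimal sketch, with reflectance values assumed for illustration:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Ranges from -1 to 1; dense healthy vegetation scores high because it
    reflects much NIR and absorbs most red light."""
    return (nir - red) / (nir + red)

# Hypothetical per-pixel reflectance values (0-1):
print(round(ndvi(nir=0.55, red=0.08), 3))  # healthy vegetation → 0.746
print(round(ndvi(nir=0.35, red=0.30), 3))  # stressed plant or bare soil → 0.077
```

Because the difference is normalized by the total reflected energy, NDVI is less sensitive to overall illumination changes than a raw band ratio, which is part of why it is useful for comparing images captured on different days.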

Each of MAPIR’s Kernel camera modules is connected by the company’s patented signal-distributing array link boards.

MULTISPECTRAL DRONES

Several companies now design and manufacture multispectral sensors for drones. MicaSense, Parrot, Sentera, and SlantRange, to name a few, have developed multispectral sensors specifically for precision agriculture. Some of their products can be integrated into commercially available drones by DJI, while other companies have emerged with fixed-wing, hybrid, and multirotor platforms for these sensors. Recently, DJI announced its P4 Multispectral drone, a sign of the growing demand for this type of technology. DJI’s multispectral camera array is integrated into a three-axis stabilized gimbal and is designed to collect data right out of the box: with the new TimeSync system, real-time positioning data applied to images captured by all six cameras provides centimeter-level spatial accuracy. While all of these companies offer multispectral sensors with fixed wavelengths designed for precision agriculture, San Diego-based MAPIR, Inc. offers a modular multispectral camera suited to other applications as well. With new applications for multispectral imaging evolving, such as infrastructure inspection, I wanted to learn more about this company.

A user can easily customize their MAPIR Kernel camera array. Seen here are three color and three black and white sensor modules.

MAPIR

I contacted the CEO of MAPIR, Nolan Ramseyer, who generously agreed to arrange a tour of the company’s manufacturing facility. One of the first things that caught my eye was a framed copy of the 2015 RotorDrone feature about Ramseyer’s first company, Peau Productions, hanging on the office wall.

During the early days of multirotor development, Ramseyer saw the need for better aerial imaging and created high-quality, non-fisheye lenses for GoPro cameras. Peau started out modifying GoPro and other cameras, but by the summer of 2015 it had evolved into MAPIR, founded with the launch of the MAPIR Survey camera. Ramseyer had recognized the need for an affordable multispectral camera, and providing the market with that product paved the way for his company’s further growth.

In the fall of 2017, MAPIR started shipping Kernel, a modular multispectral array camera. Kernel’s unique design allows users to select each module’s sensor, lens, and filter to customize the camera array for their specific needs. Instead of connecting an array of sensors to a single computer, each sensor has its own brain and the capacity to easily connect to others. The ability to quickly change the modules on the camera opens up a world of new possibilities for service providers and researchers in the field. While other manufacturers offer fixed sensors, filters, and lenses, MAPIR offers a highly customizable tool, an approach with many advantages since there are sure to be applications for multispectral imaging that have not yet been discovered.

MAPIR’s philosophy of customization and modularity is reflected in its use of “small batch” manufacturing and 3D printing. MAPIR recognizes that, with all the developing applications for this technology, customers may need flexibility in their cameras. Each “module” is a complete camera with a Linux computer and on-board storage.
Modularity is important in that it affords the user the ability to customize with the exact tools needed for their application. New modules can be introduced in the future with the same architecture.

Each MAPIR Kernel camera can be configured with 30+ filter options.

THE TAKEAWAY

Multispectral imaging with drones is a relatively new technology that builds on a long history of aerial and satellite remote sensing. Fueled by relatively low cost and greater user accessibility, it is making sweeping changes in agriculture, land conservation, and other fields. The ability to collect and analyze data that is invisible to the human eye has tremendous potential, and many time-consuming tasks that have been done the same way for decades can now be accomplished from the air at a fraction of the time and cost. There are already a number of multispectral sensors and drones on the market, and the selection is sure to grow. As new industries recognize the benefits of multispectral imaging, demand for this technology will only increase.

 

BY GUS CALDERON; PHOTOS BY GUS CALDERON & RICHARD MCCREIGHT


