Open Access
27 January 2018
Method of radiometric quality assessment of NIR images acquired with a custom sensor mounted on an unmanned aerial vehicle
Damian Wierzbicki, Anna Fryskowska, Michal Kedzierski, Michalina Wojtkowska, Paulina Delis
Abstract
Unmanned aerial vehicles are suited to various photogrammetry and remote sensing missions. Such platforms are equipped with various optoelectronic sensors imaging in the visible and infrared spectral ranges and also thermal sensors. Nowadays, near-infrared (NIR) images acquired from low altitudes are often used for producing orthophoto maps for precision agriculture among other things. One major problem results from the application of low-cost custom and compact NIR cameras with wide-angle lenses introducing vignetting. In numerous cases, such cameras acquire low radiometric quality images depending on the lighting conditions. The paper presents a method of radiometric quality assessment of low-altitude NIR imagery data from a custom sensor. The method utilizes statistical analysis of NIR images. The data used for the analyses were acquired from various altitudes in various weather and lighting conditions. An objective NIR imagery quality index was determined as a result of the research. The results obtained using this index enabled the classification of images into three categories: good, medium, and low radiometric quality. The classification makes it possible to determine the a priori error of the acquired images and assess whether a rerun of the photogrammetric flight is necessary.

1.

Introduction

Currently, low-altitude photogrammetry and remote sensing are two of the most rapidly evolving fields.1–3 There are many studies on the analysis of the radiometric quality of imagery obtained from low altitudes and used in remote sensing products.4–6 In photogrammetry, most of the research is focused on the geometry of images acquired from low altitudes.7–9 The research concerning the accuracy of low-altitude photogrammetric studies does not take into account the impact of radiometric quality.10–12 However, as shown in many studies,13–15 the processing and evaluation of low-altitude data obtained in the visible and near-infrared (NIR) ranges are still current and significant research problems.

Low-altitude NIR imagery data can include numerous distortions and radiometric inhomogeneities caused by, e.g., changes in the radiation source (the Sun), terrain relief, the directionality of radiation reflection or emission from the Earth’s surface, and absorption and dispersion in the atmosphere. In practice, the weather conditions (clouds and precipitation) and lighting conditions during imaging, as well as the sensitivity of the NIR camera sensor, are also of importance.

However, the radiometric quality, which depends mainly on the factors mentioned above, has so far not been analyzed, especially in the case of NIR images. Previous research proves that there is a need to develop objective radiometric quality indices for images acquired using unmanned aerial vehicles (UAVs).16 Moreover, the radiometric artifacts mentioned above often make the visual analysis and interpretation of images difficult or impossible. In addition, low radiometric quality of images results in a deterioration of the final accuracy of photogrammetric and remote sensing products.

Contemporary photogrammetry and computer vision solutions complement each other. They provide solutions for three-dimensional modeling, Earth surface mapping, navigation, and camera calibration. Contemporary photogrammetric software relies on computer vision solutions such as dense image matching, which to some extent enable the automation of low-altitude imagery data processing (Match-AT, Pix4D, Trimble UAS Master, Agisoft Photoscan, and open source software, e.g., SURE17 and VisualSFM18). However, the available tools are still limited by defects in matching optical images, nonoptical images (e.g., thermal), and images of low radiometric quality other than NIR. So far, traditional methods based on matching portions of the image (area-based matching) or matching image features (feature-based matching)19,20 are commonly used. Newer methods are based on matching every pixel of the image (dense image matching), e.g., using the semiglobal matching-based stereo method.21 Currently applied algorithms are based on structure from motion22,23 and use point descriptors that support the automated detection of corresponding features in the images.24 Usually, the following detectors and estimators are used to combine images: smallest univalue segment assimilating nucleus,25 efficient maximally stable extremal regions,26 random sample consensus,27 SIFT (scale invariant feature transform), which helps to find homologous points that are shifted with respect to each other and differ in scale,28 ASIFT (affine SIFT),29 and SURF (speeded up robust features).30 Unfortunately, all these algorithms are inefficient if the scale differences are significant, the images are rotated with respect to each other,31 or there are significant differences in radiometric quality. Development of these algorithms is supported by open source solutions implemented, e.g., in the OpenCV library.
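As a rough illustration of the descriptor-based matching pipeline mentioned above, the sketch below detects SIFT keypoints in two overlapping frames with OpenCV and filters the correspondences with Lowe's ratio test. It assumes OpenCV 4.4 or newer (where SIFT is part of the main package); the file names are placeholders.

```python
# Minimal sketch: SIFT keypoint matching between two overlapping UAV frames.
import cv2

img1 = cv2.imread("frame_0001.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder file names
img2 = cv2.imread("frame_0002.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()                       # SIFT detector/descriptor
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Brute-force matching with Lowe's ratio test to reject ambiguous matches.
matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]
print(f"{len(good)} putative correspondences")
```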

2.

Related Works

Contemporary UAV photogrammetry research is concerned mainly with issues of low-altitude image geometry, whereas the radiometric quality of multispectral photogrammetric images is currently the main topic of numerous research projects worldwide. The best-known project dealing with these issues, initiated by EuroSDR, concerns the essence of the radiometric quality of photogrammetric images.32 In the case of classic multispectral aerial images, the problem of radiometric quality was solved by applying large format digital aerial cameras with a radiometric resolution of 12 bits, equipped with advanced low-aberration optics. UAV custom compact cameras are usually equipped with arrays of 8-bit radiometric resolution and optics that introduce aberrations. Moreover, the lenses of sensors acquiring NIR images are equipped with orange longpass filters and black IR-only longpass filters, which significantly reduce the radiometric quality of the images.

Image quality may be defined using various parameters: radiometric resolution and accuracy represented by the noise level, or ground resolution and sharpness described by the modulation transfer function (MTF).33 The radiometric quality of digital images can be defined as a detailed mapping of local irradiation changes recorded by the imaging system while maintaining a continuum of brightness adequate for the mapped scene. The internal radiometric quality of digital images is formed by the local image contrast, tonal range, random noise, and radiometric resolution.32,34 Radiometric quality of an image is also influenced by the sensor’s features determined by sharpness, contrast, or resolution.

The research carried out so far concerns the issues of radiometric quality of images from UAVs to a limited degree only. The results of research concerning applications of the signal-to-noise ratio of images and the radiometric accuracy and stability of the sensor, related to the quality of aerial triangulation and the digital surface model (DSM), are available in published papers.35–38 Image quality is also described by the MTF,39 the point spread function, and the line spread function.40–42 However, these parameters cannot fully describe the image quality,43 and they are mainly used in industrial image processing, satellite imagery quality assessment, and for optoelectronic sensor production purposes.44

In digital image processing, image quality assessment is a multiaspect problem that depends on the intended application of the processed image. Images represented as arrays are subject to objective and subjective quality assessments. Most classic methods of image quality assessment are comparative, i.e., the image reconstructed after compression is compared to the original image. Thus, reliable measures of image quality assessment should be investigated.45 Peak signal-to-noise ratio is commonly used as the quality measure. The mean square error (MSE),46 i.e., the value of the MSE for individual pixels of the image, is another index often used for quality assessment of video data.47 The structural similarity (SSIM) index,46 taking into account three types of image distortions (luminance, contrast, and structure), is often used for the assessment of image signal quality. For an original signal $x = \{x_i \mid i = 1, 2, \ldots, N\}$ and a distorted signal $y = \{y_i \mid i = 1, 2, \ldots, N\}$, the result of applying the SSIM index to image fragments defined by an $N \times N$ pixel mask has the form of an image quality map whose resolution is smaller by $N-1$ rows and $N-1$ columns than the image. It is recommended to apply a two-dimensional Gaussian window of $11 \times 11$ pixels.46
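As a hedged illustration, the sketch below evaluates SSIM with scikit-image, whose Gaussian-weighted variant (sigma = 1.5, effectively an 11×11 window) follows the settings recommended by Wang et al.;46 the random arrays merely stand in for a reference image and a distorted image.

```python
import numpy as np
from skimage.metrics import structural_similarity

# Stand-ins for a reference image and a distorted image (8-bit, single band).
ref = np.random.randint(0, 256, (326, 491), dtype=np.uint8)
test = np.random.randint(0, 256, (326, 491), dtype=np.uint8)

# gaussian_weights=True with sigma=1.5 reproduces the recommended 11x11
# Gaussian window; full=True also returns the per-pixel quality map.
mssim, ssim_map = structural_similarity(
    ref, test,
    gaussian_weights=True, sigma=1.5, use_sample_covariance=False,
    data_range=255, full=True)
print(f"mean SSIM = {mssim:.3f}")
```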

3.

Proposed Method

The paper presents a new index for objective quality assessment of NIR images acquired from low altitudes in various weather and lighting conditions. Since there are no objective methods of quality assessment of digital NIR images acquired from low altitudes, the authors claim that the developed method will make it possible to increase the reliability of photogrammetric studies concerning NIR data from UAVs. In order to prove this claim, the authors developed an experiment for determining and analyzing the statistical values for images acquired in the NIR range (690 to 1050 nm). Image processing and statistical algorithms were used to analyze images in order to determine the objective quality index.

The analysis of the results was based on traditional measures applied in image quality assessments. The proposed new method makes it possible to develop the objective image quality index. It was proven that analyses of the authors’ index make it possible to classify images into three groups of radiometric quality.

4.

Data Acquisition

4.1.

Test Areas

The low-altitude NIR images were acquired from four test areas (Fig. 1). The first test area (Liwiec) was located in north-eastern Poland. The terrain is flat and partly forested, with scattered single buildings. The low-altitude imagery data were acquired during a few flight missions in various weather and lighting conditions in July 2015.

The second test area (Opatow) comprised agricultural fields in undulating terrain, covered by numerous cultivated fields, grassland, and low shrubs, with single buildings. There is a lake of about 2.5 ha in the central part of the test area. The northern, western, and southern shores of the lake are steep with rocky cliffs; the eastern shore is gentle and situated close to a local road. There is a small forest and single trees north of the lake. The imagery data were acquired in July 2015.

The third test area (Nadarzyce) was a low urbanized flat area. The location is characterized by moderate amounts of forested areas, sparsely scattered buildings, agricultural fields, and a lake in the central part of the imaged area.

The fourth test area (Tylicz) was located close to a village. The terrain has a significantly more diverse orography (there are greater altitude differences). The acquired images encompassed a fragment of a hill with a ski lift (southern part) and a village with a closely built-up area (northern part). There are forests along the southern side of the river bed. The hill was covered with low grass while the area of the ski slope was covered with snow. The built-up area in the northern part of the study area has a low level of urban development. There are detached houses, road infrastructure, single trees, shrubs, and prevailing grass. The photogrammetric flight was carried out in March 2016.

All the images of the four test areas were taken between 11 am and 3 pm.

Fig. 1

Location of study areas.


4.2.

Description of the Platform and Sensors

The NIR images were acquired using the fixed-wing Trimble UX5. The UX5 comes with fully automated take-off, flight, and landing control. Its flight time (endurance) is 50 min. It is a UAV customized for high ground resolution imagery data acquisition. The ground dimensions of the pixel and the area covered in the image increase with the flight altitude. The UX5 is highly rated among mini UAVs. The onboard GPS/INS system allows for fully autonomous flight at a tasked altitude and coverage along and across the flight route. The flight route may be monitored in real time by the flight controller. The UX5 platform enables automatic release of the camera shutter for the acquisition of imagery of the Earth’s surface. The UAV takes off from a dedicated launcher. The system may be operated at wind speeds of up to 18 m/s and in weather conditions no worse than light precipitation. A Sony NEX-5T camera was installed onboard the Trimble UX5 for imagery data acquisition. It is a compact digital camera equipped with a 16.1-megapixel CMOS array providing a maximum image resolution of 4912×3264 pixels. It enables a ground sampling distance (GSD) of 0.024 to 0.24 m, depending on the flight altitude of 75 to 750 m. The images are recorded in JPEG format with lossy data compression. The camera was equipped with a Voigtlander lens with a fixed focal length of 15 mm and a maximum aperture of f/4.5.48 The NEX-5T camera was modified to make it possible to acquire images in the full range of the array’s sensitivity: the filter originally located in front of the array, which limited the recorded spectral range to wavelengths below about 690 nm, was removed. In its place, a black filter (B+W 092) that blocks visible light was applied, restricting the sensor’s sensitivity to the NIR range of 690 to 1050 nm only.49,50 The sensor records radiation up to a maximum wavelength of about 1050 nm.
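As a back-of-the-envelope check of the quoted GSD range, the sketch below computes GSD = pixel pitch × altitude / focal length. The sensor width of about 23.5 mm across 4912 pixels (an APS-C format) is our assumption, not a value from the text:

```python
# GSD check for the Sony NEX-5T with the 15-mm Voigtlander lens.
pixel_pitch_m = 23.5e-3 / 4912   # assumed 23.5-mm sensor width -> ~4.8 um/pixel
focal_length_m = 15e-3

for altitude_m in (75, 750):
    gsd_m = pixel_pitch_m * altitude_m / focal_length_m
    print(f"altitude {altitude_m} m -> GSD {gsd_m:.3f} m")
# Prints ~0.024 m at 75 m and ~0.239 m at 750 m, matching the quoted range.
```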

4.3.

Flight Campaigns

The photogrammetric flights were conducted in various weather and lighting conditions at altitudes of 75 to 650 m in four different test areas of diverse topography. Application of the filter means that the images contain only NIR wavelengths. The blue pixels (band 3) are used to record nothing but pure NIR (roughly 800 to 1050 nm), while the red band (band 1) in the images obtained with this filter is, in fact, the red edge, roughly from 690 to 770 nm.49 All the acquired images were recorded with an 8 bit/channel radiometric resolution. Table 1 provides a description of the imaging data used to develop the quality index. Table 2 contains the image data obtained for verifying the quality index.

Table 1

Characteristics of the four flight campaigns: experiment data.

Flight mission name                           Mission I: July 2015, Liwiec   Mission II: July 2015, Opatow   Mission III: March 2016, Tylicz   Mission IV: July 2015, Opatow
Photographic conditions                       Favorable weather              Moderate weather                Adverse weather                   Moderate weather
Wind speed (m/s)                              2.2                            4.3                             6.2                               4.2
Shutter speed (s)                             1/2500                         1/2500                          1/2500                            1/2500
Altitude (m)                                  100                            150                             150                               75
GSD (m)                                       0.04                           0.05                            0.05                              0.024
Number of images                              1090                           375                             419                               1102
Forward and side overlap between images (%)   75                             80                              75                                80

Table 2

Characteristics of the four flight campaigns: verification data.

Flight mission name                           Mission V: September 2015, Nadarzyce   Mission VI: July 2015, Opatow   Mission VII: July 2015, Opatow   Mission VIII: July 2015, Opatow
Photographic conditions                       Moderate weather                       Moderate weather                Favorable weather                Favorable weather
Wind speed (m/s)                              7.4                                    5.2                             4.3                              4.2
Shutter speed (s)                             1/3200                                 1/2500                          1/4000                           1/4000
Altitude (m)                                  650                                    300                             75                               150
GSD (m)                                       0.21                                   0.10                            0.024                            0.05
Number of images                              221                                    121                             1299                             358
Forward and side overlap between images (%)   75                                     80                              80                               80

All flights were planned using the Trimble Aerial Imaging software. The lens focus was set to infinity while the ISO sensitivity was set to AUTO. The weather and lighting conditions for the specific measurement campaigns (Tables 1 and 2) were characterized according to the following criteria:

  • Favorable weather and lighting conditions: clear sky or few clouds. Average light intensity: 14,000 lux.

  • Moderate weather and lighting conditions: scattered or broken clouds and mist. Average light intensity: 1200 lux.

  • Adverse weather and lighting conditions: overcast and intensive fog. Average light intensity: 200 lux.

5.

Experiments

The research consisted of three phases. The first phase of the experiment was based on the method proposed by Kędzierski and Wierzbicki.16 Each test image was divided into 100 equal segments. According to the characteristics of data recording, band 1 is the red band (roughly 690 to 770 nm), band 2 is the green band, and band 3 contains only NIR, where the blue pixels are used to record nothing but pure NIR (roughly 800 to 1050 nm).49 In this configuration, pixels covered by a blue filter in the Bayer color filter array (CFA) receive only NIR.50 The mean value and standard deviation (SD) of the pixel brightness were determined for each channel of the images.

In the second phase, the same statistical values, the mean and the SD, were determined for each of the 100 image fragments. The SD values are related to (but not limited to) the state of crops, spatial variability, and shadows. A minimal sketch of both phases is given below.
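The sketch assumes a single 8-bit frame loaded as an H × W × 3 NumPy array with bands ordered (band 1 = red edge, band 2 = green, band 3 = pure NIR); the function names are illustrative only.

```python
import numpy as np

def band_stats(image):
    """Phase 1: mean and SD of pixel brightness (DN) for each band of the frame."""
    return [(image[..., b].mean(), image[..., b].std()) for b in range(3)]

def segment_stats(image, grid=10):
    """Phase 2: mean/SD per band for a grid x grid division (100 segments)."""
    h, w = image.shape[0] // grid, image.shape[1] // grid
    stats = np.empty((grid, grid, 3, 2))
    for i in range(grid):
        for j in range(grid):
            seg = image[i * h:(i + 1) * h, j * w:(j + 1) * w]
            for b in range(3):
                stats[i, j, b] = seg[..., b].mean(), seg[..., b].std()
    return stats
```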

In the third phase, the value of the WNIR index (the proposed new image quality index) was determined for all test images, and its robustness against false assessments of the radiometric quality of images was studied.

5.1.

Examination of the Gridded Standard Deviation Maps: Analysis of the SD Value Distribution

The size of each segment was 1/10 of the image resolution in each dimension, in this case 491×326 pixels. The distribution of SD values is presented as a schematic in which each image fragment is represented by a square. The distributions of the digital number (DN) SD values are shown in Figs. 2 and 3 as graphical representations of these calculations, with the colors resulting from an interpolation of the SD values calculated for the center of each segment.
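A sketch of how such a map can be rendered follows, using a random stand-in for a real 3264 × 4912 × 3 frame and matplotlib's bilinear interpolation to approximate the smooth color rendering (the colormap choice is an assumption):

```python
import numpy as np
import matplotlib.pyplot as plt

image = np.random.randint(0, 256, (3264, 4912, 3), dtype=np.uint8)  # placeholder frame
grid = 10
h, w = image.shape[0] // grid, image.shape[1] // grid                # 326 x 491 px segments
sd_grid = np.array([[image[i * h:(i + 1) * h, j * w:(j + 1) * w, 2].std()
                     for j in range(grid)] for i in range(grid)])    # band 3 (pure NIR)

plt.imshow(sd_grid, interpolation="bilinear", cmap="jet")
plt.colorbar(label="SD (DN)")
plt.title("Gridded SD map, band 3 (pure NIR)")
plt.show()
```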

Fig. 2

Distribution of SD values (DN value) in the NIR images acquired in favorable weather conditions.


Fig. 3

Distribution of SD values (DN value) in the NIR images acquired in moderate weather conditions.


Figure 2 contains an analysis of the NIR images acquired in favorable lighting conditions. All the images were acquired during sunny days with clear skies. An analysis of the SD value distribution shows that the SD values range from 11 to 50 DN in over 30% of the image area for flight mission I, from 16 to 50 DN in over 50% of the image area for flight mission VII, and from 18 to 45 DN for the majority of the image area, with a maximum value of 54 DN, for flight mission VIII.

Figure 3 contains an analysis of the NIR images acquired in moderate lighting conditions. An analysis of the SD value distribution shows that the SD values range from 11 to 37 DN in over 50% of the image area for flight mission II and from 16 to 47 DN in over 50% of the image area (with 2 to 6 DN for most other parts of the image area) for flight mission IV. For flight mission VI, over 50% of the SD values were within 12 to 30 DN for the majority of the image area, with a maximum value of 38 DN.

Figure 4 contains an analysis of the NIR images acquired in adverse lighting conditions. An analysis of the SD value distribution shows that the SD values range from 7 to 13 DN in over 50% of the image area for flight mission III. The gridded statistics refer to the initial research and show only the dependence of the SD on the analyzed fragment of the test image. Based on previous studies,16 the assumption of the SD’s dependence on image quality was extended to the whole image.

Fig. 4

Distribution of SD values (DN value) in the NIR images acquired in adverse weather conditions.


5.2.

Examination of SD of G-R-NIR by Image Sequence

Using the initial results, the SD values were then determined for all images in three channels. The image number is the successive number of the image obtained during a photogrammetric flight over a study area (Figs. 5–8).

Fig. 5

Distribution of SD values of DN for three channels in 1090 NIR images acquired in Flight mission I.


Fig. 6

Distribution of SD values of DN for three channels in 375 NIR images acquired in Flight mission II.


Fig. 7

Distribution of SD values of DN for three channels in 419 NIR images acquired in Flight mission III.


Fig. 8

Distribution of SD values of DN for three channels in 1299 NIR images acquired in flight mission IV.


For images of flight mission I (Fig. 5), the SD values for band 3 (Pure NIR) were in the range from 10.1 to 36.9 DN while the mean value was 16.3 DN. For band 2, the SD values were in the range from 14.1 to 50.4 DN while the mean value was 24.1 DN. For band 1, the SD values were in the range from 2.2 to 60.4 DN while the mean value was 18.0 DN.

For images of flight mission II (Fig. 6), the SD values for band 3 (pure NIR) were in the range from 20.8 to 67.9 DN while the mean value was 33.2 DN. For band 2, the SD values were in the range from 18.8 to 59.7 DN while the mean value was 27.6 DN. For band 1, the SD values were in the range from 19.0 to 66.3 DN while the mean value was 32.0 DN.

For images of flight mission III (Fig. 7), the SD values for band 3 (pure NIR) were in the range from 4.3 to 45.7 DN while the mean value was 28.1 DN. For band 2, the SD values were in the range from 1.9 to 18.0 DN while the mean value was 6.1 DN. For band 1, the SD values were in the range from 5.4 to 66.0 DN while the mean value was 34.8 DN.

For images of flight mission IV (Fig. 8), the SD values for band 3 (pure NIR) were in the range from 21.6 to 75.1 DN while the mean value was 31.9 DN. For band 2, the SD values were in the range from 18.9 to 61.2 DN while the mean value was 26.6 DN. For band 1, the SD values were in the range from 20.0 to 77.7 DN while the mean value was 30.1 DN.

The conducted experiments indicated that the SD values (Figs. 5, 6, and 8) for images acquired in favorable weather conditions (clear sky or few clouds) are high (about 30 DN) for band 3 (pure NIR). The values for images acquired in moderate or adverse weather conditions are lower (by about 30%). Additionally, the SD values for band 2 are significantly lower than those of the other channels (by about 50%). This observation is an additional confirmation that the images were acquired in worse weather and lighting conditions (Figs. 6 and 8). The SD indicates images of potentially poor radiometric quality. The regular peaks visible in the plots (Figs. 5–8) are a result of the turning maneuvers of the UAV: at the moment of performing such a maneuver, there is a sudden change in the direction and angle of imaging, which leads to a sudden deterioration in image quality. The observed relations are closely related to Rayleigh scattering, for which the scattering coefficient, and hence the scattered light intensity, is inversely proportional to the fourth power of the light wavelength. This indicates that the scattering of radiation at a 1000-nm wavelength in the NIR band is about 40 times weaker than that of blue light at a 400-nm wavelength. Observations of the test images indicate that the influence of Rayleigh scattering is reduced in the NIR band, and therefore the amount of information is increased, whereas information in the visible band is scattered by fog or precipitation. Using these observations and experience from previous experiments,16 the authors developed the following equation for determining the value of the radiometric quality index of NIR images acquired from low altitudes:

Eq. (1)

$$W_{\mathrm{NIR}} = \frac{\sum_{i=1}^{n} w_i \cdot \left( \mu_i / \sigma_i \right)}{\sum_{i=1}^{n} w_i},$$
where $W_{\mathrm{NIR}}$ is the NIR radiometric quality index, $\mu_i$ is the average DN pixel value in a given band [DN on an 8-bit scale (0 to 255)], $\sigma_i$ is the standard deviation of pixel brightness in a given band, $w_i$ is the weight of a given band determined empirically on the basis of the relative luminance value, $i$ is the band sequence number, and $n$ is the number of bands.

The relative luminance of an image may be defined as a combination of the three bands. Color information was converted from RGB to luminance. The luminance (L) can be calculated from linear RGB components as follows:51

Eq. (2)

$$L = 0.2126\,R + 0.7152\,G + 0.0722\,B,$$
where $L$ is the image luminance, $R$ is the red band of the image, $G$ is the green band of the image, and $B$ is the blue band of the image.

For further considerations, it was assumed that the red and NIR components transfer the most information. Figures 6–8 illustrate the relationship of the SD to band 1 in individual images, depending on their content. Figures 5–7 represent the SD values for all images acquired in a given area. Based on this analysis and the luminance value, an image quality index, WNIR, was derived.

Therefore, for the test images, the weight values were assigned in a different order than in the classic relative luminance distribution: the three weights (0.2126 for band 1, 0.0722 for band 2, and 0.7152 for band 3) were determined empirically using the modified luminance value.

The determined weight values result from the transmission of each of the bands in the Sony NEX-5T camera equipped with a black IR-only longpass filter (Fig. 9). When this filter is used, the transmission is lowest for band 2 (G), which therefore receives the lowest weight, higher for band 1 (R), and highest for band 3 (NIR). The graph in Fig. 9 gives a rough idea of the cut-on and cut-off wavelengths for the camera bands and the black filter. In practice, the black filter may transmit residual amounts of green and red radiation.

Fig. 9

A conceptual graph to give a rough idea about cut-on and cut-off wavelengths for the Sony NEX 5T camera bands and the black filter.49


The modified equation for the image quality index will have the following form:

Eq. (3)

$$W_{\mathrm{NIR}} = \frac{\mu_{\mathrm{band1}}}{\sigma_{\mathrm{band1}}} \cdot w_{\mathrm{band1}} + \frac{\mu_{\mathrm{band2}}}{\sigma_{\mathrm{band2}}} \cdot w_{\mathrm{band2}} + \frac{\mu_{\mathrm{band3}}}{\sigma_{\mathrm{band3}}} \cdot w_{\mathrm{band3}}.$$
Note that the denominator in Eq. (1) equals 1 because $w_{\mathrm{band1}} + w_{\mathrm{band2}} + w_{\mathrm{band3}} = 1$.
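A minimal sketch of Eq. (3) for a single frame follows, using the empirical weights given above; the function name is illustrative only.

```python
import numpy as np

WEIGHTS = (0.2126, 0.0722, 0.7152)   # w_band1, w_band2, w_band3; they sum to 1

def w_nir(image):
    """W_NIR for one 8-bit H x W x 3 frame, bands ordered (band1, band2, band3)."""
    return float(sum(w * image[..., b].mean() / image[..., b].std()
                     for b, w in enumerate(WEIGHTS)))
```

Evaluating w_nir for every frame of a mission and plotting the values against the image number reproduces diagrams of the kind presented in Sec. 5.3.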

5.3.

Examination of the WNIR Metric Using the “Experimentation” Image Sequence

The following figures present the WNIR index diagrams for each sample of images:

Analysis of Fig. 10 shows that the WNIR index value for this data sample (images acquired in favorable weather conditions) is in the range from 3.4 to 29.1. The index value does not exceed 10.0 for the majority of the images.

Fig. 10

WNIR index values for 1090 images acquired from low altitudes during flight mission I.


The WNIR index value for images acquired in moderate weather conditions (clouds and mist) is in the range from 2.2 to 8.4 (Fig. 11). The average value of the index is 5.4 for this data sample.

Fig. 11

WNIR index values for 375 images acquired from low altitudes during flight mission II.


The WNIR index value for images acquired in adverse weather conditions (clouds, mist, and light rain) is in the range from 1.2 to 5.0 (Fig. 12). The average value of the index is 2.9.

Fig. 12

WNIR index values for 419 images acquired from low altitudes during flight mission III.


The WNIR index value for images acquired in moderate weather conditions (few clouds) is in the range from 1.9 to 8.2 (Fig. 13). The average value of the index is 5.6.

Fig. 13

WNIR index values for 1102 images acquired from low altitudes during flight mission IV.


Results of visual analysis of the images and their histograms for the specific bands were used to determine the WNIR index value ranges for classifying the images as good, medium, or low radiometric quality. The low radiometric quality images should be rejected from further photogrammetric processing. The WNIR index value ranges (Table 3) were determined empirically; the results compiled in the graphs (Figs. 7–10) were used to determine the intervals.

Table 3

Classification of radiometric quality of images with respect to WNIR value.

Radiometric quality of images    WNIR range
Good radiometric quality         WNIR ∈ [4.9; 19.6)
Medium radiometric quality       WNIR ∈ [4.0; 7.2)
Low radiometric quality          WNIR ∈ [1.1; 4.0)

As mentioned, the NIR image quality index value ranges were determined empirically. The analysis showed that it is difficult to define a single-value limit between images of good and medium radiometric quality. This is probably a result of the technique by which the sensor records NIR images, leading to the conclusion that the NIR information in the test images was recorded not only in band 3 but probably partially also in the two other bands. The results compiled in the graphs (Figs. 10–13) were used to determine the intervals.
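One possible reading of Table 3 is sketched below; because the good and medium ranges overlap in [4.9; 7.2), mirroring the difficulty noted above, this sketch flags that zone for visual inspection (resolving the overlap this way is an illustrative assumption, not part of the method):

```python
def classify_wnir(w):
    """Classify a W_NIR value per Table 3; the overlap zone is flagged."""
    if w >= 7.2:
        return "good"
    if w >= 4.9:
        return "good/medium (ambiguous; inspect visually)"
    if w >= 4.0:
        return "medium"
    if w >= 1.1:
        return "low (candidate for rejection)"
    return "outside the observed range"
```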

Moreover, the type of ground surface in the photographed areas (many areas can be characterized as highly humid) affected the signal in the NIR range. As a result, lower DN values occurred in the images, i.e., there were local decreases in brightness.

6.

Results

6.1.

Examination of the WNIR Metric Using the “Validation” Image Sequence

In order to verify the proposed NIR images’ quality index, a separate set of test data was used to calculate the index values. A set of data acquired during flight mission V in the area of Nadarzyce at an altitude of 650 m was used as the first sample for verification. Most of the images were acquired in favorable weather conditions with few clouds (moderate weather).

The WNIR index value for images acquired in these conditions (Fig. 14) is in the range from 4.4 to 17.0. Using the WNIR index, 212 images of the test sample may be assigned to the good radiometric quality category, but half may also be classified as medium radiometric quality images. This ambiguity may be caused by the properties (heterogeneous texture) of the imaged area and demonstrates a vulnerability of the WNIR index.

Fig. 14

WNIR index values for 221 images acquired from an altitude of 650 m.


Another sample for verification contained imagery data acquired during flight mission VI from an altitude of 300 m in the area of Opatow in moderate weather conditions. The index value was determined for a sample of 121 images.

The WNIR index value for images acquired in these conditions (Fig. 15) is in the range from 2.9 to 7.5. The average value of the index is 5.1 for this sample. This sample contains 76% of images that may be classified as medium radiometric quality, 20% as low quality, and the remaining 4% as good quality.

Fig. 15

WNIR index values for 121 images acquired from an altitude of 300 m.


The WNIR index value for images acquired during flight mission VII is in the range from 1.2 to 8.9 (Fig. 16). Results of visual analysis of the images and the determined values of the index classify 1010 images as medium, 29 images as good, and 260 as low radiometric quality. In most cases, images classified into the last group covered water or forested areas.

Fig. 16

WNIR index values for 1299 images acquired from an altitude of 75 m.


The WNIR index value for images acquired in favorable lighting conditions (Fig. 17) is in the range from 2.0 to 7.5. The average value of the index is 4.8 for this sample. Results of test data analysis and comparison with the determined values of the index classify 291 images as medium, 4 images as good, and 50 as low radiometric quality. As in the previous sample, low radiometric quality images covered forested areas and water zones.

Fig. 17

WNIR index values for 358 images acquired from an altitude of 150 m.


6.2.

Verification of the WNIR Metric Using Another Image Quality Index

The SSIM index was proposed for verification of the WNIR index. It was selected because it is similar to the method of visual assessment of quality and it is partly based on statistical values of the compared images.

Twenty sample images for each of three representative areas were selected for the tests: agricultural fields (I), urban area (II), and forest (III). The selected images were obtained as part of flight mission I. The first group of images (I) contained images acquired over cultivated areas, fields, and wasteland; acquisition angles ranged from 1.1 deg to 3.1 deg. The second group of imagery data (II) contained data acquired over inhabited areas with family houses in terraced and scattered detached housing; acquisition angles ranged from 2.2 deg to 4.3 deg. The third group of images (III) contained images acquired over forested, wooded, and bushy areas; acquisition angles ranged from 1.3 deg to 4.7 deg.

These areas were selected on the basis of visual assessment and the WNIR index value.

The SSIM index values (Fig. 18), determined for three categories of 20 images each, confirm the efficiency of the authors’ index for classifying images into specific groups. The determined SSIM index values confirmed the assumption that the quality of images of forested areas would be the lowest. An image of very good quality (WNIR = 17.4) was selected as a reference for the tests. The average similarity between the reference image and the images classified into the three representative groups was 73% (agricultural), 62% (low urban area), and 56% (forest). This is convergent with the predictions of the new NIR image quality index, WNIR.

Fig. 18

SSIM quality index for three landscapes.


7.

Discussion

The presented experiments and their results prove the efficiency of the developed WNIR index for radiometric quality assessment of images acquired from low altitudes. Analyzing the numerical value of the SD, the authors observed that as it increased, the radiometric quality of the images improved. As proven on the basis of the conducted research, the developed index of radiometric quality assessment may be applied to low-altitude NIR images of agricultural areas.

In order to confirm the validity of the proposed WNIR index, the authors decided to compare the obtained results with similar analyses performed using the SSIM index. The new index is dedicated to assessing the quality of images in photogrammetric and remote sensing UAV studies. The classification of UAV image quality according to the new index has the potential to increase the accuracy of the UAV workflow. The proposed method of quality assessment of NIR imagery can help identify images that have a smaller number of tie points or a lower accuracy of remote sensing classification. This is directly applicable in the field of precision agriculture.

The developed method is limited to NIR images and it is time consuming for data samples containing a few thousand images. The research results and work done by other researchers35,52,53 indicate that the proposed method of quality assessment of NIR images for UAV photogrammetry and remote sensing will help reject low radiometric quality images from photogrammetric processing. It will make it possible to increase the quality and accuracy of photogrammetric studies, especially for precision agriculture and other remote sensing studies. The acquired imagery data may be subject to an initial assessment of whether a valid photogrammetric or remote sensing study is feasible or a repeated flight in better weather conditions is required.

Based on the defined WNIR index value ranges, it will be possible to identify images of potentially low radiometric quality, which in turn can have an impact on the accuracy of photogrammetric products, especially in precision agriculture.

The topic of radiometric quality is of great concern for photogrammetric and digital image processing. The proposed WNIR index can be used in a vast number of ways. One of these is to detect and eliminate poor quality images acquired using a UAV for photogrammetric processing purposes. The WNIR quality index has been tested for typical agricultural areas; its effectiveness for highly urbanized areas requires further research.

8.

Conclusions

The results of developing an objective index for the quality assessment of NIR images acquired from low altitudes for photogrammetric and remote sensing studies are presented and discussed in the paper. Almost 5000 images acquired in various weather and lighting conditions from altitudes of 75 to 650 m were analyzed in the experiments. The results of the tests made it possible to develop an objective index for the quality assessment of NIR images acquired from low altitudes on the basis of statistical analyses of images and a derived relationship between the image quality and relative luminance. Further research of the authors will focus on developing other image quality assessment indices for images acquired from UAVs in other spectral ranges.

Acknowledgments

This paper has been supported by a grant cofinanced by the Military University of Technology, the Faculty of Civil Engineering and Geodesy, Geodesy Institute.

References

1. G. S. C. Avellar et al., “Multi-UAV routing for area coverage and remote sensing with minimum time,” Sensors 15, 27783–27803 (2015). http://dx.doi.org/10.3390/s151127783
2. M. Shahbazi et al., “Development and evaluation of a UAV-photogrammetry system for precise 3D environmental modeling,” Sensors 15, 27493–27524 (2015). http://dx.doi.org/10.3390/s151127493
3. I. Colomina and P. Molina, “Unmanned aerial systems for photogrammetry and remote sensing: a review,” ISPRS J. Photogramm. Remote Sens. 92, 79–97 (2014). http://dx.doi.org/10.1016/j.isprsjprs.2014.02.013
4. R. Hruska et al., “Radiometric and geometric analysis of hyperspectral imagery acquired from an unmanned aerial vehicle,” Remote Sens. 4(9), 2736–2752 (2012). http://dx.doi.org/10.3390/rs4092736
5. J. Kelcey and A. Lucieer, “Sensor correction and radiometric calibration of a 6-band multispectral imaging sensor for UAV remote sensing,” in Int. Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, 393–398 (2012).
6. A. Orych et al., “Impact of the cameras radiometric resolution on the accuracy of determining spectral reflectance coefficients,” in Int. Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, 347 (2014).
7. D. Turner, A. Lucieer and C. Watson, “An automated technique for generating georectified mosaics from ultra-high resolution unmanned aerial vehicle (UAV) imagery, based on structure from motion (SfM) point clouds,” Remote Sens. 4, 1392–1410 (2012). http://dx.doi.org/10.3390/rs4051392
8. M. Kedzierski et al., “Detection of gross errors in the elements of exterior orientation of low-cost UAV images,” in Baltic Geodetic Congress (Geomatics), 95–100 (2016). http://dx.doi.org/10.1109/BGC.Geomatics.2016.26
9. A. Fryskowska et al., “Calibration of low cost RGB and NIR UAV cameras,” in Int. Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 817–821 (2016).
10. S. Mikrut, “Classical photogrammetry and UAV – selected aspects,” in Int. Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, 947–952 (2016).
11. P. Burdziakowski et al., “A modern approach to an unmanned vehicle navigation,” in 16th Int. Multidisciplinary Scientific GeoConf. SGEM 2016, 747–758 (2016).
12. M. Przyborski et al., “Photogrammetric development of the threshold water at the dam on the Vistula River in Wloclawek from unmanned aerial vehicles (UAV),” in SGEM2015 Conf. Proc. (2015).
13. E. R. Hunt et al., “Acquisition of NIR-green-blue digital photographs from unmanned aircraft for crop monitoring,” Remote Sens. 2(1), 290–305 (2010). http://dx.doi.org/10.3390/rs2010290
14. F. Bachmann et al., “Micro UAV based georeferenced orthophoto generation in VIS + NIR for precision agriculture,” in Int. Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, 11–16 (2013).
15. F. Garcia-Ruiz et al., “Comparison of two aerial imaging platforms for identification of Huanglongbing-infected citrus trees,” Comput. Electron. Agric. 91, 106–115 (2013). http://dx.doi.org/10.1016/j.compag.2012.12.002
16. M. Kędzierski and D. Wierzbicki, “Radiometric quality assessment of images acquired by UAV’s in various lighting and weather conditions,” Measurement 76, 156–169 (2015). http://dx.doi.org/10.1016/j.measurement.2015.08.003
17. M. R. K. Wenzel, “SURE—photogrammetric surface reconstruction from imagery,” http://www.ifp.uni-stuttgart.de/publications/software/sure/index.en.html (accessed January 2017).
18. C. Wu, “VisualSFM: a visual structure from motion system,” http://ccwu.me/vsfm/ (accessed January 2017).
19. W. Förstner and E. Gülch, “A fast operator for detection and precise location of distinct points, corners and centers of circular features,” in Proc. of the ISPRS Intercommission Workshop on Fast Processing of Photogrammetric Data, 281–305 (1987).
20. A. Grün, “Adaptive least squares correlation: a powerful image matching technique,” S. Afr. J. Photogramm. Remote Sens. Cartography 14(3), 175–187 (1985).
21. H. Hirschmuller, “Stereo processing by semiglobal matching and mutual information,” IEEE Trans. Pattern Anal. Mach. Intell. 30(2), 328–341 (2008). http://dx.doi.org/10.1109/TPAMI.2007.1166
22. B. K. P. Horn, “Relative orientation,” Int. J. Comput. Vision 4, 59–78 (1990). http://dx.doi.org/10.1007/BF00137443
23. D. P. Robertson and R. M. Varga, “Structure from motion,” in Practical Image Processing and Computer Vision, Chapter 13, 1–49, Halsted Press, New York (2009).
24. D. G. Lowe, “Distinctive image features from scale-invariant keypoints,” Int. J. Comput. Vision 60(2), 91–110 (2004). http://dx.doi.org/10.1023/B:VISI.0000029664.99615.94
25. S. M. Smith and J. M. Brady, “SUSAN—a new approach to low level image processing,” Int. J. Comput. Vision 23(1), 45–78 (1997). http://dx.doi.org/10.1023/A:1007963824710
26. J. Matas et al., “Robust wide-baseline stereo from maximally stable extremal regions,” Image Vision Comput. 22(10), 761–767 (2004). http://dx.doi.org/10.1016/j.imavis.2004.02.006
27. M. A. Fischler and R. C. Bolles, “Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography,” Commun. ACM 24, 381–395 (1981). http://dx.doi.org/10.1145/358669.358692
28. D. G. Lowe, “Object recognition from local scale-invariant features,” in Proc. of the Seventh IEEE Int. Conf. on Computer Vision, 1150–1157 (1999). http://dx.doi.org/10.1109/ICCV.1999.790410
29. J. M. Morel and G. Yu, “ASIFT: a new framework for fully affine invariant image comparison,” SIAM J. Imag. Sci. 2(2), 438–469 (2009). http://dx.doi.org/10.1137/080732730
30. H. Bay et al., “Speeded-up robust features (SURF),” Comput. Vision Image Understanding 110(3), 346–359 (2008). http://dx.doi.org/10.1016/j.cviu.2007.09.014
31. S. Zancajo-Blazquez et al., “An automatic image-based modelling method applied to forensic infography,” PLoS One 10, e0118719 (2015). http://dx.doi.org/10.1371/journal.pone.0118719
32. E. Honkavaara et al., “The EuroSDR project ‘Radiometric aspects of digital photogrammetric images’: results of the empirical phase,” in Int. Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences (2011).
33. M. Crespi and L. De Vendictis, “A procedure for high resolution satellite imagery quality assessment,” Sensors 9, 3289–3313 (2009). http://dx.doi.org/10.3390/s90503289
34. K. Pyka, The Use of Wavelets for Evaluation of Loss in Radiometric Quality in the Orthophoto Mosaicking Process, AGH UWND, Kraków, Poland (2005).
35. N. Haala, M. Cramer and M. Rothermel, “Quality of 3D point clouds from highly overlapping UAV imagery,” in Int. Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, 183–188 (2013).
36. N. Haala, “Comeback of digital image matching,” in Photogrammetric Week ’09, 289–301, Wichmann Verlag, Heidelberg, Germany (2009).
37. T. Rosnell and E. Honkavaara, “Point cloud generation from aerial image data acquired by a quadrocopter type micro unmanned aerial vehicle and a digital still camera,” Sensors 12(1), 453–480 (2012). http://dx.doi.org/10.3390/s120100453
38. M. Rothermel and N. Haala, “Potential of dense matching for the generation of high quality digital elevation models,” in ISPRS Hannover Workshop 2011: High-Resolution Earth Imaging for Geospatial Information (2011).
39. F. D. Javan, F. Samadzadegan and P. Reinartz, “Spatial quality assessment of pan-sharpened high resolution satellite imagery based on an automatically estimated edge based metric,” Remote Sens. 5, 6539–6559 (2013). http://dx.doi.org/10.3390/rs5126539
40. E. Honkavaara et al., “Hyperspectral reflectance signatures and point clouds for precision agriculture by light weight UAV imaging system,” in ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences (2012).
41. E. Honkavaara et al., “Processing and assessment of spectrometric, stereoscopic imagery collected using a lightweight UAV spectral camera for precision agriculture,” Remote Sens. 5, 5006–5039 (2013). http://dx.doi.org/10.3390/rs5105006
42. E. Honkavaara, Calibrating Digital Photogrammetric Airborne Imaging Systems Using a Test Field, Finnish Geodetic Institute, Helsinki University of Technology, Espoo, Finland (2008).
43. T. Kim, H. Kim and H. Kim, “Image-based estimation and validation of NIIRS for high-resolution satellite images,” in Int. Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, 1–4 (2008).
44. L. Li, H. Luo and H. Zhu, “Estimation of the image interpretability of ZY-3 sensor corrected panchromatic nadir data,” Remote Sens. 6(5), 4409–4429 (2014). http://dx.doi.org/10.3390/rs6054409
45. Q. Huynh-Thu and M. Ghanbari, “Scope of validity of PSNR in image/video quality assessment,” Electron. Lett. 44(13), 800 (2008). http://dx.doi.org/10.1049/el:20080522
46. Z. Wang et al., “Image quality assessment: from error visibility to structural similarity,” IEEE Trans. Image Process. 13, 600–612 (2004). http://dx.doi.org/10.1109/TIP.2003.819861
47. A. Bhat, I. Richardson and S. Kannangara, “A new perceptual quality metric for compressed video,” in IEEE Int. Conf. on Acoustics, Speech and Signal Processing (ICASSP ’09) (2009). http://dx.doi.org/10.1109/ICASSP.2009.4959738
48. M. Kedzierski, A. Fryśkowska and D. Wierzbicki, Photogrammetric Studies from Low Altitude, 13–14, WAT, Warsaw (2014).
49. Trimble UAS, “Trimble UX5 aerial imaging solution: vegetation monitoring frequently asked questions,” (2013). http://surveypartners.trimble.com (accessed April 2017).
50. K. Pauly, “Towards calibrated vegetation indices from UAS-derived orthomosaics,” in Proc. of the 13th Int. Conf. on Precision Agriculture (2016).
51. S. A. Genchi et al., “Structure-from-motion approach for characterization of bioerosion patterns using UAV imagery,” Sensors 15(2), 3593–3609 (2015). http://dx.doi.org/10.3390/s150203593
52. F. J. Mesas-Carrascosa et al., “Accurate ortho-mosaicked six-band multispectral UAV images as affected by mission planning for precision agriculture proposes,” Int. J. Remote Sens. 38(8–10), 2161–2176 (2016). http://dx.doi.org/10.1080/01431161.2016.1249311
53. Y. Yang, Z. Lin and F. Liu, “Stable imaging and accuracy issues of low-altitude unmanned aerial vehicle photogrammetry systems,” Remote Sens. 8(4), 316 (2016). http://dx.doi.org/10.3390/rs8040316

Biography

Damian Wierzbicki is an assistant professor at the Military University of Technology. He received his PhD in photogrammetry from the Military University of Technology in 2015. His current research interests include UAV photogrammetry and digital image processing. He is the author of several articles and papers presented at national and international conferences.

Anna Fryskowska is an assistant professor at the Military University of Technology. She received her PhD in photogrammetry from the Military University of Technology in 2013. Her current research interests include terrestrial laser scanning and UAV photogrammetry. She is the author of several articles and papers presented at national and international conferences.

Michal Kedzierski is a professor at the Military University of Technology. His research interests are UAV photogrammetry, digital image processing, and airborne and terrestrial laser scanning. He is the author of a few dozen articles and papers presented at national and international conferences.

Michalina Wojtkowska is an assistant professor at the Military University of Technology. She received her PhD in photogrammetry from the Military University of Technology in 2014. Her current research interests include terrestrial laser scanning.

Paulina Delis is an assistant professor at the Military University of Technology. She received her PhD in photogrammetry from the Military University of Technology in 2016. Her current research interests include UAV photogrammetry and digital image processing. She is the author of several articles and papers presented at national and international conferences.

© 2018 Society of Photo-Optical Instrumentation Engineers (SPIE) 1931-3195/2018/$25.00 © 2018 SPIE
Damian Wierzbicki, Anna Fryskowska, Michal Kedzierski, Michalina Wojtkowska, and Paulina Delis "Method of radiometric quality assessment of NIR images acquired with a custom sensor mounted on an unmanned aerial vehicle," Journal of Applied Remote Sensing 12(1), 015008 (27 January 2018). https://doi.org/10.1117/1.JRS.12.015008
Received: 25 July 2017; Accepted: 5 January 2018; Published: 27 January 2018
KEYWORDS: Near infrared; Image quality; Unmanned aerial vehicles; Sensors; Data acquisition; Light sources and illumination; Cameras