Open Access
17 December 2018
Improved relative radiometric normalization method of remote sensing images for change detection
Yepei Chen, Kaimin Sun, Deren Li, Ting Bai, Wenzhuo Li
Abstract
Relative radiometric normalization (RRN) of remotely sensed images is often a preprocessing step during time series analysis and change detection. Conventional RRN methods may lessen the radiation difference of changed pixels in images during the RRN process, thus reducing the accuracy of change detection. To solve this problem, we propose a relative radiometric correction method based on wavelet transform and iteratively reweighted multivariate alteration detection (IR-MAD). A wavelet transform is applied to separate high and low frequency components of both the target image and reference image. The high frequency components remain unprocessed to preserve high frequency information. We use the IR-MAD algorithm to normalize the low frequency component of the target image. A reverse wavelet transform reconstructs the radiometrically normalized image. We compared the proposed method with traditional histogram matching, mean–variance normalization, the original IR-MAD method, and a method combining wavelet transform and low-pass filtering, and change detection was conducted to evaluate the RRN quality. The experiments show that the proposed method can not only effectively eliminate the overall radiation difference between images but also enable higher accuracy of change detection.

1.

Introduction

Divergences in the reflectance of remote sensing images for an area can indicate land cover change. Thus, existing change detection methods often determine whether change has occurred according to the radiometric differences between the images.1,2 Even a physically unchanged ground object can show different spectral values in two images acquired at different times.3 The reason is that acquisition conditions differ, such as sensor state and attitude, solar illuminance, observation angles, atmospheric scattering, and absorption.4,5 This problem presents challenges in multitemporal image processing and analysis.6 Therefore, in practical remote sensing applications, radiometric normalization is conducted to eliminate the radiometric discrepancy between images caused by acquisition conditions rather than actual changes in ground objects.7,8

Radiometric normalization can be divided into two types: absolute radiometric correction and relative radiometric normalization (RRN).9–12 Absolute radiometric correction, based on a single image, reveals the actual surface response by removing the influence of the atmosphere.5,13 However, to accurately estimate atmospheric effects, it is necessary to obtain atmospheric properties at the time of data collection, such as air temperature, relative humidity, atmospheric pressure, visibility, altitude, and elevation, which tend to be field measured or acquired through other data.13–16 By contrast, relative radiometric normalization aims at minimizing the radiometric differences caused by inconsistencies of acquisition conditions between images and is used for multitemporal images. In relative radiometric normalization, one of the images is taken as the reference image and all other images are normalized so that they become radiometrically similar to the reference image.12,17 The difficulty of collecting the necessary ancillary data, along with the lack of historical data, further reduces the opportunity to perform absolute radiometric correction on multitemporal images. Fortunately, since RRN does not require any ancillary data and is easier to achieve than absolute correction, it has been widely used to normalize images obtained at different times.18,19 In some applications such as change detection and classification, some RRN methods have been demonstrated to perform better than absolute radiometric corrections.13,18

Commonly used RRN methods fall into two types: nonlinear normalization and linear normalization.20 Nonlinear methods include histogram matching (HM).8 This approach can cause gray scale loss and a disordered overall radiation distribution since it achieves correction by matching the histogram of a target image with that of the reference image.20 Linear methods include minimum–maximum (MM),9 mean-standard (MS) deviation,9 haze correction (HC),21 image regression (IR),22,23 pseudoinvariant feature (PIF),11,24–26 dark set-bright set (DB),17 and no-change set (NC)27 techniques. Most of these methods (HM, MM, MS, HC, and IR) use all of the pixels in the estimation of normalization coefficients.28 Such methods often do not perform as well as methods using invariant pixels8,28,29 and can lead to low change detection accuracy, since the radiometric difference caused by physical ground change is normalized away. Methods using unchanged pixels, however, are time and labor consuming due to the selection of invariant pixels, and furthermore, the quality of the chosen samples directly affects the relative radiometric correction results.30,31

To control the quality of selected invariant pixels and reduce the time and labor cost, some researchers have proposed improved methods.32–34 These invariant pixel selection methods include slow feature analysis,31 Kauth–Thomas transformation,17 scattergram-controlled regression,27 temporally invariant cluster,18 principal component analysis (PCA),4 multivariate alteration detection (MAD),35 and iteratively reweighted multivariate alteration detection (IR-MAD).36 These methods can increase the quality and number of invariant pixels, as well as reduce human intervention and subjectivity. Canty et al.35 applied MAD to automatically identify the invariant pixels within multispectral images of the same area collected at two different times. Their results showed that automatically obtained invariant features generated better results than those produced by manual selection. To improve the sensitivity of MAD, Nielsen et al.36 proposed IR-MAD. Not only does this method automatically select invariant features, but it also determines an adaptive threshold through an iterative process. As a consequence, IR-MAD is an effective method to select unchanged pixels. Mateos et al.37 radiometrically normalized multitemporal remote sensing images using the IR-MAD algorithm. Canty and Nielsen38 applied IR-MAD for the normalization of LANDSAT and ASTER multitemporal images. For modeling simplicity, these RRN methods assume that the invariant pixels’ values in the target image are linearly related to those of the reference image. In practice, however, the relation typically does not follow a linear model, so the linear assumption negatively affects the normalization results. Furthermore, these methods can result in loss of high frequency details.

Frequency domain transforms, such as the Fourier,39 wavelet,40 and contourlet41 transforms, were applied in relative radiometric normalization to overcome the limitations of conventional methods. Biday and Bhosle42 used Fourier and wavelet transforms to separate high and low frequency components in images; the effectiveness was validated in comparative experiments with two other relative radiometric normalization methods. Sun et al.43 proposed an RRN method based on wavelet transform and low pass filter (WLPF) that effectively improved the change detection accuracy. Li et al.20 presented a method for relative radiometric consistency processing, based on object-oriented smoothing and contourlet transforms, concluding that the proposed method can improve the visual effects of normalized images thus increasing the accuracy of change detection.

To overcome the limitations of existing methods, we propose a relative radiometric normalization method based on wavelet transform and IR-MAD (WIRMAD). Wavelet transform is used to divide images into spatial high frequency and low frequency components. We automatically extract invariant features from the low frequency components, and then a linear regression equation is used to normalize the low frequency component of the target image to the reference image. A reverse wavelet transform is applied to reconstruct the final normalized image.

We compared the proposed method visually and empirically to the traditional HM, MS, IR-MAD, and WLPF methods in experiments with three pairs of images in China at different spatial resolutions. Furthermore, change detection was conducted on these images to evaluate the RRN quality of the proposed method. The remainder of this paper is organized as follows: Sec. 2 describes the datasets and Sec. 3 details the proposed relative radiometric normalization method based on wavelet transform and IR-MAD, followed by the experimental results and analysis in Sec. 4. Section 5 discusses the application and limitations of the proposed method, and the conclusion is given in Sec. 6.

2.

Datasets

Four pairs of bitemporal images at different spatial resolutions were employed in the experiments: one pair from Nanjing for radiometric normalization, and three pairs from Shenzhen, Wuhan, and Guangxi in China, respectively, for change detection. Table 1 summarizes the datasets used in this paper.

Table 1

Description of datasets used in this paper.

Image pair  Sensor       Date        Band           Spatial resolution (m)  Size       Location                      Application
1           SPOT5        2002, 2007  Single         10                      1600×1600  Nanjing (32.05°N, 118.61°E)   Radiometric normalization
2           Landsat5 TM  1993, 2011  Multispectral  30                      1024×1024  Shenzhen (23.38°N, 113.08°E)  Change detection
3           GF1 WFV      2013, 2017  Multispectral  16                      1600×1600  Wuhan (31.06°N, 114.67°E)     Change detection
4           GF2 PMS      2015, 2016  Multispectral  1                       3200×3200  Guangxi (24.39°N, 109.55°E)   Change detection

For each image pair, we geometrically registered the target image and reference image, with error controlled to within one pixel, before normalization and change detection. To evaluate change detection accuracy, true change maps were produced by visual interpretation; auxiliary images from Google Earth were used to ensure the accuracy of the true change maps.

3.

Methodology

In this paper, we present an RRN method combining the wavelet transform and the IR-MAD algorithm. Figure 1 illustrates the relative radiometric normalization approach used in this study in a general way; we discuss the major steps in detail in Secs. 3.1–3.3.

Fig. 1

The flow chart of the proposed method.

JARS_12_4_045018_f001.png

3.1.

Wavelet Transform

The low frequency component of a remote sensing image corresponds to the overall background radiation, while the high frequency component carries foreground targets and texture information. Therefore, global radiometric correction can be achieved by eliminating the difference between the low frequency components of the target and reference images. Keeping the high frequency components unprocessed preserves high frequency information.

Wavelet transform is one of the commonly used algorithms in frequency domain processing, developed in signal processing theory to help extract information from many different kinds of data.44 In this paper, wavelet transform is used to separate the spatially low and high frequencies.

There are different types of wavelet basis functions, whose qualities vary according to several criteria. In this study, the Haar wavelet, the simplest wavelet and one of the first studied, is used to achieve the separation of low and high frequencies. Following the analysis in Li’s study,20 we decompose the target image and reference image using a four-level wavelet transform.
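As a concrete illustration, one Haar analysis step and its four-level repetition can be written in a few lines of numpy. This is a sketch, not the authors' code; in practice a wavelet library (e.g., PyWavelets) would typically be used, and the function names here are our own.

```python
import numpy as np

def haar_step(x):
    """One Haar analysis step on a 2-D array with even dimensions:
    returns the approximation (low frequency) and three detail subbands."""
    a, b = x[0::2, 0::2], x[0::2, 1::2]
    c, d = x[1::2, 0::2], x[1::2, 1::2]
    ll = (a + b + c + d) / 4.0  # approximation: 2x2 block averages
    lh = (a + b - c - d) / 4.0  # horizontal detail
    hl = (a - b + c - d) / 4.0  # vertical detail
    hh = (a - b - c + d) / 4.0  # diagonal detail
    return ll, (lh, hl, hh)

def haar_decompose(img, levels=4):
    """Four-level separation: `low` carries the background radiation,
    `details` the texture and edge information left untouched by RRN."""
    low, details = img, []
    for _ in range(levels):
        low, det = haar_step(low)
        details.append(det)
    return low, details

band = np.random.rand(1600, 1600)       # stand-in for one image band
low, details = haar_decompose(band, 4)  # each level halves each dimension
```

After four levels, a 1600×1600 band yields a 100×100 low frequency component, which is what the IR-MAD step operates on.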

3.2.

IR-MAD Algorithm

After wavelet transforms, we used IR-MAD to select invariant pixels from the low-frequency components of target and reference images. IR-MAD is an effective method to select pixels with high no-change probability between images.3

For two n-band images acquired at times t1 and t2, represent them by two vectors F = (F1, F2, …, Fn)^T and G = (G1, G2, …, Gn)^T. Two linear combinations are constructed over all spectral bands:

Eq. (1)

U = a^T F = a1 F1 + a2 F2 + ⋯ + an Fn,

Eq. (2)

V = b^T G = b1 G1 + b2 G2 + ⋯ + bn Gn,
where U and V are called the canonical variates; n is the number of bands; a and b are constant vectors chosen to maximize the variance of U − V.

In this way, the difference image U − V shows maximum change information; its components are referred to as the MAD variates:

Eq. (3)

MAD_i = U_{n−i+1} − V_{n−i+1} = a_{n−i+1}^T F − b_{n−i+1}^T G,  i = 1, …, n.

Assuming that no ground reflectance changes have occurred between the two images of a scene, the sum of the squares of the standardized MAD variates approximately follows a chi-square distribution with n degrees of freedom (χ²(n)):

Eq. (4)

Z = ∑_{i=1}^{n} (MAD_i / σ_{MAD_i})² ∼ χ²(n),
where Z represents the sum of the squares of the standardized MAD variates and σ_{MAD_i} is the standard deviation of MAD_i.
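Equations (1)–(4) can be sketched with numpy/scipy as follows. This is a compact single-pass MAD computation (without the iterative reweighting yet); the function name and matrix conventions are our own assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.linalg import eigh

def mad_variates(F, G):
    """MAD variates (Eq. 3) and chi-square statistic Z (Eq. 4) for two
    n-band images given as (n_bands, n_pixels) matrices."""
    m = F.shape[1]
    Fc = F - F.mean(axis=1, keepdims=True)
    Gc = G - G.mean(axis=1, keepdims=True)
    S_ff = Fc @ Fc.T / (m - 1)
    S_gg = Gc @ Gc.T / (m - 1)
    S_fg = Fc @ Gc.T / (m - 1)

    # Canonical correlation analysis: solve the generalized eigenproblem
    # S_fg S_gg^-1 S_gf a = rho^2 S_ff a. eigh returns ascending order,
    # matching the MAD convention of lowest correlation (most change) first.
    M = S_fg @ np.linalg.solve(S_gg, S_fg.T)
    rho2, a = eigh(M, S_ff)
    rho = np.sqrt(np.clip(rho2, 0.0, 1.0))

    # b follows from a; scale both to unit-variance canonical variates
    b = np.linalg.solve(S_gg, S_fg.T @ a)
    a = a / np.sqrt(np.sum(a * (S_ff @ a), axis=0))
    b = b / np.sqrt(np.sum(b * (S_gg @ b), axis=0))

    mad = a.T @ Fc - b.T @ Gc                      # Eq. (3)
    sigma2 = np.maximum(2.0 * (1.0 - rho), 1e-12)  # var(MAD_i) = 2(1 - rho_i)
    Z = np.sum(mad**2 / sigma2[:, None], axis=0)   # Eq. (4)
    return mad, Z
```

For unit-variance canonical variates, var(MAD_i) = 2(1 − ρ_i), where ρ_i is the i'th canonical correlation, which is what the standardization above relies on.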

The MAD variates associated with change observations, however, will deviate more or less strongly from such a multivariate normal distribution.38 Therefore, to improve the sensitivity of the MAD transformation, Nielsen et al.36 weight observations by the probability of no change through an iteration scheme:

Eq. (5)

Pr(no change) = 1 − P_{χ²;n}(Z),
where Pr(no change) is the weight, representing the probability of no change, and P_{χ²;n}(Z) is the cumulative distribution function of the chi-square distribution with n degrees of freedom, evaluated at Z.

Iterations are continued until the largest absolute change in the canonical correlations is smaller than a preset small value (e.g., 10⁻⁶).36 For radiometric normalization purposes, we can select all pixels that satisfy Pr(no change) > t, where t is a decision threshold, typically 95%.36 The invariant pixels are then used to estimate the normalization coefficients through a regression fit and normalize the target image to the reference image.
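The weighting and selection rule of Eqs. (4) and (5) can be sketched with scipy's chi-square distribution. The `Z` array below is a synthetic stand-in for the statistic of Eq. (4), and the variable names are ours, not the authors'.

```python
import numpy as np
from scipy.stats import chi2

n_bands = 4
rng = np.random.default_rng(1)
Z = rng.chisquare(df=n_bands, size=10000)  # unchanged pixels follow chi2(n)
Z[:200] += 50.0                            # a few strongly changed pixels

weights = 1.0 - chi2.cdf(Z, df=n_bands)    # Pr(no change), Eq. (5)
invariant = weights > 0.95                 # decision threshold t = 0.95
```

In the full IR-MAD scheme these weights feed back into the covariance estimates, and the canonical correlations are recomputed until they change by less than the preset tolerance; pixels passing the threshold at convergence become the invariant samples for the regression fit.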

3.3.

Relative Radiometric Normalization

In this paper, we apply the IR-MAD algorithm to normalize the low frequency component of the target image after separation of high frequency and low frequency components by using wavelet transform. For a specific band, assume that f1 is the image to be normalized and f2 is the reference image. The specific steps of relative radiometric normalization based on wavelet transforms and IR-MAD are as follows:

  • 1. Apply wavelet transform to the target image f1 and reference image f2 to obtain the low-frequency components of the images, f1L and f2L, and the high-frequency components of the images, f1H and f2H.

  • 2. Apply IR-MAD algorithm to select the corresponding invariant pixels from the two low-frequency components f1L and f2L.

  • 3. Perform the orthogonal linear regression on the selected invariant pixels to determine the relative radiometric normalization coefficients (slope a and intercept b).

  • 4. Normalize the low frequency component f1L of the target image f1 to the low frequency component f2L of the reference image f2 through the following equation:

    Eq. (6)

    f′1L = a·f1L + b,

    where f′1L is the normalized low frequency component.

  • 5. Replace f1L with the normalized component f′1L.

  • 6. Apply the wavelet reverse transform to the low and high frequencies of f1 to obtain the radiometrically normalized result.
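The six steps above can be sketched end to end for a single band as follows. This is a minimal numpy-only illustration: the Haar analysis/synthesis pair is hand-rolled, and a simple difference-threshold rule stands in for the IR-MAD invariant-pixel selection, so `wirmad_band` and its selection rule are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def haar_fwd(x):
    """One Haar analysis step: approximation + (LH, HL, HH) details."""
    a, b = x[0::2, 0::2], x[0::2, 1::2]
    c, d = x[1::2, 0::2], x[1::2, 1::2]
    return (a+b+c+d)/4, ((a+b-c-d)/4, (a-b+c-d)/4, (a-b-c+d)/4)

def haar_inv(ll, det):
    """Exact inverse of haar_fwd."""
    lh, hl, hh = det
    out = np.empty((2 * ll.shape[0], 2 * ll.shape[1]))
    out[0::2, 0::2] = ll + lh + hl + hh
    out[0::2, 1::2] = ll + lh - hl - hh
    out[1::2, 0::2] = ll - lh + hl - hh
    out[1::2, 1::2] = ll - lh - hl + hh
    return out

def wirmad_band(f1, f2, levels=4):
    # Steps 1: decompose target and reference, keeping f1's details
    dets, f1L, f2L = [], f1, f2
    for _ in range(levels):
        f1L, d1 = haar_fwd(f1L); dets.append(d1)
        f2L, _ = haar_fwd(f2L)
    # Step 2: invariant selection (difference threshold stands in for IR-MAD)
    diff = f1L - f2L
    keep = np.abs(diff - diff.mean()) < diff.std()
    x, y = f1L[keep], f2L[keep]
    # Step 3: orthogonal regression via the principal axis of the (x, y) cloud
    _, vecs = np.linalg.eigh(np.cov(x, y))
    vx, vy = vecs[:, -1]          # eigenvector of the largest eigenvalue
    a = vy / vx
    b = y.mean() - a * x.mean()
    # Steps 4-5: normalize the target's low frequency component
    f1L = a * f1L + b
    # Step 6: reconstruct with the original high frequency details
    for d1 in reversed(dets):
        f1L = haar_inv(f1L, d1)
    return f1L
```

Because the high frequency details of the target image are carried through unchanged, only the background radiation is pulled toward the reference image, which is the core idea of the method.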

4.

Experiments and Analysis

4.1.

Relative Radiometric Normalization

To verify the proposed WIRMAD method for relative radiometric normalization, we compared it with the HM, MS, IR-MAD, and WLPF methods. Figure 2 shows the RRN results, together with the calculated radiation differences between the reference image and each result.

Fig. 2

The RRN results obtained from the proposed WIRMAD method and HM, MS, IRMAD, and WLPF methods, and the differences between the results and the original image: (a) reference image, (b) target image, (c) HM, (d) MS, (e) IR-MAD, (f) WLPF, (g) WIRMAD, (h) difference between (a) and (b), (i) difference between (a) and (c), (j) difference between (a) and (d), (k) difference between (a) and (e), (l) difference between (a) and (f), and (m) difference between (a) and (g).

JARS_12_4_045018_f002.png

Visual inspections of Figs. 2–4 show that the HM, MS, IR-MAD, WLPF, and WIRMAD methods all significantly reduce the radiation difference between the target and reference images. The overall brightness and color of the RRN results are similar to the reference image, and radiometric consistency is significantly improved compared with the target image in Figs. 2(a)–2(g). The differences between the RRN results and the reference image demonstrate that the results of WLPF and WIRMAD are more consistent with the reference image than those of the other methods, as shown in Figs. 2(h)–2(m). The result of the WIRMAD method, however, was still more consistent with the reference image, a distinction not evident through visual inspection alone.

Fig. 3

Subgraphs of the upper red box in Fig. 2: (a) reference image, (b) target image, (c) HM, (d) MS, (e) IR-MAD, (f) WLPF, and (g) WIRMAD.

JARS_12_4_045018_f003.png

Fig. 4

Subgraphs of the lower red box in Fig. 2: (a) reference image, (b) target image, (c) HM, (d) MS, (e) IR-MAD, (f) WLPF, and (g) WIRMAD.

JARS_12_4_045018_f004.png

To quantitatively evaluate and compare our WIRMAD method with the HM, MS, IR-MAD, and WLPF methods, we calculated the mean, standard deviation, and correlation of the resulting images with the target and reference images, as shown in Table 2.

As shown in Table 2, in terms of mean value, the traditional HM (103.0872), MS (102.5441), and IR-MAD (103.1381) methods yielded results closer to the reference image than WLPF (95.8877) and WIRMAD (93.6330). The standard deviation of the results derived by the WLPF (14.5668) and WIRMAD (14.4600) methods, however, is closer to that of the target image than those of the HM (19.0636), MS (19.0603), and IR-MAD (18.3545) methods, which indicates that the wavelet-based methods retain more of the texture information in the original image.

Table 2

Comparison using statistical parameters of RRN results in Fig. 2.

Image            Mean value  Standard deviation  Correlation with target image  Correlation with reference image
Target image     67.4315     13.5477             1.0000                         0.4788
Reference image  102.8431    19.0813             0.4788                         1.0000
HM               103.0872    19.0636             0.9951                         0.4788
MS               102.5441    19.0603             0.9979                         0.4799
IR-MAD           103.1381    18.3545             1.0000                         0.4788
WLPF             95.8877     14.5668             0.8218                         0.5201
WIRMAD           93.6330     14.4600             0.8070                         0.5849

In terms of correlation, compared with the HM, MS, and IR-MAD methods, the results obtained by the WLPF and WIRMAD methods show lower correlation with the target image (0.8218 and 0.8070, respectively) and higher correlation with the reference image (0.5201 and 0.5849). The proposed WIRMAD method scores highest against the reference image (0.5849), which makes it particularly advantageous in change detection applications.
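The Table 2 style statistics can be reproduced for any normalized band in a few lines of numpy; the function and argument names here are placeholders, not the authors' code.

```python
import numpy as np

def rrn_stats(result, target, reference):
    """Mean, standard deviation, and Pearson correlation of a normalized
    result against both the original target and the reference image."""
    corr_target = np.corrcoef(result.ravel(), target.ravel())[0, 1]
    corr_reference = np.corrcoef(result.ravel(), reference.ravel())[0, 1]
    return result.mean(), result.std(), corr_target, corr_reference
```

A good RRN result keeps the standard deviation close to the target image (texture preserved) while raising the correlation with the reference image.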

4.2.

Change Detection

Change detection experiments based on three pairs of bitemporal images at high and mid-high resolutions were carried out to further assess the proposed method. Three different change detection methods, both object-oriented and pixel-based, were used: change vector analysis (CVA), PCA, and the iterated conditional model based on Markov random fields (ICM-MRF),45,46 to avoid results contingent on a specific dataset or change detection method.

The RRN results of image pairs over Shenzhen are shown in Fig. 5. The pixel-based change detection results using CVA are also displayed in Fig. 5.

Fig. 5

Results of relative radiometric normalization and change detection using CVA: (a) reference image, (b) target image, (c) HM, (d) MS, (e) IR-MAD, (f) WLPF, (g) WIRMAD, (h) true change, (i) change detection of (a) and (b) using CVA, (j) change detection of (a) and (c) using CVA, (k) change detection of (a) and (d) using CVA, (l) change detection of (a) and (e) using CVA, (m) change detection of (a) and (f) using CVA and (n) change detection of (a) and (g) using CVA.

JARS_12_4_045018_f005.png

As can be seen from Fig. 5, using the same pixel-level CVA change detection method, radiometric correction yields more accurate change detection results than the raw data without normalization. The change detection results from WLPF and WIRMAD normalization were significantly improved over the conventional HM, MS, and IR-MAD methods.

The change detection results were analyzed and are shown in Table 3. Omission, false alarm, overall accuracy, and kappa parameters were calculated to evaluate the accuracy of change detection based on the RRN results derived by our WIRMAD method and the HM, MS, IR-MAD, and WLPF methods.
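These four measures follow from the binary change map and the true change map via the standard confusion-matrix definitions, which we assume match the paper's usage; the helper below is an illustrative sketch.

```python
import numpy as np

def cd_accuracy(pred, truth):
    """Omission, false alarm, overall accuracy, and kappa for a
    predicted binary change map against the true change map."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = np.sum(pred & truth)    # changed pixels correctly detected
    fp = np.sum(pred & ~truth)   # unchanged pixels flagged as change
    fn = np.sum(~pred & truth)   # changed pixels missed
    tn = np.sum(~pred & ~truth)  # unchanged pixels correctly rejected
    n = tp + fp + fn + tn
    omission = fn / (tp + fn)          # fraction of true change missed
    false_alarm = fp / (tp + fp)       # fraction of detections that are wrong
    oa = (tp + tn) / n                 # overall accuracy
    pe = ((tp + fp) * (tp + fn) + (tn + fn) * (tn + fp)) / n**2
    kappa = (oa - pe) / (1 - pe)       # chance-corrected agreement
    return omission, false_alarm, oa, kappa
```

Lower omission and false alarm with higher overall accuracy and kappa indicate better change detection, which is how the tables below should be read.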

Table 3

Evaluation of change detection accuracy.

Change detection method  RRN method  Omission  False alarm  Overall accuracy  Kappa
CVA                      Original    0.6370    0.7565       0.7196            0.1251
                         HM          0.5822    0.6546       0.7817            0.2472
                         MS          0.5825    0.6234       0.7976            0.2748
                         IR-MAD      0.6378    0.6254       0.8026            0.2514
                         WLPF        0.5990    0.5716       0.8198            0.3079
                         WIRMAD      0.5978    0.4566       0.8513            0.3782
PCA                      Original    0.6238    0.7695       0.7014            0.1107
                         HM          0.5848    0.6594       0.7793            0.2418
                         MS          0.5883    0.6314       0.7944            0.2658
                         IR-MAD      0.6407    0.6275       0.8020            0.2486
                         WLPF        0.6035    0.5760       0.8185            0.3027
                         WIRMAD      0.5730    0.4886       0.8441            0.3750
ICM-MRF                  Original    0.6221    0.7574       0.7137            0.1265
                         HM          0.5472    0.6556       0.7761            0.2571
                         MS          0.5445    0.6277       0.7915            0.2847
                         IR-MAD      0.6063    0.6185       0.8023            0.2697
                         WLPF        0.5575    0.5762       0.8158            0.3231
                         WIRMAD      0.5483    0.4669       0.8500            0.4019

From Table 3, we can see that the change detection results show the same trend regardless of the detection method used. Change detection accuracy, however, was significantly increased with RRN compared to the original image without normalization: the omission and false alarm rates decline, while the overall accuracy and kappa coefficients rise. It can be concluded that RRN is crucial for change detection.

After RRN using WLPF and our WIRMAD method, the change detection results are similar and effectively improved over those derived by conventional HM, MS, and IR-MAD methods. These results suggest that dividing low and high frequencies using wavelet transform can improve the change detection accuracy.

Among the five RRN methods, our WIRMAD method derives change detection results with the lowest omissions and false alarms, and the highest overall accuracy and kappa coefficient, indicating that the proposed WIRMAD method can avoid reducing information about change and thus increase the accuracy of change detection.

Pixel-based change detection experiments were conducted to assess the results of RRN of images over Wuhan, China. The RRN results using the proposed WIRMAD method and conventional methods are shown in Fig. 6, as well as the change detection results using PCA derived from these RRN results.

Fig. 6

Results of relative radiometric normalization and change detection using PCA: (a) reference image, (b) target image, (c) HM, (d) MS, (e) IR-MAD, (f) WLPF, (g) WIRMAD, (h) true change, (i) change detection of (a) and (b) using PCA, (j) change detection of (a) and (c) using PCA, (k) change detection of (a) and (d) using PCA, (l) change detection of (a) and (e) using PCA, (m) change detection of (a) and (f) using PCA and (n) change detection of (a) and (g) using PCA.

JARS_12_4_045018_f006.png

As can be seen from Fig. 6, change detection on GF-1 WFV images with RRN produces more accurate results than those obtained on the original image without normalization, especially in terms of false negatives [Figs. 6(i)–6(n)]. The change detection results derived from RRN using the proposed WIRMAD and WLPF methods are closer to the true change map than results obtained after RRN using the conventional HM, MS, and IR-MAD methods.

Table 4 shows the evaluation of change detection omission, false alarm, overall accuracy, and kappa parameter results for the two GF-1 WFV images.

Table 4

Evaluation of change detection accuracy.

Change detection method  RRN method  Omission  False alarm  Overall accuracy  Kappa
CVA                      Original    0.3531    0.9266       0.6055            0.0531
                         HM          0.3962    0.7129       0.9123            0.3482
                         MS          0.3867    0.7058       0.9140            0.3574
                         IR-MAD      0.4700    0.6737       0.9276            0.3676
                         WLPF        0.4249    0.6712       0.9260            0.3820
                         WIRMAD      0.4184    0.6838       0.9224            0.3720
PCA                      Original    0.4198    0.9403       0.5579            0.0266
                         HM          0.5217    0.6578       0.9333            0.3646
                         MS          0.4803    0.6318       0.9365            0.3984
                         IR-MAD      0.5005    0.6261       0.9381            0.3957
                         WLPF        0.4391    0.6189       0.9375            0.4220
                         WIRMAD      0.4528    0.6098       0.9395            0.4244
ICM-MRF                  Original    0.3396    0.9282       0.5892            0.0502
                         HM          0.3485    0.7109       0.9097            0.3594
                         MS          0.3175    0.7301       0.8998            0.3433
                         IR-MAD      0.3276    0.7379       0.8972            0.3327
                         WLPF        0.4039    0.7044       0.9144            0.3553
                         WIRMAD      0.3536    0.6982       0.9156            0.3718

As shown in Table 4, change detection results after RRN have a much lower false alarm rate and higher overall accuracy than raw images without radiometric correction. Although the omission rate is slightly higher, RRN before change detection significantly improves the results. Change detection with RRN using the proposed WIRMAD method and the WLPF method produced more accurate results than using the HM, MS, and IR-MAD methods to normalize images. The WIRMAD method, in particular, had the highest overall accuracy (0.9395, 0.9156) and kappa coefficient (0.4244, 0.3718) when detecting change using PCA and ICM-MRF.

Object-oriented change detection methods, including CVA, PCA, and ICM-MRF, were applied to assess the RRN results of images over Guangxi, China. Figure 7 displays the RRN results and the ICM-MRF change detection results.

Fig. 7

Results of relative radiometric normalization and change detection using ICM-MRF: (a) reference image, (b) target image, (c) HM, (d) MS, (e) IR-MAD, (f) WLPF, (g) WIRMAD, (h) true change, (i) change detection of (a) and (b) using ICM-MRF, (j) change detection of (a) and (c) using ICM-MRF, (k) change detection of (a) and (d) using ICM-MRF, (l) change detection of (a) and (e) using ICM-MRF, (m) change detection of (a) and (f) using ICM-MRF and (n) change detection of (a) and (g) using ICM-MRF.

JARS_12_4_045018_f007.png

The change detection results derived from RRN using the proposed WIRMAD method and the WLPF method are closer to the true change map than results obtained after RRN using the conventional HM, MS, and IR-MAD methods, as can be inferred from Fig. 7, since the omission rate is significantly lower.

The change detection results for the GF-2 PMS images were analyzed and are displayed in Table 5. We calculated omission, false alarm, overall accuracy, and kappa parameters to evaluate the accuracy of change detection after RRN by our WIRMAD method and the HM, MS, IR-MAD, and WLPF methods.

Table 5

Evaluation of change detection accuracy.

Change detection method  RRN method  Omission  False alarm  Overall accuracy  Kappa
CVA                      Original    0.7368    0.5536       0.7099            0.1614
                         HM          0.7389    0.3843       0.7539            0.2438
                         MS          0.6325    0.4867       0.7371            0.2637
                         IR-MAD      0.6556    0.5767       0.6931            0.1787
                         WLPF        0.4673    0.2983       0.8141            0.4870
                         WIRMAD      0.3673    0.3550       0.8082            0.5083
PCA                      Original    0.7626    0.5768       0.7036            0.1344
                         HM          0.7524    0.3817       0.7530            0.2341
                         MS          0.6407    0.4946       0.7341            0.2538
                         IR-MAD      0.6667    0.5796       0.6927            0.1720
                         WLPF        0.5166    0.2973       0.8068            0.4534
                         WIRMAD      0.3792    0.3560       0.8064            0.5009
ICM-MRF                  Original    0.7162    0.5925       0.6920            0.1425
                         HM          0.7785    0.3476       0.7554            0.2232
                         MS          0.6724    0.4590       0.7453            0.2581
                         IR-MAD      0.7727    0.5313       0.7189            0.1557
                         WLPF        0.5035    0.2495       0.8209            0.4885
                         WIRMAD      0.3887    0.3070       0.8233            0.5321

As shown in Table 5, the omission, false alarm, overall accuracy, and kappa values for the change detection results using the three different methods (CVA, PCA, and ICM-MRF) show the same trend. After RRN, the omission and false alarm rates decreased and the overall accuracy and kappa coefficient increased, with the omission and kappa values improving most significantly. This indicates that the radiation difference between multitemporal images must be reduced through RRN before change detection. Our WIRMAD method produces change detection results with the highest overall accuracy and kappa and the lowest omission rate, suggesting that the separation of low and high frequencies of images contributes to increased change detection accuracy.

5.

Discussion

In this paper, we propose an RRN method based on wavelet transform and the IR-MAD algorithm. Wavelet transform separates the high and low frequency components while IR-MAD radiometrically normalizes the low frequency components of a target image.

IR-MAD is a linear relative radiometric normalization method. However, even if the multitemporal images are very similar, a perfectly linear relationship between images is impossible.28 Extracting low frequency components of images by wavelet transform eliminates the effects of nonlinear factors, such as texture and small changes in ground objects,42 exposing a stronger linear correlation and allowing IR-MAD to perform at its best.

The IR-MAD algorithm extracts invariant pixels from the low frequency components, linearly correcting the low frequencies of the target image to those of the reference image. This preserves the radiometric difference of changed objects in the low frequencies, thereby improving change detection results.8

The RRN results obtained by the proposed WIRMAD method were quite similar to those derived by the WLPF method and more consistent with the reference image than conventional RRN methods. The change detection results derived from RRN using the proposed WIRMAD were the closest to the true change, with the highest overall accuracy and kappa coefficient and the lowest omission and false alarm rates among the tested methods. Furthermore, it can be applied to change detection at either the pixel or object level, regardless of the detection method. Beyond change detection, it can also be applied to image dodging when mosaicking overlapping images,29 to gap filling and bad-line removal based on a reference image, and to time series analysis.6,19

However, the proposed method has limitations. It is not suitable for multitemporal images with a highly nonlinear relationship, since the method linearly normalizes the low frequency component of the target image. In addition, if changed ground objects occupy a large proportion of the image, the normalized result will differ visually from the reference image.

6.

Conclusions

Multitemporal images exhibit radiation differences due to sensor and atmospheric conditions, even over the same area, creating challenges in multitemporal image processing and analysis. Conventional RRN methods, however, often reduce the differences caused by actual changes of ground objects in the process of normalization. This negatively affects change detection results and time series analysis. To solve this problem, we propose an RRN method based on the wavelet transform and the IR-MAD algorithm. The wavelet transform is applied to separate the high frequency and low frequency components of both the target and reference images. We use the IR-MAD algorithm to normalize the low frequency component of the target image. A reverse wavelet transform is conducted to reconstruct the radiometrically normalized image.

Experimental results show that our WIRMAD method not only achieves radiometric consistency between the target and reference images but also improves the accuracy of change detection. The WIRMAD method applies the wavelet transform to preserve high frequency information. In addition, the low frequency component of the target image is normalized using unchanged pixels selected by the IR-MAD algorithm, thereby improving change detection accuracy and making the method more suitable than other RRN methods for change detection at the pixel or object level.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (No. 41471354) and the National Key Research and Development Program of China (No. 2016YFB0502602). We would like to thank the China Center for Resources Satellite Data and Application for providing the GF-1 image data, and the USGS for providing the Landsat-5 image data. We greatly appreciate the reviewers’ valuable comments and efforts, which helped improve this manuscript. The authors declare no conflict of interest.

References

1. P. Coppin et al., "Review article: digital change detection methods in ecosystem monitoring: a review," Int. J. Remote Sens., 25(9), 1565–1596 (2004). https://doi.org/10.1080/0143116031000101675

2. P. M. Teillet, K. Staenz and D. J. Williams, "Effects of spectral, spatial, and radiometric characteristics on remote sensing vegetation indices of forested regions," Remote Sens. Environ., 61(1), 139–149 (1997). https://doi.org/10.1016/S0034-4257(96)00248-9

3. S. Tuominen and A. Pekkarinen, "Local radiometric correction of digital aerial photographs for multisource forest inventory," Remote Sens. Environ., 89(1), 72–82 (2004). https://doi.org/10.1016/j.rse.2003.10.005

4. Y. Du, P. M. Teillet and J. Cihlar, "Radiometric normalization of multitemporal high-resolution satellite images with quality control for land cover change detection," Remote Sens. Environ., 82(1), 123–134 (2002). https://doi.org/10.1016/S0034-4257(02)00029-9

5. P. M. Teillet, "Image correction for radiometric effects in remote sensing," Int. J. Remote Sens., 7(12), 1637–1651 (1986). https://doi.org/10.1080/01431168608948958

6. S. Vicenteserrano, F. Perezcabello and T. Lasanta, "Assessment of radiometric correction techniques in analyzing vegetation variability and change using time series of Landsat images," Remote Sens. Environ., 112(10), 3916–3934 (2008). https://doi.org/10.1016/j.rse.2008.06.011

7. L. Paolini et al., "Radiometric correction effects in Landsat multi-date/multi-sensor change detection studies," Int. J. Remote Sens., 27(4), 685–704 (2006). https://doi.org/10.1080/01431160500183057

8. X. Yang and C. P. Lo, "Relative radiometric normalization performance for change detection from multi-date satellite images," Photogramm. Eng. Remote Sens., 66(8), 967–980 (2000).

9. Y. Ding and C. D. Elvidge, "Comparison of relative radiometric normalization techniques," ISPRS J. Photogramm. Remote Sens., 51(3), 117–126 (1996). https://doi.org/10.1016/0924-2716(96)00018-4

10. M. M. Rahman et al., "A comparison of four relative radiometric normalization (RRN) techniques for mosaicing H-res multi-temporal thermal infrared (TIR) flight-lines of a complex urban scene," ISPRS J. Photogramm. Remote Sens., 106, 82–94 (2015). https://doi.org/10.1016/j.isprsjprs.2015.05.002

11. A. N. Bao et al., "Comparison of relative radiometric normalization methods using pseudo-invariant features for change detection studies in rural and urban landscapes," J. Appl. Remote Sens., 6(10), 063578 (2012). https://doi.org/10.1117/1.JRS.6.063578

12. Q. Xu, Z. Hou and T. Tokola, "Relative radiometric correction of multi-temporal ALOS AVNIR-2 data for the estimation of forest attributes," ISPRS J. Photogramm. Remote Sens., 68, 69–78 (2012). https://doi.org/10.1016/j.isprsjprs.2011.12.008

13. C. Song et al., "Classification and change detection using Landsat TM data—when and how to correct atmospheric effects," Remote Sens. Environ., 75(2), 230–244 (2001). https://doi.org/10.1016/S0034-4257(00)00169-3

14. Y. Chen et al., "Radiometric cross-calibration of GF-4 PMS sensor based on assimilation of Landsat-8 OLI images," Remote Sens., 9(8), 811 (2017). https://doi.org/10.3390/rs9080811

15. J.-C. Padró et al., "Radiometric correction of simultaneously acquired Landsat-7/Landsat-8 and Sentinel-2A imagery using pseudoinvariant areas (PIA): contributing to the Landsat time series legacy," Remote Sens., 9(12), 1319 (2017). https://doi.org/10.3390/rs9121319

16. J. Zhou et al., "Atmospheric correction of PROBA/CHRIS data in an urban environment," Int. J. Remote Sens., 32(9), 2591–2604 (2011). https://doi.org/10.1080/01431161003698443

17. F. G. Hall et al., "Radiometric rectification: toward a common radiometric response among multidate, multisensor images," Remote Sens. Environ., 35(1), 11–27 (1991). https://doi.org/10.1016/0034-4257(91)90062-B

18. X. Chen, L. Vierling and D. Deering, "A simple and effective radiometric correction method to improve landscape change detection across sensors and across time," Remote Sens. Environ., 98(1), 63–79 (2005). https://doi.org/10.1016/j.rse.2005.05.021

19. T. A. Schroeder et al., "Radiometric correction of multi-temporal Landsat data for characterization of early successional forest patterns in western Oregon," Remote Sens. Environ., 103(1), 16–26 (2006). https://doi.org/10.1016/j.rse.2006.03.008

20. W. Li, K. Sun and H. Zhang, "Algorithm for relative radiometric consistency process of remote sensing images based on object-oriented smoothing and contourlet transforms," J. Appl. Remote Sens., 8(1), 083607 (2014). https://doi.org/10.1117/1.JRS.8.083607

21. P. S. Chavez Jr., "An improved dark-object subtraction technique for atmospheric scattering correction of multispectral data," Remote Sens. Environ., 24(3), 459–479 (1988). https://doi.org/10.1016/0034-4257(88)90019-3

22. M. M. Rahman et al., "An assessment of polynomial regression techniques for the relative radiometric normalization (RRN) of high-resolution multi-temporal airborne thermal infrared (TIR) imagery," Remote Sens., 6(12), 11810–11828 (2014). https://doi.org/10.3390/rs61211810

23. H. Olsson, "Regression functions for multitemporal relative calibration of thematic mapper data over boreal forest," Remote Sens. Environ., 46(1), 89–102 (1993). https://doi.org/10.1016/0034-4257(93)90034-U

24. J. R. Schott, C. Salvaggio and W. J. Volchok, "Radiometric scene normalization using pseudoinvariant features," Remote Sens. Environ., 26(1), 1–16 (1988). https://doi.org/10.1016/0034-4257(88)90116-2

25. H. Zhou et al., "A new model for the automatic relative radiometric normalization of multiple images with pseudo-invariant features," Int. J. Remote Sens., 37(19), 4554–4573 (2016). https://doi.org/10.1080/01431161.2016.1213922

26. D. G. Hadjimitsis, C. R. I. Clayton and A. Retalis, "The use of selected pseudo-invariant targets for the application of atmospheric correction in multi-temporal studies using satellite remotely sensed imagery," Int. J. Appl. Earth Obs. Geoinf., 11(3), 192–200 (2009). https://doi.org/10.1016/j.jag.2009.01.005

27. C. D. Elvidge et al., "Relative radiometric normalization of Landsat multispectral scanner (MSS) data using an automatic scattergram-controlled regression," Photogramm. Eng. Remote Sens., 61(10), 1255–1260 (1995).

28. V. Sadeghi, H. Ebadi and F. F. Ahmadi, "A new model for automatic normalization of multitemporal satellite images using artificial neural network and mathematical methods," Appl. Math. Modell., 37(9), 6437–6445 (2013). https://doi.org/10.1016/j.apm.2013.01.006

29. I. Olthof et al., "Landsat-7 ETM+ radiometric normalization comparison for northern mapping applications," Remote Sens. Environ., 95(3), 388–398 (2005). https://doi.org/10.1016/j.rse.2004.06.024

30. H. U. Changmiao et al., "Landsat TM/ETM+ and HJ-1A/B CCD data automatic relative radiometric normalization and accuracy verification," J. Remote Sens., 18(2), 267–286 (2014). https://doi.org/10.11834/jrs.20143225

31. L. Zhang, C. Wu and B. Du, "Automatic radiometric normalization for multitemporal remote sensing imagery with iterative slow feature analysis," IEEE Trans. Geosci. Remote Sens., 52(10), 6141–6155 (2014). https://doi.org/10.1109/TGRS.2013.2295263

32. D. P. Roy et al., "Multi-temporal MODIS-Landsat data fusion for relative radiometric normalization, gap filling, and prediction of Landsat data," Remote Sens. Environ., 112(6), 3112–3130 (2008). https://doi.org/10.1016/j.rse.2008.03.009

33. A. Langner et al., "Spectral normalization of SPOT 4 data to adjust for changing leaf phenology within seasonal forests in Cambodia," Remote Sens. Environ., 143(5), 122–130 (2014). https://doi.org/10.1016/j.rse.2013.12.012

34. M. C. Hansen et al., "A method for integrating MODIS and Landsat data for systematic monitoring of forest cover and change in the Congo basin," Remote Sens. Environ., 112(5), 2495–2513 (2008). https://doi.org/10.1016/j.rse.2007.11.012

35. M. J. Canty, A. A. Nielsen and M. Schmidt, "Automatic radiometric normalization of multitemporal satellite imagery," Remote Sens. Environ., 91(3–4), 441–451 (2004). https://doi.org/10.1016/j.rse.2003.10.024

36. A. A. Nielsen, "The regularized iteratively reweighted MAD method for change detection in multi- and hyperspectral data," IEEE Trans. Image Process., 16(2), 463–478 (2007). https://doi.org/10.1109/TIP.2006.888195

37. C. J. B. Mateos et al., "Relative radiometric normalization of multitemporal images," Int. J. Interact. Multimedia Artif. Intell., 1(3), 53–58 (2010). https://doi.org/10.9781/ijimai.2010.139

38. M. J. Canty and A. A. Nielsen, "Automatic radiometric normalization of multitemporal satellite imagery with the iteratively re-weighted MAD transformation," Remote Sens. Environ., 112(3), 1025–1036 (2008). https://doi.org/10.1016/j.rse.2007.07.013

39. M. Vetterli and C. Herley, "Wavelets and filter banks: theory and design," IEEE Trans. Signal Process., 40(9), 2207–2232 (1992). https://doi.org/10.1109/78.157221

40. A. R. Tee et al., "Haze detection and removal in high resolution satellite image with wavelet analysis," IEEE Trans. Geosci. Remote Sens., 40(1), 210–217 (2002). https://doi.org/10.1109/36.981363

41. R. H. Bamberger and M. J. T. Smith, "A filter bank for the directional decomposition of images: theory and design," IEEE Trans. Signal Process., 40(4), 882–893 (1992). https://doi.org/10.1109/78.127960

42. S. G. Biday and U. Bhosle, "Relative radiometric correction of multitemporal satellite imagery using Fourier and wavelet transform," J. Indian Soc. Remote Sens., 40(2), 201–213 (2012). https://doi.org/10.1007/s12524-011-0155-6

43. K. Sun et al., "A new relative radiometric consistency processing method for change detection based on wavelet transform and a low-pass filter," Sci. China Technol. Sci., 53(S1), 7–14 (2010). https://doi.org/10.1007/s11431-010-3197-z

44. S. G. Chang, B. Yu and M. Vetterli, "Adaptive wavelet thresholding for image denoising and compression," IEEE Trans. Image Process., 9(9), 1532–1546 (2000). https://doi.org/10.1109/83.862633

45. A. Singh, "Review article digital change detection techniques using remotely-sensed data," Int. J. Remote Sens., 10(6), 989–1003 (1989). https://doi.org/10.1080/01431168908903939

46. I. Molina et al., "Evaluation of a change detection methodology by means of binary thresholding algorithms and informational fusion processes," Sensors, 12(3), 3528–3561 (2012). https://doi.org/10.3390/s120303528

Biography

Yepei Chen received her BS degree in GIS from Hubei University, Wuhan, China, in 2015. She is currently pursuing her MS and PhD degrees in photogrammetry and remote sensing at the State Key Laboratory of Information Engineering in Surveying, Mapping, and Remote Sensing, Wuhan University. Her research interests include radiometric normalization, change detection, and time series analysis.

Kaimin Sun received his BS, MS, and PhD degrees in photogrammetry and remote sensing from Wuhan University, Wuhan, China, in 1999, 2004, and 2008, respectively. He is currently an associate professor in the State Key Laboratory of Information Engineering in Surveying, Mapping, and Remote Sensing, Wuhan University. His research interests include photogrammetry, object-oriented image analysis, and image change detection.

Deren Li received his PhD in photogrammetry and remote sensing from the University of Stuttgart, Stuttgart, Germany, in 1984. Currently, he is a PhD supervisor with the State Key Laboratory of Information Engineering in Mapping, Surveying, and Remote Sensing, Wuhan University, China. He is also an academician of the Chinese Academy of Sciences, the Chinese Academy of Engineering, and the Euro-Asia International Academy of Sciences. His research interests are spatial information science and technology represented by RS, GPS, and GIS.

Ting Bai received her BS degree in GIS from Huazhong Agricultural University, Wuhan, China, in 2014. She is currently pursuing her MS and PhD degrees in photogrammetry and remote sensing at the State Key Laboratory of Information Engineering in Surveying, Mapping, and Remote Sensing, Wuhan University. Her current research interests include remote sensing and feature fusion, machine learning, ensemble learning, and land use and land cover change analysis of long time series.

Wenzhuo Li received his BS and MS degrees in photogrammetry and remote sensing from Wuhan University, Wuhan, China, in 2011 and 2013, respectively. He is currently pursuing his PhD in photogrammetry and remote sensing with the School of Remote Sensing and Information Engineering, Wuhan University. His current research interests include image segmentation, image classification, land use and land cover changes detection, and object-oriented image analysis.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Yepei Chen, Kaimin Sun, Deren Li, Ting Bai, and Wenzhuo Li "Improved relative radiometric normalization method of remote sensing images for change detection," Journal of Applied Remote Sensing 12(4), 045018 (17 December 2018). https://doi.org/10.1117/1.JRS.12.045018
Received: 11 August 2018; Accepted: 22 November 2018; Published: 17 December 2018