The advancement of computer simulation tools for high fidelity signature modeling has led to a requirement for a better understanding of the effects of light scattering from surfaces. Measurements of the Bidirectional Reflectance Distribution Function (BRDF) fully describe the angular scattering properties of materials, and these may be used in signature simulations to quantitatively characterize the optical effects of surface treatments on targets. This paper reviews the theoretical and experimental techniques for characterizing the BRDF of surfaces and examines some of the popular parameterized BRDF representations that are used in signature calculations.
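As background on what a parameterized BRDF representation looks like in practice, the sketch below evaluates a simple isotropic microfacet model in the Torrance-Sparrow style with a Beckmann slope distribution and a Schlick Fresnel term. All parameter names and values (rho_d, rho_s, m, n_idx) are illustrative assumptions for this sketch, not quantities taken from the paper.

```python
import numpy as np

def microfacet_brdf(theta_i, theta_r, phi, rho_d=0.05, rho_s=0.5, m=0.2, n_idx=1.5):
    """Illustrative isotropic microfacet BRDF (Torrance-Sparrow style), units 1/sr.

    theta_i, theta_r : incident / reflected zenith angles (rad)
    phi              : relative azimuth between incident and reflected rays (rad)
    rho_d, rho_s     : diffuse albedo and specular lobe weight (assumed values)
    m                : RMS surface slope of the Beckmann roughness distribution
    n_idx            : refractive index used in a Schlick Fresnel approximation
    """
    wi = np.array([np.sin(theta_i), 0.0, np.cos(theta_i)])            # toward source
    wr = np.array([np.sin(theta_r) * np.cos(phi),
                   np.sin(theta_r) * np.sin(phi),
                   np.cos(theta_r)])                                   # toward viewer
    h = (wi + wr) / np.linalg.norm(wi + wr)                            # microfacet normal
    cos_a = np.clip(h[2], 1e-6, 1.0)

    # Beckmann microfacet slope distribution
    tan2 = (1.0 - cos_a**2) / cos_a**2
    D = np.exp(-tan2 / m**2) / (np.pi * m**2 * cos_a**4)

    # Schlick approximation to the Fresnel reflectance
    f0 = ((n_idx - 1.0) / (n_idx + 1.0))**2
    F = f0 + (1.0 - f0) * (1.0 - np.dot(wi, h))**5

    # Torrance-Sparrow geometric attenuation (shadowing/masking)
    G = min(1.0,
            2.0 * cos_a * wi[2] / np.dot(wi, h),
            2.0 * cos_a * wr[2] / np.dot(wi, h))

    return rho_d / np.pi + rho_s * D * F * G / (4.0 * wi[2] * wr[2])

# In-plane specular geometry: 30 deg incidence, 30 deg reflection, 180 deg relative azimuth
print(microfacet_brdf(np.radians(30), np.radians(30), np.pi))
```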
Increasingly, the signature management community is demanding modeling tools for a variety of purposes, from real-time simulation to complex modeling tasks. RenderView is one of the tools that has been developed, and continues to evolve, in response to this demand.
The focus of RenderView development has been physics-based modeling of high complexity, both geometrically and with respect to surface optical properties. RenderView incorporates full bi-directional reflectance distribution function (BRDF) models together with measured and calibrated global illumination maps. These tools provide the capability to evaluate, with a very high level of fidelity, the impact of a vehicle's geometric and surface properties on its visible and thermal signature.
A description of RenderView will be presented in terms of its focus on high fidelity models of vehicles and materials, followed by a number of examples showing how the fidelity of the BRDF impacts the signature via the rendering model.
The current state of the art in synthetic radiometrically accurate scene generation for visual signatures remains immature. Even more difficult is the creation of composites in which photo-realistic synthetic imagery is placed into images of real scenes. A potential solution to this problem is to use measured background data to drive the target rendering process. This approach has the advantage of deriving synthetic images with sufficient fidelity for input into visual laboratory and performance codes. Since scene luminance can change rapidly, especially during partly cloudy conditions, all measurements must be obtained nearly simultaneously. This paper will explore the requirements for a visual predictive code and how a background measurement process can meet them. A prototype measurement system will be described along with measurement results.
A new iterative algorithm (EMLS) derived via the expectation maximization method is presented for extrapolating a non-negative object function from noisy, diffraction-blurred image data. The algorithm has several desirable attributes: it converges rapidly for high-frequency object components, is less sensitive to constraint parameters, and accommodates randomly missing data. Speed and convergence results are presented. Field test imagery was obtained with a passive millimeter wave imaging sensor having a 30.5 cm aperture. The algorithm was implemented and tested in near real time using this imagery. Theoretical and experimental results using the field test imagery will be compared using an effective aperture measure of resolution increase. The effective aperture measure, based on examination of the edge-spread function, will be detailed.
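The EMLS update itself is not given in the abstract; as orientation, the sketch below shows the basic multiplicative expectation-maximization update for nonnegative image restoration (the Richardson-Lucy form), which is the kind of iteration such algorithms build on. The Gaussian PSF, image size, and iteration count are illustrative assumptions.

```python
import numpy as np
from numpy.fft import fft2, ifft2

def em_restore(blurred, psf, n_iter=20):
    """Basic EM (Richardson-Lucy) iteration for nonnegative restoration.

    blurred : observed diffraction-blurred image (nonnegative)
    psf     : centered point spread function, same shape as the image, sums to 1
    """
    otf = fft2(np.fft.ifftshift(psf))
    conv = lambda x, H: np.real(ifft2(fft2(x) * H))
    est = np.full_like(blurred, blurred.mean())          # flat nonnegative starting guess
    for _ in range(n_iter):
        reblurred = np.maximum(conv(est, otf), 1e-12)
        est *= conv(blurred / reblurred, np.conj(otf))   # multiplicative update keeps est >= 0
    return est

# Synthetic example: a small rectangle blurred by a Gaussian PSF
n = 64
y, x = np.mgrid[-n//2:n//2, -n//2:n//2]
psf = np.exp(-(x**2 + y**2) / (2.0 * 3.0**2)); psf /= psf.sum()
truth = np.zeros((n, n)); truth[20:28, 30:34] = 1.0
blurred = np.maximum(np.real(ifft2(fft2(truth) * fft2(np.fft.ifftshift(psf)))), 0.0)
restored = em_restore(blurred, psf)
```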
The spatial resolution of undersampled or diffraction-limited images can be improved through microscanning and super-resolution technologies. The objective of this Air Force Phase II Small Business Innovation Research (SBIR) effort was to develop and demonstrate real-time or near real-time microscanning and super-resolution algorithms using passive millimeter wave imagery. A new super-resolution algorithm based on expectation-maximization was developed which is insensitive to missing data, incorporates both positivity and smoothness constraints, and converges rapidly in 15 to 20 iterations. Analysis using measured data shows that the practical resolution gain that can be expected using this algorithm is less than a factor of two. A new microscanning algorithm was developed and demonstrated that can reliably detect less than one fifth of an IFOV displacement using field test data. The combination of the super-resolution and microscanning algorithms was also demonstrated; resolution gains of four to six times can be achieved if the image is undersampled by a factor of two or three. Consequently, it makes sense to use a wide-FOV, undersampled sensor from which high spatial resolution can be obtained as needed using microscanning and super-resolution techniques.
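The microscanning displacement estimator is not specified in the abstract; the sketch below shows one generic way to detect sub-IFOV (sub-pixel) shifts between two frames by fitting the phase slope of their cross-power spectrum, which is exact for a pure translation. The function name, the low-frequency cutoff `keep`, and the 0.2-pixel test shift are illustrative assumptions.

```python
import numpy as np
from numpy.fft import fft2, ifft2

def subpixel_shift(ref, img, keep=8):
    """Estimate the (row, col) shift of `img` relative to `ref` from the phase slope
    of the cross-power spectrum; only low frequencies are used so the phase stays
    unwrapped for shifts of a few pixels."""
    n_r, n_c = ref.shape
    cross = fft2(img) * np.conj(fft2(ref))
    FY, FX = np.meshgrid(np.fft.fftfreq(n_r), np.fft.fftfreq(n_c), indexing="ij")
    mask = (np.abs(FY) <= keep / n_r) & (np.abs(FX) <= keep / n_c)
    A = -2.0 * np.pi * np.column_stack([FY[mask], FX[mask]])   # phase = -2*pi*(fy*dy + fx*dx)
    b = np.angle(cross)[mask]
    (dy, dx), *_ = np.linalg.lstsq(A, b, rcond=None)
    return dy, dx

# Example: apply a known 0.2-pixel shift in x with the Fourier shift theorem
n = 128
img0 = np.random.default_rng(0).random((n, n))
fx = np.fft.fftfreq(n)[None, :]
img1 = np.real(ifft2(fft2(img0) * np.exp(-2j * np.pi * fx * 0.2)))
print(subpixel_shift(img0, img1))          # approximately (0.0, 0.2)
```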
Achievement of ultra-high fidelity signature modeling of targets requires a significant level of complexity for all of the components in the rendering process. Specifically, the reflectance of the surface must be described using the bi-directional reflectance distribution function (BRDF). In addition, the spatial representation of the background must be high fidelity. A methodology and corresponding model for spectral and band rendering of targets using both isotropic and anisotropic BRDFs are presented. In addition, a set of tools will be described for generating theoretical anisotropic BRDFs and for reducing the data required to describe an anisotropic BRDF by five orders of magnitude. The methodology is hybrid, using a spectrally measured panorama of the background mapped to a large hemisphere. Both radiosity and ray-tracing approaches are incorporated simultaneously for a robust solution. In the thermal domain, spectral emission is also included in the solution. Rendering examples using several BRDFs will be presented.
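As a point of reference for the rendering step, the sketch below estimates reflected radiance by numerically integrating an arbitrary BRDF against a hemispherical illumination map; this is the basic quadrature underlying BRDF-based rendering, not the hybrid radiosity/ray-tracing solution described above. The grid resolution, the uniform-sky radiance, and the Lambertian test BRDF are illustrative assumptions.

```python
import numpy as np

def reflected_radiance(env_map, brdf, view_dir, n_theta=64, n_phi=128):
    """Quadrature of L_r = integral of L_i(w_i) * f_r(w_i, w_r) * cos(theta_i) dw_i
    over the illumination hemisphere, for a surface with normal +z.

    env_map(theta, phi) -> incident radiance [W m^-2 sr^-1]
    brdf(w_i, w_r)      -> BRDF value [1/sr]
    """
    thetas = (np.arange(n_theta) + 0.5) * (np.pi / 2) / n_theta
    phis = (np.arange(n_phi) + 0.5) * (2 * np.pi) / n_phi
    d_angle = (np.pi / 2 / n_theta) * (2 * np.pi / n_phi)     # d_theta * d_phi
    total = 0.0
    for th in thetas:
        for ph in phis:
            wi = np.array([np.sin(th) * np.cos(ph), np.sin(th) * np.sin(ph), np.cos(th)])
            total += env_map(th, ph) * brdf(wi, view_dir) * np.cos(th) * np.sin(th) * d_angle
    return total

# Uniform 100 W m^-2 sr^-1 sky and a Lambertian BRDF with albedo 0.3:
lambertian = lambda wi, wr: 0.3 / np.pi
print(reflected_radiance(lambda th, ph: 100.0, lambertian, np.array([0.0, 0.0, 1.0])))
# ~30 W m^-2 sr^-1, i.e. albedo * L, since the irradiance of a uniform sky is pi * L
```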
The application of advanced low observable treatments to ground vehicles has led to a requirement for a better understanding of the effects of light scattering from surfaces. Measurements of the Bidirectional Reflectance Distribution Function (BRDF) fully describe the angular scattering properties of materials, and these may be used in signature simulations to quantitatively characterize the optical effects of surface treatments on targets. This paper reviews the theoretical and experimental techniques for characterizing the BRDF of surfaces and examines some of the popular parameterized BRDF representations that are used in signature calculations.
To recover spatial information from bandlimited images using maximum likelihood (ML) and constrained least squares techniques, it is necessary that the image plane be oversampled. Specifically, oversampling allows the blur component induced by spatial integration of the signal over the finite size of the detector element(s) to be reduced. However, if oversampling in the image plane is achieved with a fixed array, the field of view (FOV) is proportionately reduced. Conversely, if the FOV is to be preserved, then proportionately more samples are required, implying the need for additional detector elements. An effective way to obtain oversampling in the image plane while preserving the FOV is to use either controlled or uncontrolled microscanning. There are a number of methods to achieve microscanning, including translation of the sensor array in the image plane and exploitation of airframe jitter. Three unique sixteen-times-Nyquist-oversampled passive millimeter wave (PMMW) images were carefully obtained: a point source, an extended source, and an M48 tank. Both ML and constrained least squares (CLS) algorithms were used for restoration of spatial information in the images. Restoration of known extended source object functions (contained in the extended source image) resulted in resolution gains of 1.47 and 3.43 using the CLS and ML methods respectively, as measured by the increase in effective aperture.
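Under simplifying assumptions (a known shift-invariant PSF and a discrete Laplacian smoothness constraint), a CLS restoration of the kind referred to above reduces to the standard frequency-domain solution sketched below; the regularization weight `lam` is an illustrative placeholder that would be tuned to the noise level.

```python
import numpy as np
from numpy.fft import fft2, ifft2

def cls_restore(blurred, psf, lam=1e-2):
    """Constrained least squares restoration: minimize ||h*x - y||^2 + lam*||c*x||^2
    with c a discrete Laplacian, solved in closed form in the Fourier domain.

    psf : centered point spread function, same shape as `blurred`, summing to 1
    """
    H = fft2(np.fft.ifftshift(psf))
    lap = np.zeros(blurred.shape)                      # circular Laplacian kernel
    lap[0, 0], lap[0, 1], lap[1, 0], lap[0, -1], lap[-1, 0] = 4.0, -1.0, -1.0, -1.0, -1.0
    C = fft2(lap)
    X = np.conj(H) * fft2(blurred) / (np.abs(H) ** 2 + lam * np.abs(C) ** 2)
    return np.real(ifft2(X))
```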
We consider the numerical solution of first kind Fredholm integral equations. Such integral equations occur in signal processing and image recovery problems, among others. For this numerical study, the kernel k(x,t) is the sinc kernel. This study compares traditional Tikhonov regularization with an extension of Tikhonov regularization that updates the solution found by the usual method. In this work, the identity, derivative, and Laplacian operators are used as regularizers, and tests were done with and without error in the image data g(x). The results indicate that the extension can provide a decrease in error of about two orders of magnitude.
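A minimal sketch of the baseline problem follows: the sinc-kernel Fredholm equation of the first kind is discretized by midpoint quadrature and solved with standard Tikhonov regularization. The grid, test object, noise level, and regularization parameter are illustrative assumptions, and the extension that updates this solution is not reproduced.

```python
import numpy as np

# Discretize g(x) = integral of k(x, t) f(t) dt with k(x, t) = sinc(x - t)
# by midpoint quadrature, then solve the ill-conditioned system with Tikhonov
# regularization: f = argmin ||K f - g||^2 + alpha^2 ||L f||^2.
n = 200
t = np.linspace(-5.0, 5.0, n)
h = t[1] - t[0]
K = np.sinc(t[:, None] - t[None, :]) * h                # numpy sinc = sin(pi x)/(pi x)

f_true = np.exp(-t**2) * (1.0 + 0.5 * np.sin(3.0 * t))  # illustrative object function
g = K @ f_true + 1e-4 * np.random.default_rng(1).standard_normal(n)  # noisy image data

alpha = 1e-3
L = np.eye(n)        # identity regularizer; a first-difference or Laplacian matrix
                     # may be substituted, as in the comparisons described above
f_tik = np.linalg.solve(K.T @ K + alpha**2 * (L.T @ L), K.T @ g)
print(np.linalg.norm(f_tik - f_true) / np.linalg.norm(f_true))   # relative error
```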
It is well known that regularization techniques are often required to obtain stable approximate solutions to ill-posed problems in imaging. Most regularization techniques require the choice of one or more parameters. In previous work, the cross-referencing method has proven to be an effective method for obtaining such approximate solutions. It is also true that varying the singular values of the regularization operator independently can provide great improvement in the quality of the regularized solution. In the present paper, we incorporate this idea as an extension to the cross-referencing method.
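The cross-referencing method itself is not reproduced here, but the idea of treating the singular components independently can be illustrated with a small SVD-based sketch; the test matrix, noise level, and filter weights are illustrative assumptions.

```python
import numpy as np

def filtered_svd_solution(K, g, weights):
    """Regularized solution assembled from the SVD of a square K (nonzero singular
    values assumed): f = sum_i w_i * (u_i^T g / s_i) * v_i. Classical Tikhonov fixes
    w_i = s_i^2 / (s_i^2 + alpha^2); here each w_i may be varied independently."""
    U, s, Vt = np.linalg.svd(K)
    return Vt.T @ (weights * (U.T @ g) / s)

# Example: an ill-conditioned test matrix and Tikhonov-style weights as a starting point
rng = np.random.default_rng(2)
K = np.linalg.qr(rng.standard_normal((50, 50)))[0] @ np.diag(np.logspace(0, -8, 50))
g = K @ np.ones(50) + 1e-6 * rng.standard_normal(50)
s = np.linalg.svd(K, compute_uv=False)
alpha = 1e-4
f = filtered_svd_solution(K, g, s**2 / (s**2 + alpha**2))
```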
Clutter metrics are important image measures for evaluating the expected performance of sensors and detection algorithms. Typically, clutter metrics attempt to measure the degree to which background objects resemble targets. That is, the more target-like objects or attributes in the background, the higher the clutter level. However, it is critically important that the characteristics of the sensor systems and the detection algorithms be included in any measure of clutter. For example, clutter to a coarse resolution sensor coupled with a pulse thresholding detection algorithm is not necessarily clutter to a second generation FLIR with a man in the loop. Using present state-of-the-art first and second order clutter metrics and respective performance studies, a new class of sensor/algorithm clutter metrics will be derived which explicitly use characteristics of the sensor and detection algorithms. A methodology will be presented for deriving sensor/algorithm dependent clutter metric coefficients and algorithms for a broad class of systems.
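For orientation, the sketch below computes the classical first-order statistical-variance clutter metric (in the Schmieder-Weathersby style), in which the scene is divided into cells on the order of twice the target dimension and the RMS of the per-cell standard deviations is reported; the sensor/algorithm-dependent extensions proposed in the paper are not reproduced, and the scene statistics and cell size are illustrative.

```python
import numpy as np

def statistical_variance_clutter(image, cell):
    """First-order clutter metric: RMS of per-cell standard deviations, with the
    cell size chosen on the order of twice the target dimension in pixels."""
    rows, cols = image.shape
    sigmas = [image[r:r + cell, c:c + cell].std()
              for r in range(0, rows - cell + 1, cell)
              for c in range(0, cols - cell + 1, cell)]
    return float(np.sqrt(np.mean(np.square(sigmas))))

# Illustrative scene: 256 x 256 calibrated image, ~10-pixel target, 20-pixel cells
scene = np.random.default_rng(3).normal(300.0, 2.0, (256, 256))
print(statistical_variance_clutter(scene, cell=20))
```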
The current status of U.S. efforts in infrared background measurement and model development is assessed. The paper draws on discussions with both DoD and contractor research personnel active in data collection, system design, and background modeling, backed by examination of a large body of existing reports and data. Future background measurement programs, as well as the directions that background modeling efforts are taking, are also discussed.
The infrared exitance of steel plates with several emissivities is modeled using PRISM 3.0 and LOWTRAN7 under sky backgrounds representative of Middle East desert conditions in the summer. LOWTRAN7 is used to calculate the downward thermal radiance of a desert haze atmosphere with multiple scattering. PRISM 3.0 incorporates the results from LOWTRAN7 into annular rings that represent the temperature gradient of the sky dome and predicts the apparent temperature of the plates in the 8 to 14 micron band. This study is part of a preliminary look at the issue of passive low observable technology for application to ground vehicles and an illustration of state-of-the-art computer-based background modeling and thermal simulation.
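The apparent-temperature prediction can be illustrated with a simple band-radiance balance: graybody emission plus reflected downwelling sky radiance, integrated over 8 to 14 microns and inverted through the Planck function. The emissivity, plate temperature, and effective sky temperature below are illustrative values rather than results from the PRISM 3.0 / LOWTRAN7 study, and atmospheric path effects are ignored.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23     # Planck, speed of light, Boltzmann (SI)

def planck(lam, T):
    """Spectral radiance [W m^-2 sr^-1 m^-1] at wavelength lam [m] and temperature T [K]."""
    return 2.0 * H * C**2 / lam**5 / (np.exp(H * C / (lam * KB * T)) - 1.0)

def band_radiance(T, lo=8e-6, hi=14e-6):
    return quad(lambda lam: planck(lam, T), lo, hi)[0]

# Illustrative case: a 320 K plate of emissivity 0.9 reflecting a 260 K effective sky
eps, T_plate, T_sky = 0.9, 320.0, 260.0
L_band = eps * band_radiance(T_plate) + (1.0 - eps) * band_radiance(T_sky)

# Apparent temperature: the blackbody temperature giving the same 8-14 um band radiance
T_apparent = brentq(lambda T: band_radiance(T) - L_band, 200.0, 400.0)
print(T_apparent)
```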
Target detection in clutter depends sensitively on the spatial structure of the clutter. In particular, it is the ratio of the target size to the clutter inhomogeneity scale that is of crucial importance. Indeed, looking for a leopard against a background of leopard skin is a difficult task. Hence, quantifying thermal clutter is essential to the development of successful detection algorithms and signature analysis. This paper describes an attempt at clutter characterization along with several applications using calibrated thermal imagery collected by the Keweenaw Research Center. The key idea is to combine spatial and intensity statistics of the clutter into one number in order to characterize intensity variations over the length scale imposed by the target. Furthermore, when properly normalized, this parameter appears independent of temporal meteorological variation, thereby constituting a background scene invariant. The measure has a basis in analysis of variance and is related to digital signal processing fundamentals. Statistical analysis of thermal images is presented with promising results.
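The specific clutter parameter is not given in the abstract; the sketch below illustrates the general idea of an ANOVA-style statistic that captures intensity variation at the target length scale and is normalized by the total scene variance so that overall gain and level changes drop out. The cell size is an illustrative assumption.

```python
import numpy as np

def normalized_target_scale_clutter(image, cell):
    """Fraction of total scene variance explained by variation between target-sized
    cells (between-cell variance / total variance); dimensionless and insensitive to
    overall gain or offset changes in the scene."""
    rows, cols = image.shape
    rows, cols = rows - rows % cell, cols - cols % cell
    img = image[:rows, :cols]
    cell_means = img.reshape(rows // cell, cell, cols // cell, cell).mean(axis=(1, 3))
    return float(cell_means.var() / img.var())
```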