KEYWORDS: Field programmable gate arrays, Image processing, Digital signal processing, Logic, Algorithm development, Distortion, Clocks, Commercial off the shelf technology, Mathematics, Image sensors
Recent advances in Field-Programmable Gate Arrays (FPGAs) and innovations in firmware design have allowed more
complex image processing algorithms to be implemented entirely within the FPGA devices while substantially
improving performance and reducing development time. Firmware innovations include a unique memory buffer
architecture and the use of floating-point math. The design discussed takes advantage of these advances and innovations
to implement a geometric transformation algorithm with bilinear interpolation for applications such as distortion
correction. The firmware and hardware developed in this effort support image sizes of up to 1024x1024 pixels at 200 Hz
and pixel rates of 216 MHz with versions available that support oversized input images.
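The geometric transformation with bilinear interpolation described above can be sketched in software terms. Below is a minimal NumPy illustration of bilinear resampling under a per-pixel coordinate map; the function name and the half-pixel-shift example are illustrative only, not details of the FPGA firmware design.

```python
import numpy as np

def bilinear_warp(img, map_x, map_y):
    """Resample img at fractional coordinates (map_y, map_x) for each output
    pixel, using bilinear interpolation; samples are clamped to the image edge."""
    h, w = img.shape
    x0 = np.clip(np.floor(map_x).astype(int), 0, w - 2)
    y0 = np.clip(np.floor(map_y).astype(int), 0, h - 2)
    fx = np.clip(map_x - x0, 0.0, 1.0)   # fractional weights in x
    fy = np.clip(map_y - y0, 0.0, 1.0)   # fractional weights in y
    top = img[y0, x0] * (1 - fx) + img[y0, x0 + 1] * fx
    bot = img[y0 + 1, x0] * (1 - fx) + img[y0 + 1, x0 + 1] * fx
    return top * (1 - fy) + bot * fy

# Example: shift a small test image by half a pixel in x.
img = np.arange(16, dtype=float).reshape(4, 4)
yy, xx = np.mgrid[0:4, 0:4].astype(float)
out = bilinear_warp(img, xx + 0.5, yy)
```

In a distortion-correction application, `map_x` and `map_y` would come from the inverse of the measured distortion model rather than a constant shift.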
The Kinetic Kill Vehicle Hardware-in-the-Loop Vacuum Cold Chamber (KVACC) has been a work in progress since its initial delivery in 1995. Originally delivered as a basic cryogenic test chamber with little real-world capability, it has evolved over the years into a valuable test asset incorporating many leading-edge test technologies. KVACC is now the centerpiece of the cryogenic complex scene test capability within the Air Force Research Laboratory (AFRL). The purpose of this paper is to describe the capabilities of KVACC as they have evolved since its initial delivery.
In seekers that never resolve targets spatially, it may be adequate to calibrate only with sources that have known aperture irradiance. In modern missile interceptors, the target becomes spatially resolved at close range, and the seeker's ability to accurately measure the radiance at different positions in the scene is also important. Thus, it is necessary to calibrate the seekers with extended sources of known radiance. The aperture irradiance is given by the radiance integrated over the angular extent of the target in the scene. Thus, accurate radiance calibration, combined with spatially accurate presentation of the targets, produces accurate irradiances. The accuracy of the scene radiance is also important in generating synthetic imagery for testing seeker conceptual designs and seeker algorithms, and for hardware-in-the-loop testing with imaging projection systems. The routine procedure at the Air Force Research Laboratory Munitions Directorate (AFRL/MNGG) is to model and project the detailed spatial and radiometric content of the scenes. Hence, accurate depiction of the radiance in the scene is important. AFRL/MNGG calibrates the complete projection system (synthetic image generator and scene projector) with extended sources of known radiance, not unresolved sources of known irradiance. This paper demonstrates that accurate radiance calibrations and accurate spatial rendering do provide accurate aperture irradiances in the projection systems. In recent tests conducted by AFRL/MNGG, the projection system was calibrated in terms of radiance, and the aperture irradiances were determined both as observed in the synthetic images that drove the projection system and in the images of the projection system measured by the unit under test. The aperture irradiances were compared with the known truth data, and errors were determined. This paper presents the results of analyzing the errors associated with the observed aperture irradiances.
The KHILS Vacuum Cold Chamber (KVACC) has formed the basis for a comprehensive test capability for newly developed dual-band infrared sensors. Since initial delivery in 1995, the KVACC chamber and its support systems have undergone a number of upgrades, maturing into a valuable test asset and technology demonstrator for missile defense systems. Many leading edge test technologies have been consolidated during the past several years, demonstrating the level of fidelity achievable in tomorrow's missile test facilities. These technologies include resistive array scene projectors, sub-pixel non-linear spatial calibration and coupled two-dimensional radiometric calibration techniques, re-configurable FPGA based calibration electronics, dual-band beam-combination and collimation optics, a closed-cycle multi-chamber cryo-vacuum environment, personal computer (PC) based scene generation systems and a surrounding class-1000 clean room environment. The purpose of this paper is to describe this unique combination of technologies and the capability it represents to the hardware-in-the-loop community.
One proven technique for nonuniformity correction (NUC) of a resistor array infrared scene projector requires careful measurement of the output-versus-input response for every emitter in a large array. In previous papers, we have discussed methods and results for accomplishing the projector NUC. Two difficulties that may limit the NUC results are residual nonuniformity in the calibration sensor, and nonlinearity in the calibration sensor's response to scene radiance. These effects introduce errors in the measurement of the projector elements' output, which lead to residual nonuniformity. In this paper we describe a recent effort to mitigate both of these problems using a procedure that combines sensor nonuniformity correction and sensor calibration, detector by detector, so that these problems do not contaminate the projector NUC. By measuring a set of blackbody flood-field images at a dozen or so different temperatures, the individual detector output-versus-input radiance responses can be measured. Similar to the projector NUC, we use a curve-fitting routine to model the response of each detector. Using this set of response curves, a post-processing algorithm is used to correct and calibrate the images measured by the sensor. We have used this approach to reduce several sensor error sources by a factor of 10 to 100. The resulting processing is used to correct and calibrate all of the sensor images used to perform the projector NUC, as one step in the projector NUC. The procedure appears to be useful for any application where sensor nonuniformity or response nonlinearities are significant.
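The detector-by-detector correction and calibration step can be sketched as follows: fit a per-detector polynomial response from flood-field images at known radiances, then numerically invert each response to map measured counts back to scene radiance. This is a minimal NumPy illustration with synthetic linear-response data; the function names, the quadratic fit order, and the 512-point inversion grid are assumptions, not details from the paper.

```python
import numpy as np

def fit_detector_responses(radiances, flood_stack, deg=2):
    """Fit a per-detector polynomial (counts versus radiance) from a stack of
    blackbody flood-field images at known radiance levels.
    flood_stack has shape (n_levels, n_rows, n_cols)."""
    n, h, w = flood_stack.shape
    counts = flood_stack.reshape(n, -1)          # (n_levels, n_detectors)
    coeffs = np.polyfit(radiances, counts, deg)  # one fit column per detector
    return coeffs.reshape(deg + 1, h, w)

def calibrate_image(image, coeffs, radiances):
    """Numerically invert each detector's fitted response to map raw counts
    back to radiance, correcting nonuniformity and nonlinearity together."""
    grid = np.linspace(radiances.min(), radiances.max(), 512)
    out = np.empty_like(image, dtype=float)
    it = np.nditer(image, flags=['multi_index'])
    for px in it:
        i, j = it.multi_index
        curve = np.polyval(coeffs[:, i, j], grid)  # counts as fn of radiance
        out[i, j] = np.interp(px, curve, grid)     # counts -> radiance
    return out

# Synthetic demo: a 2x2 "sensor" with distinct per-detector gains and offsets.
rads = np.linspace(1.0, 10.0, 12)
gains = np.array([[1.0, 1.5], [2.0, 0.8]])
offs = np.array([[5.0, -2.0], [0.0, 3.0]])
stack = gains[None] * rads[:, None, None] + offs[None]
c = fit_detector_responses(rads, stack, deg=2)
img = gains * 4.0 + offs           # every detector viewing radiance 4.0
cal = calibrate_image(img, c, rads)
```

After calibration, the synthetic flood field of radiance 4.0 is recovered uniformly across detectors, which is the property the projector NUC relies on.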
Spatial distortion effects in infrared scene projectors, and methods to correct them, have been studied and reported in several recent papers. Such effects may be important when high angular fidelity is required of a projection test. The modeling and processing methods previously studied, though effective, have not been well suited for real-time implementation. However, the “spatial calibration” must be achieved in real-time for certain testing requirements. In this paper we describe recent efforts to formalize and implement real-time spatial calibration in a scene projector test. We describe the effect of the scene generation software, “distortion compensation”, the projector, the sensor, and sensor processing algorithms on the transfer of spatial quantities through the projection system. These effects establish requirements for spatial calibration. The paper describes the hardware and software recently developed at KHILS to achieve real-time spatial calibration of a projection system. The technique extends previous efforts in its consideration of implementation requirements, and also in its explicit treatment of the spatial effects introduced by each of the distinct components of the overall system, as mentioned above.
For many types of infrared scene projectors, differences in the outputs of individual elements are one source of error in projecting a desired radiance scene. This is particularly true of resistor-array based infrared projectors. Depending on the sensor and application, the desired response uniformity may prove difficult to achieve. The properties of the sensor used to measure the projector outputs critically affect the procedures that can be used for nonuniformity correction (NUC) of the projector, as well as the final accuracy achievable by the NUC. In this paper we present a description of recent efforts to perform NUC of an infrared projector under “adverse” circumstances. For example, the NUC sensor may have some undesirable properties, including: significant random noise, large residual response nonuniformity, temporal drift in bias or gain response, vibration, and bad pixels. We present a procedure for reliably determining the output versus input response of each individual emitter of a resistor array projector. This NUC procedure has been demonstrated in several projection systems at the Kinetic Kill Vehicle Hardware-In-the-Loop Simulator (KHILS) including those within the KHILS cryogenic chamber. The NUC procedure has proven to be generally robust to various sensor artifacts.
The effects of distortion in the complex optical system of an IR scene projector have motivated the development of methods for spatial calibration of scene projectors. A typical method utilizes the projection of a set of test images, with careful measurement of the location of points in each image. Given the projected and measured positions, a parametric model is used to describe the spatial “distortion” of the projection system. This distortion model can then be used for a variety of purposes, including pre-processing the images to be projected so that the distortion of the projection system is pre-compensated and thus negated. This application and specific method have been demonstrated, and can compensate for a variety of distortion and alignment effects in the projector/sensor configuration. Personnel at the Kinetic Kill Vehicle Hardware-in-the-Loop Simulator (KHILS) facility have demonstrated compensation and co-alignment of 2-color projection systems with sub-pixel precision using this technique. This paper describes an analysis of a situation in which pre-compensated images are translated (either mechanically or optically) to simulate motion of a target object or to adjust alignment of the sensor and projector. The effect of physically translating images that had been pre-compensated for a different projector/sensor alignment was analyzed. We describe the results of a study of the translation and distortion effects, and characterize the expected performance of a testing procedure that requires translation of the pre-compensated images.
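A parametric distortion model of the kind described can be sketched as a low-order 2-D polynomial fitted by least squares to commanded-versus-measured point positions. The NumPy sketch below uses a second-order basis and synthetic data; the basis choice and function names are assumptions for illustration, not the paper's actual model.

```python
import numpy as np

def _basis(xy):
    """Second-order 2-D polynomial basis: covers shift, rotation,
    magnification, and low-order distortion terms."""
    x, y = xy[:, 0], xy[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x * x, y * y])

def fit_distortion(cmd_xy, meas_xy):
    """Least-squares fit mapping commanded (projected) positions to the
    positions actually measured through the projector optics."""
    coef, *_ = np.linalg.lstsq(_basis(cmd_xy), meas_xy, rcond=None)
    return coef   # shape (6, 2): one column per output coordinate

def apply_model(coef, xy):
    return _basis(xy) @ coef

# Synthetic demo: a known affine map plus small quadratic distortion terms.
rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, (50, 2))
true = np.array([[0.1, -0.2], [1.02, 0.01], [-0.01, 0.98],
                 [0.0, 0.003], [0.002, 0.0], [0.0, -0.001]])
meas = apply_model(true, pts)
fit = fit_distortion(pts, meas)
```

For pre-compensation, the model would be fitted (or inverted) in the opposite direction, so that commanding the "distorted" positions yields the desired measured positions.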
An unexpected effect was observed in a data set recently measured at the Kinetic Kill Vehicle Hardware-in-the-Loop Simulator (KHILS) facility. A KHILS projector was driven to illuminate a contiguous block of emitters, with all other emitters turned off. This scene was measured with a two-color IR sensor. A sequence of 100 images was recorded, and certain statistics were computed from the image sequence. After measuring and analyzing these images, a “border” with a particularly large standard deviation was observed around the bright rectangular region. The pixels on the border of the region were much noisier than those either inside or outside of the bright region. Although several explanations were possible, the most likely seemed to be a small vibration of either the sensor or the projector. The sensor, for example, uses a mechanical cryo-cooler, which produces a vibration that can be felt by hand. Further analyses revealed an erratic motion of the position of objects in the image, with an amplitude of a few tenths of the detector pitch. This small motion is sufficient to produce large fluctuations in the image pixel values in regions that have a large radiance gradient, such as the border of the bright block. The results suggest that the standard deviation of a “block image” sequence is easy to compute and will show this characteristic effect in the presence of image motion as small as a fraction of the detector pitch.
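The statistic described, a per-pixel standard deviation over the image sequence, is easy to reproduce. The sketch below builds a synthetic bright block whose edges jitter by a fraction of a pixel (area-weighted edge pixels stand in for sub-pixel motion; sensor noise is omitted) and shows that the edge pixels dominate the temporal standard deviation. All parameter values are illustrative.

```python
import numpy as np

def temporal_std(frames):
    """Per-pixel standard deviation over a frame sequence; pixels on a
    high-gradient edge light up when the scene jitters sub-pixel amounts."""
    return np.std(frames, axis=0)

def block_frame(shift, size=32, lo=0.0, hi=100.0):
    """A bright block spanning fractional columns [8+shift, 24+shift);
    edge pixels take area-weighted intermediate values."""
    frame = np.full((size, size), lo)
    left, right = 8 + shift, 24 + shift
    for col in range(size):
        cover = np.clip(min(col + 1, right) - max(col, left), 0.0, 1.0)
        frame[:, col] = lo + cover * (hi - lo)
    return frame

# 100 frames with random jitter of ~0.1 pixel rms, as in the observed effect.
rng = np.random.default_rng(1)
frames = np.stack([block_frame(s) for s in rng.normal(0.0, 0.1, 100)])
sigma = temporal_std(frames)
# sigma is large only on the block's border columns, not in its interior.
```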
Infrared detectors operating in two or more wavebands can be used to obtain emissivity-area, temperature, and related parameters. While the cameras themselves may not collect data in the two bands simultaneously in space or time, the algorithms used to calculate such parameters rely on spatial and temporal alignment of the true optical data in the two bands. When such systems are tested in a hardware-in-the-loop (HWIL) environment, this requirement for alignment is in turn imposed on the projection systems used for testing. As has been discussed in previous presentations to this forum, optical distortion and misalignment can lead to significant band-to-band and band-to-truth simulation errors. This paper will address the potential impact of techniques to remove these errors on typical two-color estimation algorithms, as well as improvements obtained using distortion removal techniques applied to HWIL data collected at the Kinetic Kill Vehicle Hardware-in-the-Loop Simulator (KHILS) facility.
Infrared projection systems based on resistor arrays typically produce radiometric outputs with wavelengths that range from less than 3 microns to more than 12 microns. This makes it possible to test infrared sensors with spectral responsivity anywhere in this range. Two resistor-array projectors optically folded together can stimulate the two bands of a 2-color sensor. If the wavebands of the sensor are separated well enough, it is possible to fold the projected images together with a dichroic beam combiner (perhaps also using spectral filters in front of each resistor array) so that each resistor array independently stimulates one band of the sensor. If the wavebands are independently stimulated, it is simple to perform radiometric calibrations of both projector wavebands. In some sensors, the wavebands overlap strongly, and driving one of the resistor arrays stimulates both bands of the unit-under-test (UUT). This “coupling” of the two bands causes errors in the radiance levels measured by the sensor if the projector bands are calibrated one at a time. If the coupling between the bands is known, it is possible to preprocess the driving images to effectively decouple the bands. This requires transformations that read both driving images (one in each of the two bands) and judiciously adjust both projectors to give the desired radiance in both bands. With this transformation included, the projection system acts as if the bands were decoupled: varying one input radiance at a time produces a change only in the corresponding band of the sensor. This paper describes techniques that have been developed to perform radiometric calibrations of spectrally coupled, 2-color projector/sensor systems. Also presented are results of tests performed to demonstrate the performance of the calibration techniques. Possible hardware and algorithms for performing the transformation in real time are also presented.
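The decoupling transformation can be sketched as inverting a measured band-coupling matrix: given the desired radiance in each sensor band, solve a small linear system for the two projector drive radiances. The 2x2 coupling values below are hypothetical, and a real system would apply this per pixel and account for the nonlinear emitter response.

```python
import numpy as np

# Hypothetical measured coupling matrix: entry (i, j) is the response of
# sensor band i to projector array j; off-diagonals are the unwanted
# cross-band response.
C = np.array([[1.00, 0.25],
              [0.15, 1.00]])
C_inv = np.linalg.inv(C)

def decouple(desired):
    """Map desired per-band sensor radiances to projector drive radiances
    so that the coupled system delivers the desired values in both bands."""
    return C_inv @ desired

drive = decouple(np.array([2.0, 5.0]))
```

With this pre-transformation in place, commanding a change in one band's desired radiance leaves the other band's sensed radiance unchanged, which is the decoupled behavior described above.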
The Air Force Research Laboratory (AFRL) Aerothermal Targets Analysis Program (ATAP) is a user-friendly, engineering-level computational tool that features integrated aerodynamics, six-degree-of-freedom (6-DoF) trajectory/motion, convective and radiative heat transfer, and thermal/material response to provide an optimal blend of accuracy and speed for design and analysis applications. ATAP is sponsored by the Kinetic Kill Vehicle Hardware-in-the-Loop Simulator (KHILS) facility at Eglin AFB, where it is used with the CHAMP (Composite Hardbody and Missile Plume) technique for rapid infrared (IR) signature and imagery predictions. ATAP capabilities include an integrated 1-D conduction model for up to 5 in-depth material layers (with options for gaps/voids with radiative heat transfer), fin modeling, several surface ablation modeling options, a materials library with over 250 materials, options for user-defined materials, selectable/definable atmosphere and earth models, multiple trajectory options, and an array of aerodynamic prediction methods. All major code modeling features have been validated with ground-test data from wind tunnels, shock tubes, and ballistics ranges, and flight-test data for both U.S. and foreign strategic and theater systems. Numerous applications include the design and analysis of interceptors, booster and shroud configurations, window environments, tactical missiles, and reentry vehicles.
KEYWORDS: Signal processing, Projection systems, Nonuniformity corrections, Infrared radiation, Data processing, Computer simulations, Field effect transistors, Temperature metrology, Black bodies, Error analysis
An alternative class of infrared projector real-time nonuniformity correction processor is introduced, based on the concept that the fundamental role of the processor is to reverse each of the projector processing steps as the input DAC voltage word is converted into infrared signal radiance output. The design is developed by assessment of the sequence of processes occurring within the projector and is tested by simulation. It is shown that there is potential for high fidelity nonuniformity correction across the infrared dynamic range without the need for the introduction of curve-fitting breakpoints.
As discussed in a previous paper to this forum, optical components such as collimators that are part of many infrared projection systems can lead to significant distortions in the sensed position of projected objects versus their true position. The previous paper discussed the removal of these distortions in a single waveband through a polynomial correction process. This correction was applied during post-processing of the data from the infrared camera-under-test. This paper extends the correction technique to two-color infrared projection. The extension of the technique allows the distortions in the individual bands to be corrected, as well as providing for alignment of the two color channels at the aperture of the camera-under-test. The co-alignment of the two color channels is obtained through the application of the distortion removal function to the object position data prior to object projection.
This paper describes a simulation and analysis of a sensor viewing a 'pixelized' scene projector like the KHILS' Wideband Infrared Scene Projector (WISP). The main objective of this effort is to understand and quantify the effects of different scene projector configurations on the performance of several sensor signal processing algorithms. We present simulation results that quantify the performance of two signal processing algorithms used to estimate the sub-pixel position and irradiance of a point source. The algorithms are characterized for different signal-to-noise ratios, different projector configurations, and two different methods for preparing images that drive the projector. We describe the simulation in detail, numerous results obtained by processing simulated images, algorithms and projector properties, and present conclusions.
The Honeywell resistor arrays produce radiance outputs, which are observed to have a strong non-linear dependence on the voltage out of the digital-to-analog-converters (DACs). In order for the projection system to run in a radiometrically calibrated mode, the radiances in the image generator must be transformed with exactly the inverse of the resistor array response function before they are sent to the DACs. Representing the image values out of the image generator and the values into the DACs with quantized, digital values introduces errors in the radiance out of the resistor array. Given the functional form of the emitter array response and the number of bits used to represent the image values, these errors in the radiometric output due to the quantization effects can be calculated. This paper describes the calculations and presents results for WISP, MSSP, and the new extended range and standard range BRITE II arrays.
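The quantization-error calculation can be sketched numerically: quantize the DAC command to a given bit depth, push both the exact and quantized commands through the emitter response function, and record the worst-case and RMS radiance error. The quartic response below is a stand-in for the actual resistor-array response, and the 16-bit depth is illustrative.

```python
import numpy as np

def radiance_quantization_error(response, v_max, bits, n=100000):
    """Worst-case and RMS radiance error introduced by quantizing the DAC
    command to `bits` bits, for a given emitter response L(V)."""
    step = v_max / (2**bits - 1)                 # one DAC code width in volts
    v = np.linspace(0.0, v_max, n)               # densely sampled exact commands
    v_q = np.round(v / step) * step              # quantized commands
    err = response(v_q) - response(v)
    return np.max(np.abs(err)), np.sqrt(np.mean(err**2))

# Hypothetical quartic response, mimicking a strongly nonlinear radiance
# versus drive-voltage curve (normalized so L(v_max) = 1).
resp = lambda v: (v / 5.0) ** 4
worst, rms = radiance_quantization_error(resp, v_max=5.0, bits=16)
```

Because the response is steepest at the top of the drive range, the largest radiance steps occur there; repeating the computation for each array's fitted response and bit depth reproduces the kind of error budget the paper describes.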
Infrared projection systems commonly use a collimating optical system to make images of a projection device appear far away from the infrared camera observing the projector. These `collimators' produce distortions in the image seen by the camera. For many applications the distortions are negligible, and the major problem is simply shifting, rotating, and adjusting the magnification, so that the projector image is aligned with the camera. In a recent test performed in the Kinetic Kill Vehicle Hardware-in-the-Loop Simulator facility, it was necessary to correct for distortions as small as 1/10th the size of the camera pixels across the field of view of the camera. This paper describes measurements and analyses performed to determine the optical distortions, and methods used to correct them.
KEYWORDS: Projection systems, Infrared radiation, Resistors, Thermography, Temperature metrology, Black bodies, Signal processing, Calibration, Field effect transistors, Infrared imaging
The thermal conduction and electronic drive processes that govern the temporal response of resistor array infrared projectors are reviewed. The characteristics and limitations of the voltage overdrive method that can be implemented for sharpening the temporal response are also discussed. Overdrive is shown to be a viable technique provided sufficient drive power and temperature margins are available outside of the normal dynamic range. It is shown also by analysis of overdrive measurements applied to a Honeywell GE snapshot resistor array that practical real-time overdrive processors can be designed to operate consistently with theoretical predictions.
The Ballistic Missile Defense Organization (BMDO) sponsored the development of the Kinetic Kill Vehicle Hardware-in-the-Loop Simulator (KHILS) to provide a comprehensive ground test capability for end-game performance evaluation of BMDO interceptor concepts. Since its inception in 1986, the KHILS facility has been at the forefront of HWIL test technology development. This development has culminated in closed-loop testing involving large-format resistive element projection arrays, 3D scene rendering systems, and real-time high-fidelity phenomenology codes. Each of these components has been integrated into a real-time environment that allows KHILS to perform dynamic closed-loop testing of BMDO interceptor systems or subsystems. Ongoing activities include the integration of multiple resistor arrays into both a cold chamber and a flight motion simulator environment, increasing the update speed of existing arrays to 180 Hz, development of newer 200 Hz snapshot resistor arrays, design of next-generation 1024 x 1024 resistor arrays, development of a 1000 Hz seeker motion stage, integration of a resistor array into an RF chamber, and development of advanced real-time plume flow-field codes. This paper describes these activities and test results for the major facility components.
The KHILS facility in the Air Force Research Laboratory (AFRL) Munitions Directorate at Eglin AFB has developed a hardware-in-the-loop (HIL) simulation for the Low Cost Autonomous Attack System (LOCAAS). This simulation was developed to provide risk reduction for the LOCAAS guided test vehicle (GTV) flight test program. This paper reports the results of this support activity and describes the simulation techniques employed to enable real-time closed-loop testing of the LOCAAS Laser Radar (LADAR) concept. The overall HIL layout will be described, including a discussion of interfaces, transport delays associated with these interfaces, compensation techniques employed to minimize the effects of these interface delays, real-time 3-D LADAR scene generation, and flight motion simulation.
Kinetic Energy Weapon (KEW) programs under the Ballistic Missile Defense Office (BMDO) need high-fidelity infrared (IR) seekers. As imaging sensors have matured to support BMDO, the complexity of functions assigned to KEW weapon systems has magnified the necessity for robust hardware-in-the-loop (HWIL) simulation facilities to reduce program risk. The IR projector, an integral component of a HWIL simulation, must reproduce the real world with enough fidelity that the unit-under-test algorithms respond to the projected images as though they were viewing the real world. For test scenarios involving unresolved objects, IR projector arrays have limitations that constrain testing accuracy. These arrays have limited dynamic range, spatial resolution, and spatial bandwidth for unresolved targets, decoys, and debris. The Steerable Laser Projector (SLP) will allow the HWIL simulation facility to address these testing issues. The Kinetic Kill Vehicle Hardware-in-the-Loop Simulation (KHILS) facility located at Eglin AFB, FL is now in the process of integrating a projector array with the SLP. This new projector combines the capabilities of both projector technologies to provide KHILS with a unique asset that addresses many of the challenges required to support testing of state-of-the-art IR-guided weapons.
KEYWORDS: Computer simulations, Missiles, Data modeling, Distortion, Signal processing, Sensors, Convolution, Energy based weapons, Control systems, Image processing
The Kinetic-kill-vehicle Hardware-in-the-Loop Simulation Facility (KHILS), located at Eglin AFB, FL, has been involved in the development and ground testing of Ballistic Missile Defense Organization hit-to-kill interceptor concepts for 10 years. Work is ongoing to characterize and implement hardware-in-the-loop models for missile `environment' effects that are associated with high-speed flight in general and endo-atmospheric flight in particular. Two critical areas of interest in endo-atmospheric simulation are: (1) effects on the line-of-sight due to divert thruster firings and the resulting structural vibration, and (2) the line-of-sight aero-optical environment, which can be influenced by heated missile flowfields, coolant layers, and thruster plumes. The structural and aero-optical effects manifest themselves as image jitter, blurring, boresight shifts, and increased background radiance. At the KHILS facility, real-time closed-loop simulation techniques are being developed for presenting structural and aero-optical effects. These techniques include both software and hardware solutions. This paper describes the status of these activities, the issues involved, and the present KHILS solutions. The paper includes discussion of model interfaces with hardware-in-the-loop simulations, timing issues, and data transmittal bandwidth requirements. Images show the effects of structural and aero-optical disturbances on seeker focal plane energy distributions.
KEYWORDS: Computer simulations, Process modeling, LIDAR, Visualization, Fermium, Frequency modulation, Signal processing, Data modeling, Human-machine interfaces, Missiles
The KHILS facility in the Wright Laboratory Armament Directorate at Eglin AFB has developed a hardware-in-the-loop (HWIL) simulation for the Low Cost Autonomous Attack System. Unique techniques have been developed for real-time closed-loop signal injection testing of this Laser Radar (LADAR) guided munition concept. The overall HWIL layout will be described, including discussion of interfaces, real-time 3D LADAR scene generation, flight motion simulation, and real-time graphical visualization. In addition, the practical application of a new simulation Verification, Validation and Accreditation procedure will be described in relation to this HWIL simulation.