Open Access Paper | 17 September 2019
Multimodal sensor: high-speed 3D and thermal measurement
Martin Landmann, Stefan Heist, Patrick Dietrich, Peter Lutzke, Ingo Gebhart, Peter Kühmstedt, and Gunther Notni
Proceedings Volume 11144, Photonics and Education in Measurement Science 2019; 1114403 (2019) https://doi.org/10.1117/12.2531950
Event: Joint TC1 - TC2 International Symposium on Photonics and Education in Measurement Science 2019, Jena, Germany
Abstract
For the measurement of three-dimensional (3D) shapes, active optical measurement systems based on pattern projection are widely used. These sensors work without contact and are non-destructive. The 3D reconstruction is performed by detecting and triangulating corresponding image points, either between one camera and the projector or between two cameras. Recently, we developed a 3D stereo sensor working in the visible range of light (VIS). It consists of two high-speed cameras and a GOBO projection-based high-speed pattern projector. Our system allows us to successfully reconstruct 3D point clouds of fast processes such as athletes in motion or even crash tests. Simultaneously measuring the surface temperature would be of great benefit, as fast processes usually exhibit local temperature changes. In order to include thermal data in the evaluation, we have extended our existing high-speed 3D sensor by an additional high-speed long-wave infrared (LWIR) camera. The thermal camera detects radiation in the spectral range between 7.5 and 12 μm. We map the measured temperatures as texture onto the reconstructed 3D points. In this contribution, we present the design of this novel 5D (three spatial coordinates, temperature, and time) sensor. The simultaneous calibration of the VIS cameras and the LWIR camera in a common coordinate system is described. First promising measurements of an inflating airbag, a basketball player, and the crushing of a metal tube, performed at a frame rate of 1 kHz, are shown.

Keywords: high-speed visible (VIS) and infrared (IR) cameras, 3D shape measurement, multimodal sensor, GOBO projection, aperiodic sinusoidal patterns, temperature mapping

1. INTRODUCTION

Optical surface temperature measurement is widely used, e.g., in non-destructive testing [1], the food industry [2], condition monitoring [3], biological observation [4], and medical applications [5]. Anomalous temperature distributions can indicate diseases such as breast cancer [6] or diabetic foot complications [7]. Two-dimensional (2D) passive infrared thermography [8] is an appropriate technique for measuring non-invasively, non-destructively, economically, and in real time.

However, 2D imaging techniques do not capture any spatial (depth) information. By combining three-dimensional (3D) imaging systems based on structured light with 2D thermal imaging cameras, Colantonio et al. [9] and Grubišić et al. [10] obtained 3D thermograms of static objects. In order to measure dynamic scenes, real-time 3D thermography has been developed [11–13], achieving frame rates of up to 30 fps.

In recent years, our research group presented high-speed active 3D sensors based on array [14] and GOBO (GOes Before Optics) [15, 16] projection techniques, imaging aperiodic sinusoidal patterns onto the measurement object [14, 17, 18]. With these improvements, Heist et al. [15] were able to record over 1,300 independent 3D point clouds per second. By integrating a high-speed long-wave infrared (LWIR) camera into one of our existing setups, we are now able to obtain 3D thermograms with a new multimodal five-dimensional (5D) sensor (three spatial coordinates, temperature, and time) at frame rates of up to 1 kHz at full resolution.

2. MEASUREMENT SYSTEM

Our multimodal 5D sensor combines two measurement principles: an active stereo vision system [19–23] for the 3D measurement and a passive infrared camera for the temperature measurement. We map the measured surface temperature onto the reconstructed 3D point cloud. The schematic setup and a photograph of our multimodal system are shown in Fig. 1.

Figure 1. Multimodal 5D system: (a) schematic setup and (b) photograph.


For the 3D reconstruction, corresponding pixels P1 and P2 have to be identified in the two cameras working in the visible range of light (VIS). As the object surface might appear homogeneous in VIS, we encode it with a sequence of gray values: a sequence of N rapidly changing aperiodic sinusoidal fringe patterns is projected by a high-speed VIS GOBO projector [15]. The maximum of the normalized cross-correlation function [14, 24, 25] between the N temporal gray values of each pixel pair determines the corresponding points P1 and P2. By triangulation, taking into account the intrinsic and extrinsic camera parameters, the 3D coordinates of the object point P are calculated from the coordinates of the pixels P1 and P2.
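To make the correspondence search more concrete, the following Python sketch illustrates temporal normalized cross correlation along an epipolar line, followed by triangulation with OpenCV. It is an illustration only, not the authors' implementation; the rectified-image assumption, the variable names, and the disparity search range are ours.

```python
# Illustrative sketch: temporal normalized cross correlation (NCC) between two
# rectified high-speed image stacks of shape (N, height, width), followed by
# triangulation of one correspondence. Not the sensor's actual software.
import numpy as np
import cv2

def temporal_ncc(seq1, seq2):
    """NCC of two temporal gray-value sequences of length N."""
    a = seq1 - seq1.mean()
    b = seq2 - seq2.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / denom if denom > 0 else -1.0

def find_correspondence(stack1, stack2, row, col, max_disp=128):
    """For pixel (row, col) in camera 1, search along the same row of the
    rectified camera-2 stack for the disparity with maximum temporal NCC."""
    seq1 = stack1[:, row, col].astype(np.float64)
    best_d, best_ncc = 0, -1.0
    for d in range(max_disp):
        if col - d < 0:
            break
        ncc = temporal_ncc(seq1, stack2[:, row, col - d].astype(np.float64))
        if ncc > best_ncc:
            best_d, best_ncc = d, ncc
    return best_d, best_ncc  # integer disparity; sub-pixel refinement omitted

# Triangulation of one correspondence with known 3x4 projection matrices P1, P2:
# pt1, pt2 are the matched pixel coordinates in camera 1 and 2, each of shape (2, 1).
# X_h = cv2.triangulatePoints(P1, P2, pt1, pt2)  # homogeneous 4x1 result
# X = (X_h[:3] / X_h[3]).ravel()                 # metric 3D coordinates of P
```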

At the same time, the thermal camera records the spatial distribution of the radiation emitted by the object surface. The spectral radiance depends on the surface temperature; at room temperature, the largest amount of radiation is emitted at wavelengths around 10 µm. The temperatures are recorded with an LWIR camera (see Fig. 1) synchronously with the 3D data and mapped onto the previously reconstructed 3D point cloud.
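The 10 µm figure follows from Wien's displacement law for an ideal blackbody (real surfaces are scaled by their emissivity but peak near the same wavelength); for a surface at room temperature:

```latex
\lambda_{\max} = \frac{b}{T}, \qquad b \approx 2898~\mu\mathrm{m\,K},
\qquad
\lambda_{\max} \approx \frac{2898~\mu\mathrm{m\,K}}{295~\mathrm{K}} \approx 9.8~\mu\mathrm{m}
```

This peak lies well within the 7.5 to 12 µm band of the LWIR camera.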

Figure 2 shows a block diagram of the evaluation process from image acquisition to the resulting 3D thermogram. First, we reconstruct the 3D shape of the object by triangulating corresponding pixels. The reconstructed 3D points are then projected into the thermal image so that a temperature value can be assigned to each 3D point by bilinear interpolation.
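A minimal sketch of this mapping step is given below. It assumes that the LWIR camera's intrinsics, distortion coefficients, and pose with respect to the common coordinate system are known from the calibration described in Sec. 3; the function and variable names are illustrative and not those of our evaluation software.

```python
# Illustrative sketch: project reconstructed 3D points into the LWIR image and
# assign each point a temperature by bilinear interpolation.
import numpy as np
import cv2

def map_temperature(points_3d, K_ir, dist_ir, rvec, tvec, thermal_img):
    """points_3d: (N, 3) reconstructed 3D points in the common coordinate system.
    K_ir, dist_ir, rvec, tvec: LWIR camera intrinsics, distortion, and pose.
    thermal_img: 2D array of temperature values (e.g., in degrees Celsius)."""
    # Project the 3D points into the thermal image plane (distortion included).
    uv, _ = cv2.projectPoints(points_3d.astype(np.float64), rvec, tvec, K_ir, dist_ir)
    uv = uv.reshape(-1, 2)

    h, w = thermal_img.shape
    u, v = uv[:, 0], uv[:, 1]
    valid = (u >= 0) & (u <= w - 1) & (v >= 0) & (v <= h - 1)

    # Bilinear interpolation of the temperature image at sub-pixel positions.
    temps = np.full(len(points_3d), np.nan)
    u0, v0 = np.floor(u[valid]).astype(int), np.floor(v[valid]).astype(int)
    u1, v1 = np.minimum(u0 + 1, w - 1), np.minimum(v0 + 1, h - 1)
    du, dv = u[valid] - u0, v[valid] - v0
    temps[valid] = (thermal_img[v0, u0] * (1 - du) * (1 - dv)
                    + thermal_img[v0, u1] * du * (1 - dv)
                    + thermal_img[v1, u0] * (1 - du) * dv
                    + thermal_img[v1, u1] * du * dv)
    return temps  # one temperature per 3D point (NaN where outside the image)
```

Points occluded from the LWIR camera's viewpoint would additionally have to be masked by a visibility test; this is omitted here for brevity.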

Figure 2. Block diagram depicting the evaluation process from image acquisition to the resulting 3D thermogram.


The high-speed VIS cameras used in our sensor are Photron FASTCAM SA-X2 monochrome cameras. Their maximum frame rate at the full resolution of 1 Mpx is 12,500 fps. The light source of the self-built high-speed VIS projector is a Philips MSR metal-halide gas discharge lamp with about 250 W of optical output power. As the LWIR camera, we use a FLIR X6900sc SLS. In the spectral range from 7.5 to 12 µm, this camera can record up to 1,004 Hz at its full resolution of 640×512 px. In the temperature range from −20 to 150 °C, the integration time is very short at about 0.16 ms. The noise, expressed as noise-equivalent temperature difference (NETD), is less than 40 mK.

3. SENSOR CALIBRATION

The sensor calibration determines the intrinsic and extrinsic parameters of the two high-speed VIS cameras and the high-speed LWIR camera. These parameters are required for a correct 3D reconstruction and an accurate mapping of the thermal texture. In our calibration process, we follow the well-established method of bundle adjustment [21, 23]: the camera parameters and the 3D coordinates are optimized such that the reprojection error between the projected and the measured points in the image plane is minimized.
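In simplified form, this joint optimization can be written as a nonlinear least-squares problem over the reprojection residuals. The SciPy/OpenCV fragment below is a conceptual sketch under our own, hypothetical parameterization (an `unpack` helper that maps the parameter vector to per-camera and per-board parameters); it is not the calibration software used for the sensor.

```python
# Illustrative sketch of bundle adjustment: jointly refine camera parameters and
# calibration-board poses by minimizing the reprojection error of all observations.
import numpy as np
import cv2
from scipy.optimize import least_squares

def reprojection_residuals(params, observations, unpack):
    """observations: list of (cam_idx, pose_idx, board_xyz, measured_uv).
    unpack(params) -> (cameras, poses), where cameras[i] = (K, dist, rvec, tvec)
    of camera i and poses[j] = (rvec, tvec) of board position j (hypothetical)."""
    cameras, poses = unpack(params)
    residuals = []
    for cam_idx, pose_idx, board_xyz, measured_uv in observations:
        K, dist, cam_r, cam_t = cameras[cam_idx]
        board_r, board_t = poses[pose_idx]
        # Transform the board grid point into the common (world) coordinate system.
        R_board, _ = cv2.Rodrigues(board_r)
        world_xyz = (R_board @ board_xyz.reshape(3, 1) + board_t.reshape(3, 1)).T
        # Project into the camera and compare with the measured circle center.
        uv, _ = cv2.projectPoints(world_xyz, cam_r, cam_t, K, dist)
        residuals.append(uv.ravel() - np.asarray(measured_uv, dtype=np.float64).ravel())
    return np.concatenate(residuals)

# Refinement starting from an initial guess x0 of all parameters:
# result = least_squares(reprojection_residuals, x0, args=(observations, unpack))
```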

To calibrate cameras operating in very different wavelength ranges, Yang et al. [26] split the calibration process into two sub-calibrations with two different calibration boards. In contrast, we developed a calibration target with features detectable in both VIS and LWIR, so that the three cameras can be calibrated in a single measurement. Our calibration target, shown in Fig. 3(a), is a special printed circuit board with a regular grid of circular rings and filled circles.

Figure 3. (a) Photograph of the calibration board, (b) VIS camera image, (c) LWIR camera image, and (d) evaluated LWIR image with detected circles and grid coordinate system.


As our calibration board has a homogeneous temperature distribution, we exploit the emissivity difference of its features to achieve sufficient contrast in the LWIR images. The circles are made of electroless nickel immersion gold (ENIG) with a very low emissivity, while the plate material, a composite of epoxy resin and glass fibre fabric (FR-4), has an emissivity close to 1. Figure 3(c) shows an example LWIR camera image: the plate material appears bright, while the circles appear dark. Due to their different reflective characteristics in VIS, the circles can be detected there as well, where they appear bright on a dark plate (see Fig. 3(b)).

For the calibration measurement, we place the target at various positions and orientations within the measurement volume. The evaluation starts with the detection of the circles and their assignment to the grid coordinate system. The circles' centers are determined by fitting an ellipse to the circumference of each circle. In the bundle adjustment, we take into account the stereo constraints of the system as well as the image (pixel) and grid coordinates of the circles.
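One possible realization of the circle-center detection is sketched below with OpenCV; the thresholding strategy, size limits, and roundness test are placeholder choices, not the parameters of our actual evaluation.

```python
# Illustrative sketch: locate the calibration circles in a VIS or LWIR image and
# estimate their centers with sub-pixel accuracy by fitting ellipses to contours.
import cv2
import numpy as np

def detect_circle_centers(img, min_area=50, max_area=5000):
    """img: 8-bit grayscale image of the calibration board (OpenCV >= 4.x).
    Returns a list of (x, y) ellipse centers."""
    # Otsu thresholding; the circles appear dark in LWIR and bright in VIS,
    # so one of the modalities may require inverting the binary image first.
    _, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)

    centers = []
    for contour in contours:
        area = cv2.contourArea(contour)
        if not (min_area <= area <= max_area) or len(contour) < 5:
            continue  # reject noise and contours too small for an ellipse fit
        (cx, cy), axes, _ = cv2.fitEllipse(contour)
        minor, major = min(axes), max(axes)
        if major > 0 and minor / major > 0.5:  # keep roughly circular features
            centers.append((cx, cy))
    return centers
```

The detected centers would subsequently be assigned to the known grid coordinates of the board before entering the bundle adjustment.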

4. MEASUREMENT EXAMPLES

In this section, we present example 3D thermograms of three measurements: a crushing metal tube, a basketball player, and an inflating airbag. Table 1 gives an overview of the measurement parameters for the three experiments.

Table 1. Measurement parameters of the three experiments.

parameter                                  | crushing tube | basketball player | airbag
measurement field in m²                    | 0.8 × 0.8     | 1.8 × 1.8         | 1.0 × 0.9
measurement distance in m                  | 2             | 3                 | 3
resolution in px (defined by VIS cameras)  | 768 × 768     | 1,024 × 1,024     | 512 × 448
lateral resolution in mm²                  | 1.0 × 1.0     | 1.8 × 1.8         | 2.0 × 2.0
VIS camera frame rate in kHz               | 12            | 12                | 50
LWIR camera frame rate in kHz              | 1             | 1                 | 1
acquisition time in s                      | 0.81          | 0.45              | 0.50
rotational speed of GOBO in rpm            | 566           | 566               | 2,222

4.1 Metal tube

We measured the deformation of a metal tube, 0.5 m long and 125 mm in diameter, when it was hit with a sledgehammer. The measurement results are shown as 3D thermograms at different times in Fig. 4. Before the deformation, the tube had a temperature of about 26 °C. Due to the sledgehammer impact, the tube deformed over its entire length. In the area with the strongest wall bending, heat generation can be observed; this temperature increase of about 2 K was already measurable at t = 50 ms. The sledgehammer hit the tube roughly at its center; in this direct impact area, however, the temperature did not change noticeably.

Figure 4. 3D thermograms of the deformation of a metal tube hit with a sledgehammer, acquired at different times.


4.2 Basketball player

In order to measure a dribbling basketball player, we enlarged the measurement field by adjusting the multimodal sensor, which required the system to be calibrated again. The measurement results for three different times during the dribbling process are shown in Fig. 5.

Figure 5. 3D thermograms of a basketball player acquired at t = 20 ms, 258 ms, and 320 ms.


At the beginning, the basketball player controlled the ball with his hand (see Fig. 5 at t = 20 ms). Temperatures higher than 30 °C are mapped completely onto the hand, whereas lower temperatures are mapped onto the ball surface; the mapping of the thermal data onto the 3D data thus works well. As the dribbling continued, the athlete moved his hand downwards and the ball separated from the hand. On the ball surface (temperatures of about 27 °C), a heat imprint of the hand (temperatures of about 31 °C) is clearly visible at t = 258 ms. The ball then hit the ground and bounced off it. At t = 320 ms, an area of increased temperature can be seen at the spot on the floor where the ball touched the ground.

4.3 Inflating airbag

The third measurement presented in this paper is probably the most impressive one: the inflation of a front airbag. Figure 6 shows the scene at six different times. At t = 0 ms, the airbag ignition was triggered. The temperature of the steering wheel was about 25 °C. At t = 6 ms, an increased temperature could be observed under the cover, which was pushed upwards. Until t = 20 ms, the airbag volume grew quickly; the airbag surface showed temperatures of about 30 °C, and in a small area in the upper right, temperatures close to 40 °C were reached. After about 50 ms, the deflation of the airbag started. As the heat diffused through the bag material, the temperature of the outer surface increased continuously until the end of the measurement, reaching values of up to 60 °C. Areas of thicker material, e.g., seams, remain cooler and are thereby made visible. In general, the temperature distribution can indicate material flaws.

Figure 6. 3D thermograms of the airbag inflation process acquired at different times.


5. CONCLUSION

In this contribution, we presented our new multimodal 5D system for the simultaneous high-speed measurement of 3D shape and temperature. For this purpose, we extended our high-speed 3D system by integrating a high-speed LWIR camera. The 3D data is recorded by active stereo vision, whereas the 2D thermal data is obtained by a passive thermal camera. The geometric calibration of the three cameras is carried out using a specially designed calibration target and bundle adjustment, which enables us to map the temperatures as texture onto the 3D data. With this system, we are able to measure the three spatial dimensions and the surface temperature of rapidly changing objects.

At full resolution of 1 Mpx, the VIS cameras can record up to 12,500 frames per second. The LWIR camera has a maximum resolution of 640×512 px and reaches frame rates up to 1,000 Hz. In total, we achieve independent 3D thermograms with frame rates up to 1 kHz.

Results of three different measurement examples were presented, demonstrating the broad application range of the multimodal sensor. The combination of high-speed 2D thermography with high-speed 3D sensing delivers considerably more information than traditional 2D thermography.

ACKNOWLEDGMENTS

This project was supported by the German Federal Ministry of Education and Research (BMBF) under project number 03ZZ0436. We thank FLIR Systems GmbH for the loan of the high-speed LWIR FLIR X6900sc SLS camera and technical support.

REFERENCES

[1] Usamentiaga, R., Venegas, P., Guerediaga, J., Vega, L., Molleda, J., and Bulnes, F., "Infrared thermography for temperature measurement and non-destructive testing," Sensors, 14 (7), 12305–12348 (2014). https://doi.org/10.3390/s140712305
[2] Vadivambal, R. and Jayas, D. S., "Applications of thermal imaging in agriculture and food industry - a review," Food Bioprocess Technol., 4 (2), 186–199 (2011). https://doi.org/10.1007/s11947-010-0333-5
[3] Bagavathiappan, S., Lahiri, B., Saravanan, T., Philip, J., and Jayakumar, T., "Infrared thermography for condition monitoring - a review," Infrared Phys. Technol., 60, 35–55 (2013). https://doi.org/10.1016/j.infrared.2013.03.006
[4] Hristov, N. I., Betke, M., and Kunz, T. H., "Applications of thermal infrared imaging for research in aeroecology," Integr. Comp. Biol., 48 (1), 50–59 (2008). https://doi.org/10.1093/icb/icn053
[5] Ring, E. and Ammer, K., "Infrared thermal imaging in medicine," Physiol. Meas., 33 (3), R33 (2012). https://doi.org/10.1088/0967-3334/33/3/R33
[6] Schaefer, G., Závišek, M., and Nakashima, T., "Thermography based breast cancer analysis using statistical features and fuzzy classification," Pattern Recognit., 42 (6), 1133–1137 (2009). https://doi.org/10.1016/j.patcog.2008.08.007
[7] van Netten, J. J., van Baal, J. G., Liu, C., van der Heijden, F., and Bus, S. A., "Infrared thermal imaging for automated detection of diabetic foot complications," J. Diabetes Sci. Technol., 7 (5), 1122–1129 (2013). https://doi.org/10.1177/193229681300700504
[8] Kateb, B., Yamamoto, V., Yu, C., Grundfest, W., and Gruen, J. P., "Infrared thermal imaging: A review of the literature and case report," NeuroImage, 47, T154–T162 (2009). https://doi.org/10.1016/j.neuroimage.2009.03.043
[9] Colantonio, S., Pieri, G., Salvetti, O., Benvenuti, M., Barone, S., and Carassale, L., "A method to integrate thermographic data and 3D shapes for diabetic foot disease," in Proc. 8th International Conference on Quantitative Infrared Thermography (QIRT), (2006). https://doi.org/10.21611/qirt.2006.073
[10] Grubišić, I., Gjenero, L., Lipić, T., Sović, I., and Skala, T., "Active 3D scanning based 3D thermography system and medical applications," 269–273 (2011).
[11] Skala, K., Lipić, T., Sović, I., Gjenero, L., and Grubišić, I., "4D thermal imaging system for medical applications," Period. Biol., 113 (4), 407–416 (2011).
[12] Zuo, C., Chen, Q., Feng, S., Gu, G., and Asundi, A., "Real-time three-dimensional infrared imaging using fringe projection profilometry," Chin. Opt. Lett., 11 (2013).
[13] Vidas, S., Moghadam, P., and Sridharan, S., "Real-time mobile 3D temperature mapping," IEEE Sens. J., 15 (2), 1145–1152 (2015). https://doi.org/10.1109/JSEN.2014.2360709
[14] Heist, S., Mann, A., Kühmstedt, P., Schreiber, P., and Notni, G., "Array projection of aperiodic sinusoidal fringes for high-speed three-dimensional shape measurement," Opt. Eng., 53 (11), 112208 (2014). https://doi.org/10.1117/1.OE.53.11.112208
[15] Heist, S., Lutzke, P., Schmidt, I., Dietrich, P., Kühmstedt, P., Tünnermann, A., and Notni, G., "High-speed three-dimensional shape measurement using GOBO projection," Opt. Lasers Eng., 87, 90–96 (2016). https://doi.org/10.1016/j.optlaseng.2016.02.017
[16] Heist, S., Dietrich, P., Landmann, M., Kühmstedt, P., Notni, G., and Tünnermann, A., "GOBO projection for 3D measurements at highest frame rates: a performance analysis," Light Sci. Appl., 7 (1), 71 (2018). https://doi.org/10.1038/s41377-018-0072-3
[17] Heist, S., Kühmstedt, P., Tünnermann, A., and Notni, G., "Theoretical considerations on aperiodic sinusoidal fringes in comparison to phase-shifted sinusoidal fringes for high-speed three-dimensional shape measurement," Appl. Opt., 54 (35), 10541–10551 (2015). https://doi.org/10.1364/AO.54.010541
[18] Heist, S., Kühmstedt, P., Tünnermann, A., and Notni, G., "Experimental comparison of aperiodic sinusoidal fringes and phase-shifted sinusoidal fringes for high-speed three-dimensional shape measurement," Opt. Eng., 55 (2), 024105 (2016). https://doi.org/10.1117/1.OE.55.2.024105
[19] Geng, J., "Structured-light 3D surface imaging: a tutorial," Adv. Opt. Photon., 3 (2), 128–160 (2011). https://doi.org/10.1364/AOP.3.000128
[20] Salvi, J., Fernandez, S., Pribanic, T., and Llado, X., "A state of the art in structured light patterns for surface profilometry," Pattern Recognit., 43 (8), 2666–2680 (2010). https://doi.org/10.1016/j.patcog.2010.03.004
[21] Luhmann, T., Robson, S., Kyle, S., and Böhm, J., Close-Range Photogrammetry and 3D Imaging (2014).
[22] Zhang, S., Handbook of 3D Machine Vision: Optical Metrology and Imaging, CRC Press (2013).
[23] Hartley, R. and Zisserman, A., Multiple View Geometry in Computer Vision, Cambridge University Press (2003).
[24] Albrecht, P. and Michaelis, B., "Stereo photogrammetry with improved spatial resolution," in Proc. International Conference on Pattern Recognition (ICPR), 845–849 (1998).
[25] Wiegmann, A., Wagner, H., and Kowarschik, R., "Human face measurement by projecting bandlimited random patterns," Opt. Express, 14 (17), 7692–7698 (2006). https://doi.org/10.1364/OE.14.007692
[26] Yang, R. and Chen, Y., "Design of a 3-D infrared imaging system using structured light," IEEE Trans. Instrum. Meas., 60 (2), 608–617 (2011). https://doi.org/10.1109/TIM.2010.2051614