Designing a robust and accurate object tracker is important for many computer vision applications. The problem becomes more challenging when additional factors such as changing appearance, illumination, and scale are present in the sequence. Recently, correlation-filter-based trackers such as Sum of Template and Pixel-wise Learners (STAPLE) [1] have shown state-of-the-art short-term tracking performance. STAPLE consists of two major modules: a correlation filter learned on HOG features and a color model represented by an RGB histogram. In this paper, we propose an improved STAPLE (iSTAPLE) tracker that adds Color Names (CN) [2] features to the correlation part of the tracker. CN complements HOG, since relying on HOG alone can lead to tracking failures in cases where occlusion or deformation is present. Because color information can be misleading and unreliable under rapid illumination changes, the Bhattacharyya distance is used to measure the color similarity between the target and its surrounding area and thus decide whether the color information is helpful. Since we use multiple feature cues to improve tracking performance, a robust approach to fusing them is required; to fully exploit all features and optimize the tracking result, numerous weight combinations assigned to each feature are tested. We show through comprehensive experiments on the VOT Challenge 2016 dataset [3] that iSTAPLE obtains a gain of 25% in tracking robustness.
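The abstract does not give the exact form of the distance used; a common convention (the one used, for example, in OpenCV's histogram comparison) measures the overlap of two normalized histograms via the Bhattacharyya coefficient. A minimal sketch, assuming that convention:

```python
import numpy as np

def bhattacharyya_distance(p, q):
    """Bhattacharyya distance between two histograms.

    Both inputs are normalized to sum to 1; the coefficient
    bc = sum(sqrt(p_i * q_i)) lies in [0, 1], and the distance
    sqrt(1 - bc) is 0 for identical histograms, 1 for disjoint ones.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    bc = np.sum(np.sqrt(p * q))          # Bhattacharyya coefficient
    return np.sqrt(max(0.0, 1.0 - bc))   # clamp guards tiny negatives

# Identical color histograms -> distance ~0 (color cue deemed reliable);
# disjoint histograms -> distance 1 (target and surroundings differ).
print(bhattacharyya_distance([0.2, 0.3, 0.5], [0.2, 0.3, 0.5]))  # ≈ 0.0
print(bhattacharyya_distance([1.0, 0.0], [0.0, 1.0]))            # → 1.0
```

In the tracker described above, a small distance between the target histogram and the surrounding-area histogram would indicate that color is not discriminative there, so the color cue's weight could be reduced.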
To meet privacy constraints in biometrics, new algorithms called template protection schemes have been proposed in the literature over the last 10 years. Most of these algorithms require a fixed-size feature as input. Texture features can be used in this context for fingerprints, since the number of minutiae generally varies between captures. BioHashing is a two-factor authentication algorithm that can ensure privacy while using biometrics. In this work, we compare recent texture features from the literature within the BioHashing scheme under several constraints: efficiency, maximal representation size, and constant-size description. Experiments are conducted on three fingerprint databases from the FVC competition. The results allow us to conclude which texture features should be used in this context.
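The standard BioHashing scheme projects a fixed-size feature vector onto random orthonormal directions generated from a user-specific secret (the second authentication factor) and binarizes the result. A minimal sketch of that scheme; the seed value, dimensions, and threshold here are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

def biohash(feature, seed, n_bits, threshold=0.0):
    """Sketch of the standard BioHashing scheme.

    feature : 1-D fixed-size feature vector (e.g. a texture descriptor)
    seed    : user-specific secret token (second factor)
    n_bits  : length of the output binary template (<= len(feature))
    """
    rng = np.random.default_rng(seed)
    # Random Gaussian matrix, orthonormalized column-wise via QR,
    # so the projections are onto orthonormal directions.
    R = rng.standard_normal((len(feature), n_bits))
    Q, _ = np.linalg.qr(R)
    proj = np.asarray(feature, dtype=float) @ Q[:, :n_bits]
    return (proj > threshold).astype(np.uint8)  # binarize

# Same feature + same token -> same protected template (deterministic);
# a stolen template can be revoked by issuing a new token.
f = np.linspace(-1.0, 1.0, 64)          # hypothetical 64-d texture feature
code = biohash(f, seed=42, n_bits=16)
```

Because the template depends on both the biometric feature and the token, compromising the stored binary code alone does not reveal the original feature, which is the privacy property the abstract refers to.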
Microscopic homodyne interferometry with monochromatic or white-light illumination is to date the most widely used technique for surface profiling of micromechanical devices and MEMS. In the last five years its capabilities have been greatly extended, and the technique can now be applied to out-of-plane or in-plane vibration measurements, micromechanical testing, transparent-film thickness mapping, and surface spectral reflectivity mapping. In this paper we review the performance and limits of this technique and its various applications in the MEMS field, from technology assessment to final device characterization. Guidelines are provided for achieving high-frequency vibration measurements, transient response measurements, and on-wafer or in-vacuum measurements. Finally, needed future developments are discussed.
Several optical methods have been developed for measuring the in-plane vibration of microscopic objects. Most of them, however, require a scattering surface or specific surface structuring. A low-cost method without these limitations is optical stroboscopic microscopy combined with image processing by optical flow techniques; previous work has shown that nanometric sensitivity can be obtained. In this paper, we investigate several subpixel image processing methods for in-plane vibration measurements of MEMS by this technique. Emphasis is put on whole-displacement-field measurements and on fast algorithms able to process a large sequence of images without needing a multi-resolution approach to obtain local vibration amplitudes. It is notably shown that exploiting the spatiotemporal regularity between images is an efficient way to reduce noise, and that a resolution in the 0.01-0.03 pixel range can be achieved. The results are applied to local in-plane vibration measurements in two perpendicular directions at video rate, as well as to full-field mapping of in-plane vibration modes of electrostatically actuated MEMS devices in SOI technology.
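The abstract does not name the specific subpixel estimators compared; one textbook approach for pushing displacement estimates well below one pixel is parabolic interpolation of a correlation (or similarity) peak: fit a parabola through the discrete maximum and its two neighbors and take the vertex as the subpixel location. A minimal one-dimensional sketch of that idea, not the paper's method:

```python
import numpy as np

def subpixel_peak_1d(c, i):
    """Parabolic interpolation around a discrete peak at index i.

    Fits a parabola through (i-1, c[i-1]), (i, c[i]), (i+1, c[i+1])
    and returns the vertex abscissa as the subpixel peak position.
    Assumes 0 < i < len(c) - 1.
    """
    ym1, y0, yp1 = c[i - 1], c[i], c[i + 1]
    denom = ym1 - 2.0 * y0 + yp1
    if denom == 0.0:
        return float(i)  # flat neighborhood: no refinement possible
    return i + 0.5 * (ym1 - yp1) / denom

# A correlation profile sampled from a parabola peaked at x = 2.3:
# argmax gives the integer 2, the refinement recovers the true peak.
x = np.arange(5.0)
c = -(x - 2.3) ** 2
print(subpixel_peak_1d(c, int(np.argmax(c))))  # ≈ 2.3
```

For an exactly parabolic profile the estimate is exact; for real correlation peaks the residual bias is what limits methods of this kind to roughly the hundredth-of-a-pixel regime mentioned above.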
Conference Committee Involvement (5)
Geospatial Informatics, Fusion, and Motion Video Analytics VI
19 April 2016 | Baltimore, MD, United States
Geospatial Informatics, Fusion, and Motion Video Analytics V
20 April 2015 | Baltimore, MD, United States
Geospatial InfoFusion and Video Analytics IV
5 May 2014 | Baltimore, MD, United States
Geospatial InfoFusion III
2 May 2013 | Baltimore, Maryland, United States
Geospatial InfoFusion II
26 April 2012 | Baltimore, Maryland, United States