Modern image processing pipelines (e.g., those used in digital cameras) are full of advanced, highly adaptive filters that often have a large number of tunable parameters (sometimes > 100). This makes the calibration procedure for these filters very complex, and optimal results are barely achievable by manual calibration; an automated approach is therefore a must. We discuss an information-theoretic metric for evaluating the adaptive characteristics of an algorithm (an "adaptivity criterion"), using noise reduction algorithms as an example. The method makes it possible to find an "orthogonal decomposition" of the filter parameter space into "filter adaptivity" and "filter strength" directions. This metric can be used as a cost function in automatic filter optimization. Because it measures physical "information restoration" rather than perceived image quality, it helps reduce the full set of filter parameters to a smaller subset that is easier for a human operator to tune to achieve better subjective image quality. With appropriate adjustments, the criterion can also be used to assess the imaging system as a whole (sensor plus post-processing).
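The abstract does not specify how "information restoration" is computed. One common information-theoretic choice, shown here purely as an illustrative assumption (not the authors' method), is the mutual information between a reference scene and the filter output, estimated from a joint histogram:

```python
import numpy as np

def mutual_information(ref, out, bins=64):
    """Histogram estimate of mutual information I(ref; out) in bits.

    Illustrative stand-in for an "information restoration" measure:
    a filter that preserves more scene content yields a higher value.
    """
    joint, _, _ = np.histogram2d(ref.ravel(), out.ravel(), bins=bins)
    pxy = joint / joint.sum()          # joint distribution p(x, y)
    px = pxy.sum(axis=1, keepdims=True)  # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)  # marginal p(y)
    nz = pxy > 0                         # avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

# Toy check: a lightly corrupted image retains more scene
# information than a heavily corrupted one.
rng = np.random.default_rng(0)
scene = rng.normal(size=(128, 128))
noisy = scene + rng.normal(scale=1.0, size=scene.shape)
denoised = scene + rng.normal(scale=0.3, size=scene.shape)
```

With such a measure, sweeping a filter's parameters and recording both the restoration score and the amount of smoothing applied would give the raw data for the "adaptivity vs. strength" decomposition described above.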
A system simulation model was used to create scene-dependent noise masks that reflect current performance of mobile
phone cameras. Stimuli with different overall magnitudes of noise and with varying mixtures of red, green, blue, and
luminance noises were included in the study. Eleven treatments in each of ten pictorial scenes were evaluated by twenty
observers using the softcopy ruler method. In addition to determining the quality loss function in just noticeable
differences (JNDs) for the average observer and scene, transformations for different combinations of observer sensitivity
and scene susceptibility were derived. The psychophysical results were used to optimize an objective metric of isotropic
noise based on system noise power spectra (NPS), which were integrated over a visual frequency weighting function to
yield perceptually relevant variances and covariances in CIE L*a*b* space. Because the frequency weighting function is
expressed in terms of cycles per degree at the retina, it accounts for display pixel size and viewing distance effects, so
application-specific predictions can be made. Excellent results were obtained using only L* and a* variances and L*a*
covariance, with relative weights of 100, 5, and 12, respectively. The positive a* weight suggests that the luminance
(photopic) weighting is slightly too narrow on the long-wavelength side for predicting perceived noisiness. The L*a*
covariance term, which is normally negative, reflects masking between L* and a* noise, as confirmed in informal
evaluations. Test targets in linear sRGB and rendered L*a*b* spaces for each treatment are available at
http://www.aptina.com/ImArch/ to enable other researchers to test metrics of their own design and calibrate them to
JNDs of quality loss without performing additional observer experiments. Such JND-calibrated noise metrics are
particularly valuable for comparing the impact of noise and other attributes, and for computing overall image quality.
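The final combination step described above is a weighted sum of the L* variance, a* variance, and L*a* covariance with relative weights 100, 5, and 12. The sketch below illustrates only that combination; it omits the NPS computation and the visual frequency weighting (in the paper, the variances and covariance come from frequency-weighted noise power spectra, not raw pixel statistics):

```python
import numpy as np

# Relative weights reported in the study: L* variance, a* variance,
# and L*a* covariance, respectively.
W_LL, W_AA, W_LA = 100.0, 5.0, 12.0

def noise_metric(L, a):
    """Combine L*/a* noise statistics into a single noisiness value.

    L, a: 2-D arrays of CIE L* and a* noise. Means are removed here;
    the visual frequency weighting of the NPS is omitted in this sketch.
    """
    dL = L - L.mean()
    da = a - a.mean()
    var_L = (dL * dL).mean()
    var_a = (da * da).mean()
    cov_La = (dL * da).mean()
    return W_LL * var_L + W_AA * var_a + W_LA * cov_La
```

Because the L* weight dominates, luminance noise drives the prediction, while the (typically negative) covariance term reduces the predicted noisiness when L* and a* noise mask each other, as the abstract notes.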