Paper
6 December 1989
Mutual information and estimative consistency
R. C. McCarty
Abstract
The high-speed processing of sample data from continuous stochastic signal processes, such as broad-band, noise-like spread-spectrum signals, yields highly correlated, stochastically dependent sample information. The use of dependent sample data to estimate the parameters of the parent process generally yields biased and, more importantly, inconsistent parametric estimates. For a set of continuous random variables {X₁, …, Xₙ; −∞ < Xᵢ < ∞, i = 1, …, n; n ≥ 2}, Blachman [1], circa 1966, proposed a two-dimensional vector measure of mutual information, I₀, which is easily extended to the general n-dimensional case. In 1988, the author of this paper proposed a consistent sample estimate of an extended Blachman measure of mutual information for the case of multivariate exponential-type probability distributions. A more general estimated sample measure of mutual information, which does not vanish unless the samples are statistically independent, provides a means to determine an appropriate sampling interval between adjacent temporal samples. Samples taken at that interval can then be used to generate the usual statistically consistent process moment estimates.
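The sampling-interval selection described in the abstract can be sketched numerically. The snippet below is an illustrative assumption, not the paper's estimator: it stands in for the extended Blachman measure with the elementary Gaussian-case identity I = −½ ln(1 − ρ²), which likewise vanishes only when the samples are uncorrelated, and picks the smallest lag at which that estimate effectively vanishes before forming the usual moment estimates from the decimated samples.

```python
import numpy as np

def gaussian_mi(x, y):
    """Stand-in mutual-information estimate (nats) for jointly Gaussian
    samples, computed from the sample correlation: I = -0.5*ln(1 - rho^2).
    (An assumption for illustration; not the paper's extended measure.)"""
    rho = np.corrcoef(x, y)[0, 1]
    rho = np.clip(rho, -0.999999, 0.999999)
    return -0.5 * np.log(1.0 - rho ** 2)

def decorrelation_lag(x, threshold=0.01, max_lag=200):
    """Smallest lag k at which the MI estimate between samples k apart
    falls below `threshold` nats; such samples are then treated as
    approximately independent."""
    for k in range(1, max_lag + 1):
        if gaussian_mi(x[:-k], x[k:]) < threshold:
            return k
    return max_lag

rng = np.random.default_rng(0)
# Strongly correlated AR(1) noise as a toy broad-band process:
# x[t] = 0.95*x[t-1] + w[t], so adjacent samples are highly dependent.
n, phi = 100_000, 0.95
w = rng.standard_normal(n)
x = np.empty(n)
x[0] = w[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + w[t]

k = decorrelation_lag(x)
xi = x[::k]  # decimated, approximately independent samples
# Moment estimates from the decimated data are the usual consistent ones.
print(k, xi.mean(), xi.var())
```

Estimating the moments from `x` directly would still be asymptotically unbiased here, but the decimated estimates behave like those from an i.i.d. sample, which is the point of choosing the interval by a vanishing mutual-information criterion.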
© (1989) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
R. C. McCarty "Mutual information and estimative consistency", Proc. SPIE 1154, Real-Time Signal Processing XII, (6 December 1989); https://doi.org/10.1117/12.962378
KEYWORDS: Statistical analysis, Signal processing, Stochastic processes, Data analysis, Data processing, Probability theory, Binary data