• Media type: E-Book
  • Title: Random processes for image and signal processing
  • Contributor: Dougherty, Edward R. [Author]
  • Corporation: SPIE ; Society of Photo-Optical Instrumentation Engineers
  • Published: Bellingham, Wash. (1000 20th St., Bellingham, WA 98225-6705, USA): SPIE, 1999
  • Published in: SPIE Press monograph ; PM44
  • Extent: 1 online resource (xix, 592 p. : ill.)
  • Language: English
  • DOI: 10.1117/3.268105
  • ISBN: 9780819478450
  • Keywords: Signal processing -- Statistical methods ; Stochastic processes ; Image processing -- Statistical methods
  • Reproduction note: Also available in print version
  • Footnote: "SPIE digital library"
    Includes bibliographical references (p. 575-581) and index
    Restricted to subscribers or individual electronic text purchasers
    Mode of access: World Wide Web
    System requirements: Adobe Acrobat Reader
  • Description: Part of the SPIE/IEEE Series on Imaging Science and Engineering. This book provides a framework for understanding the ensemble of temporal, spatial, and higher-dimensional processes in science and engineering whose observations vary randomly. Suitable as a text for undergraduate and graduate students with a strong background in probability, and as a graduate text in image processing courses.

    Chapter 1. Probability theory -- Probability space -- Events -- Conditional probability -- Random variables -- Probability distributions -- Probability densities -- Functions of a random variable -- Moments -- Expectation and variance -- Moment-generating function -- Important probability distributions -- Binomial distribution -- Poisson distribution -- Normal distribution -- Gamma distribution -- Beta distribution -- Computer simulation -- Multivariate distributions -- Jointly distributed random variables -- Conditioning -- Independence -- Functions of several random variables -- Basic arithmetic functions of two random variables -- Distributions of sums of independent random variables -- Joint distributions of output random variables -- Expectation of a function of several random variables -- Covariance -- Multivariate normal distribution -- Laws of large numbers -- Weak law of large numbers -- Strong law of large numbers -- Central limit theorem -- Parametric estimation via random samples -- Random-sample estimators -- Sample mean and sample variance -- Minimum-variance unbiased estimators -- Method of moments -- Order statistics -- Maximum-likelihood estimation -- Maximum-likelihood estimators -- Additive noise -- Minimum noise -- Entropy -- Uncertainty -- Information -- Entropy of a random vector -- Source coding -- Prefix codes -- Optimal coding -- Exercises for chapter 1
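    The chapter's "computer simulation" and "sample mean and sample variance" topics can be illustrated with a minimal Python sketch (illustrative only, not taken from the book; the distribution parameters are assumed for the demonstration):

    ```python
    import random
    import statistics

    # Draw a random sample from a normal distribution and estimate its
    # parameters, as in the chapter's "random-sample estimators" topic.
    random.seed(0)
    mu, sigma = 2.0, 3.0  # true parameters (assumed for this demo)
    sample = [random.gauss(mu, sigma) for _ in range(100_000)]

    sample_mean = statistics.fmean(sample)    # estimator of mu
    sample_var = statistics.variance(sample)  # unbiased (n-1) estimator of sigma^2
    ```

    By the weak law of large numbers, also covered in the chapter, both estimates converge to the true values (2.0 and 9.0) as the sample size grows.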

    Chapter 2. Random processes -- Random functions -- Moments of a random function -- Mean and covariance functions -- Mean and covariance of a sum -- Differentiation -- Differentiation of random functions -- Mean-square differentiability -- Integration -- Mean ergodicity -- Poisson process -- One-dimensional Poisson model -- Derivative of the Poisson process -- Properties of Poisson points -- Axiomatic formulation of the Poisson process -- Wiener process and white noise -- White noise -- Random walk -- Wiener process -- Stationarity -- Wide-sense stationarity -- Mean-ergodicity for WS stationary processes -- Covariance-ergodicity for WS stationary processes -- Strict-sense stationarity -- Estimation -- Linear systems -- Communication of a linear operator with expectation -- Representation of linear operators -- Output covariance -- Exercises for chapter 2
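    The chapter's "one-dimensional Poisson model" can be sketched by simulating arrival counts from exponential interarrival times (a standard construction; the rate and horizon values are assumed for the demo, not taken from the book):

    ```python
    import random

    random.seed(1)
    lam, T = 2.0, 10.0  # arrival rate and observation horizon (assumed)

    def poisson_count(lam, T):
        # Simulate one realization of a Poisson process on [0, T] by
        # accumulating exponential interarrival times; return the count.
        t, n = 0.0, 0
        while True:
            t += random.expovariate(lam)
            if t > T:
                return n
            n += 1

    counts = [poisson_count(lam, T) for _ in range(20_000)]
    mean_count = sum(counts) / len(counts)  # should approach lam * T = 20
    ```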

    Chapter 3. Canonical representation -- Canonical expansions -- Fourier representation and projections -- Expansion of the covariance function -- Karhunen-Loeve expansion -- The Karhunen-Loeve theorem -- Discrete Karhunen-Loeve expansion -- Canonical expansions with orthonormal coordinate functions -- Relation to data compression -- Noncanonical representation -- Generalized Bessel inequality -- Decorrelation -- Trigonometric representation -- Trigonometric Fourier series -- Generalized Fourier coefficients for WS stationary processes -- Mean-square periodic WS stationary processes -- Expansions as transforms -- Orthonormal transforms of random functions -- Fourier descriptors -- Transform coding -- Karhunen-Loeve compression -- Transform compression using arbitrary orthonormal systems -- Walsh-Hadamard transform -- Discrete cosine transform -- Transform coding for digital images -- Optimality of the Karhunen-Loeve transform -- Coefficients generated by linear functionals -- Coefficients from integral functionals -- Generating bi-orthogonal function systems -- Complete function systems -- Canonical expansion of the covariance function -- Canonical expansions from covariance expansions -- Constructing canonical expansions for covariance functions -- Integral canonical expansions -- Construction via integral functional coefficients -- Construction from a covariance expansion -- Power spectral density -- The power-spectral-density/autocorrelation transform pair -- Power spectral density and linear operators -- Integral representation of WS stationary random functions -- Canonical representation of vector random functions -- Vector random functions -- Canonical expansions for vector random functions -- Finite sets of random vectors -- Canonical representation over a discrete set -- Exercises for chapter 3
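    The "discrete cosine transform" used in the chapter's transform-coding discussion can be sketched directly from its defining sum (a minimal, unoptimized illustration; real implementations use fast algorithms):

    ```python
    import math

    def dct2(x):
        # Unnormalized type-II DCT: X_k = sum_n x_n cos(pi*k*(n+1/2)/N).
        N = len(x)
        return [sum(x[n] * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                    for n in range(N))
                for k in range(N)]

    def idct2(X):
        # Matching inverse (scaled type-III DCT), reconstructing x exactly.
        N = len(X)
        return [X[0] / N + (2 / N) * sum(
                    X[k] * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                    for k in range(1, N))
                for n in range(N)]

    xs = [1.0, 2.0, 3.0, 4.0]
    reconstructed = idct2(dct2(xs))  # round trip recovers the input
    ```

    Transform coding, as described in the chapter, keeps only the largest-magnitude coefficients before inverting, trading reconstruction error for compression.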

    Chapter 4. Optimal filtering -- Optimal mean-square-error filters -- Conditional expectation -- Optimal nonlinear filter -- Optimal filter for jointly normal random variables -- Multiple observation variables -- Bayesian parametric estimation -- Optimal finite-observation linear filters -- Linear filters and the orthogonality principle -- Design of the optimal linear filter -- Optimal linear filter in the jointly Gaussian case -- Role of wide-sense stationarity -- Signal-plus-noise model -- Edge detection -- Steepest descent -- Steepest descent iterative algorithm -- Convergence of the steepest-descent algorithm -- Least-mean-square adaptive algorithm -- Convergence of the LMS algorithm -- Nonstationary processes -- Least-squares estimation -- Pseudoinverse estimator -- Least-squares estimation for nonwhite noise -- Multiple linear regression -- Least-squares image restoration -- Optimal linear estimation of random vectors -- Optimal linear filter for linearly dependent observations -- Optimal estimation of random vectors -- Optimal linear filters for random vectors -- Recursive linear filters -- Recursive generation of direct sums -- Static recursive optimal linear filtering -- Dynamic recursive optimal linear filtering -- Optimal infinite-observation linear filters -- Wiener-Hopf equation -- Wiener filter -- Optimal linear filter in the context of a linear model -- The linear signal model -- Procedure for finding the optimal linear filter -- Additive white noise -- Discrete domains -- Optimal linear filters via canonical expansions -- Integral decomposition into white noise -- Integral equations involving the autocorrelation function -- Solution via discrete canonical expansions -- Optimal binary filters -- Binary conditional expectation -- Boolean functions and optimal translation-invariant filters -- Optimal increasing filters -- Pattern classification -- Optimal classifiers -- Gaussian maximum-likelihood classification -- Linear discriminants -- Neural 
networks -- Two-layer neural networks -- Steepest descent for nonquadratic error surfaces -- Sum-of-squares error -- Error back-propagation -- Error back-propagation for multiple outputs -- Adaptive network design -- Exercises for chapter 4
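    The chapter's "least-mean-square adaptive algorithm" can be sketched as follows (illustrative only; the two-tap filter, step size, and noise level are assumed values, not from the book):

    ```python
    import random

    random.seed(2)
    true_w = [0.5, -0.3]  # unknown FIR coefficients to identify (assumed)
    w = [0.0, 0.0]        # LMS estimate, initialized at zero
    mu = 0.05             # step size (assumed small enough for convergence)

    x_prev = 0.0
    for _ in range(50_000):
        x = random.gauss(0, 1)
        # Desired signal: filtered input plus additive white noise.
        d = true_w[0] * x + true_w[1] * x_prev + random.gauss(0, 0.1)
        y = w[0] * x + w[1] * x_prev   # current filter output
        e = d - y                      # estimation error
        # LMS update: step the weights along e times the input vector.
        w[0] += mu * e * x
        w[1] += mu * e * x_prev
        x_prev = x
    ```

    As the chapter's convergence analysis indicates, for a sufficiently small step size the weights approach the Wiener solution, here the true coefficients.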

    Chapter 5. Random models -- Markov chains -- Chapman-Kolmogorov equations -- Transition probability matrix -- Markov processes -- Steady-state distributions for discrete-time Markov chains -- Long-run behavior of a two-state Markov chain -- Classification of states -- Steady-state and stationary distributions -- Long-run behavior of finite Markov chains -- Long-run behavior of Markov chains with infinite state spaces -- Steady-state distributions for continuous-time Markov chains -- Irreducible continuous-time Markov chains -- Birth-death model-queues -- Forward and backward Kolmogorov equations -- Markov random fields -- Neighborhood systems -- Determination by conditional probabilities -- Gibbs distributions -- Random Boolean model -- Germ-grain model -- Vacancy -- Hitting -- Linear boolean model -- Granulometries -- Openings -- Classification by granulometric moments -- Adaptive reconstructive openings -- Random sets -- Hit-or-miss topology -- Convergence and continuity -- Random closed sets -- Capacity functional -- Exercises for chapter 5 -- Bibliography -- Index
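    The chapter's "long-run behavior of a two-state Markov chain" can be sketched by iterating the Chapman-Kolmogorov update until the distribution settles (the transition probabilities are assumed values for illustration):

    ```python
    # Two-state Markov chain; rows of P sum to 1.
    # P[i][j] is the probability of moving from state i to state j.
    P = [[0.9, 0.1],
         [0.4, 0.6]]

    pi = [1.0, 0.0]  # arbitrary initial distribution
    for _ in range(200):  # power iteration: pi <- pi P
        pi = [pi[0] * P[0][0] + pi[1] * P[1][0],
              pi[0] * P[0][1] + pi[1] * P[1][1]]
    # Closed form for a two-state chain with p = P[0][1], q = P[1][0]:
    # steady state is (q, p) / (p + q) = (0.8, 0.2) here.
    ```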