Statistical Signal Processing


This page contains resources about Statistical Signal Processing, including Statistical Modelling, Signal Modelling, Signal Estimation, Spectral Estimation, Point Estimation, Estimation Theory, Adaptive Filtering, Adaptive Signal Processing, Adaptive Filter Theory, Adaptive Array Processing and System Identification.

Filtering is not to be confused with Filter in Signal Processing.

Smoothing is not to be confused with Smoothing/Blurring/Averaging in Statistics and Digital Image Processing.

Estimation is not to be confused with Approximation (the two can sometimes be similar, but they are not the same).

Subfields and Concepts

Signal Modelling & Spectral Estimation

See also Signal Processing, Linear Dynamical Systems and Stochastic Processes

  • Signal Modelling
    • Linear Nonparametric Signal Models
      • Linear random signal model / General Linear Model
      • Recursive representation
      • Innovations representation
    • Parametric Pole-Zero Signal Models / Time Series Models 
      • Autoregressive (AR) model / All-Pole model
      • Moving Average (MA) model / All-Zero model
      • ARMA model / Pole-Zero model
    • Linear Predictive Coding
      • Reflection Coefficients
      • Partial Correlation Coefficients
    • Least Squares Method
    • Padé Approximation
    • Levinson–Durbin Algorithm / Levinson–Durbin Recursion (see the code sketch after this list)
    • Prony's Method
    • Itakura–Saito / Lattice filter
    • Maximum Entropy Method (MEM)
    • Iterative Prefiltering
    • Finite Data Records / Block Estimation / Data Windowing / Finite Interval Modelling
      • Yule-Walker Method / Autocorrelation Method
      • Covariance Method
      • Modified Covariance
      • Pre-windowing Method
      • Post-windowing Method
      • Unbiased Autocorrelation Estimate
      • Burg Method
      • Forward and Backward Linear Prediction (FBLP)
    • Stochastic Modelling
      • Modified Yule-Walker Equation (MYWE) Method
      • Least Squares MYWE Method
      • MA Model using Spectral Factorization
      • Durbin's Method
  • Stochastic Systems
  • (Nonparametric) Time-domain Methods
    • Autocorrelation Function
    • Cross-Correlation
    • Cross-Covariance
    • Correlogram
  • Frequency-domain Methods / Spectral Estimation
    • Nonparametric Methods
      • Periodogram
      • Modified Periodogram
      • Bartlett's Method / Periodogram Averaging
      • Welch's Method / Modified Periodogram Averaging
      • Blackman-Tukey Method / Periodogram Smoothing
      • Multiple Window Method
      • Cross-spectrum / Magnitude squared coherence
      • Empirical Transfer Function Estimation (ETFE)
      • Minimum Variance Spectral Estimation (MVSE) / Capon's Method
    • Maximum Entropy Method (MEM)
    • Parametric Methods
      • AR Spectral Estimation (using All-Pole Signal Modelling)
      • MA Spectral Estimation (using All-Zero Signal Modelling)
      • ARMA Spectral Estimation (using Pole-Zero Signal Modelling)
      • Minimum Variance Distortionless Response (MVDR) filter
      • Linearly Constrained Minimum Variance (LCMV) filter
      • Subspace Methods
    • Frequency Estimation (using a Harmonic Process / Sinusoid Model)
      • Pisarenko Harmonic Decomposition
      • Multiple Signal Classification (MUSIC) pseudospectrum
      • Eigenvector Method
      • Root-MUSIC pseudospectrum
      • Minimum Norm Method
      • ESPRIT Algorithm
      • Pencil Method
      • Frequency-domain version of Prony's Method
      • Subspace Methods / Eigendecomposition-based Methods
        • Blackman-Tukey Principal Components Method
        • AR Method
        • Minimum Variance Method
    • High-resolution / Super-resolution Spectral Estimators
      • MVSE
      • MVDR
      • LCMV
      • MUSIC
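
As a concrete illustration of several entries above (the Yule-Walker / autocorrelation method, the Levinson–Durbin recursion and all-pole / AR spectral estimation), here is a minimal NumPy sketch. It is a toy example under assumed names (autocorrelation, levinson_durbin, ar_spectrum) and is not taken from any of the listed references.

```python
import numpy as np

def autocorrelation(x, max_lag):
    """Biased sample autocorrelation r[0..max_lag] of a real 1-D signal."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])

def levinson_durbin(r, order):
    """Solve the Yule-Walker equations with the Levinson-Durbin recursion.
    Returns the AR polynomial a = [1, a_1, ..., a_order] and the final
    prediction-error variance."""
    a = np.zeros(order + 1)
    a[0] = 1.0
    error = r[0]
    for m in range(1, order + 1):
        # Reflection (PARCOR) coefficient for step m
        k = -(r[m] + np.dot(a[1:m], r[m - 1:0:-1])) / error
        a_prev = a.copy()
        for i in range(1, m):
            a[i] = a_prev[i] + k * a_prev[m - i]
        a[m] = k
        error *= (1.0 - k * k)
    return a, error

def ar_spectrum(a, error, n_freq=512):
    """All-pole (AR) power spectrum P(f) = error / |A(e^{j2*pi*f})|^2 on the
    normalized frequency grid [0, 0.5)."""
    freqs = np.linspace(0.0, 0.5, n_freq, endpoint=False)
    A = np.array([np.dot(a, np.exp(-2j * np.pi * f * np.arange(len(a))))
                  for f in freqs])
    return freqs, error / np.abs(A) ** 2

# Example: AR(4) spectral estimate of a noisy sinusoid.
rng = np.random.default_rng(0)
t = np.arange(1024)
x = np.sin(2 * np.pi * 0.1 * t) + 0.5 * rng.standard_normal(len(t))
r = autocorrelation(x, 4)
a, err = levinson_durbin(r, 4)
freqs, psd = ar_spectrum(a, err)   # the spectral peak should sit near f = 0.1
```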

Point Estimation & Estimation Theory

See also Probability and Statistics and Information Theory

  • Regression Analysis
    • Linear Regression Model
    • Simple Linear Regression
    • Multiple Linear Regression (not to be confused with Multivariate Linear Regression)
    • General Linear Model / Multivariate Linear Model
    • Generalized Linear Model (GLM or GLIM)
    • Poisson Regression
    • Negative Binomial Regression
    • Logistic Regression Model / Logit Model
    • Multinomial Logistic Regression / Softmax Regression
    • Probit Model
    • Fixed Effects Model
    • Hierarchical Linear Models / Multilevel Models / Nested Data Models
      • Random Effects Model / Variance Components Model
      • Mixed Effects Models (not to be confused with Mixture Models)
    • Nonparametric Regression Models
    • Semi-parametric Regression Models
    • Nonlinear Regression Models
    • Robust Regression Models
    • Random sample consensus (RANSAC)
    • Least Squares Methods
      • Ordinary Least Squares / Linear Least Squares
      • Weighted Least Squares
      • Nonlinear Least Squares
      • L1-regularization / Least absolute shrinkage and selection operator (LASSO) / Laplace prior
      • L2-regularization / Ridge Regression / Tikhonov Regularization / Gaussian prior
  • Types of estimators
    • Biased estimator
    • (Asymptotically) Unbiased estimator
    • Inconsistent estimator
    • (Asymptotically) Consistent estimator
  • Sufficiency, Minimality, Completeness and Variance Reduction Techniques (VRT)
    • Gauss-Markov Theorem
    • Lehmann–Scheffé Theorem
    • Factorization Theorem
    • Complete statistic
    • Minimal sufficient statistic
    • Ancillary statistic
    • Fisher information
    • Fisher information metric / Fisher–Rao metric
    • Scoring algorithm / Fisher's scoring
    • Score function
    • Cramér–Rao bound (CRB) / Cramér–Rao lower bound (CRLB)
    • Rao–Blackwell Theorem
      • Rao–Blackwellization
      • Rao–Blackwell estimator
    • Exponential family
    • Conjugate prior family
  • Decision Theory
    • Neyman-Pearson Theory
    • The Expected Loss Principle
    • Optimal decision rules
    • Bayesian Decision Theory / Bayes estimator
    • Cost function / Loss function
    • Risk function
    • Admissibility
    • Unbiasedness
    • Minimaxity / Minimax estimator
    • James–Stein estimator (a biased estimator)
    • Stein paradox
  • Density Estimation (i.e. the unknown parameter is probability density itself)
    • Risk Function / Expected Loss (i.e. Expectation Value of Loss Function)
      • Mean integrated squared error (MISE)
    • Parametric Density Estimation
      • Maximum likelihood estimator (MLE)
      • Bayes estimator / Bayesian Density Estimation (i.e. a distribution over distributions)
    • Nonparametric Density Estimation
      • Rescaled Histogram (i.e. the oldest and most naive approach)
      • Parzen window / Kernel Density Estimation (KDE) / Parzen-Rosenblatt estimator (see the code sketch after this list)
      • k-Nearest Neighbors Density Estimation
      • Bayesian Nonparametric Density Estimation
  • Frequentist Parameter Estimation
    • Frequentist Parametric Models
    • Frequentist Nonparametric Models
      • Kernel Regression Model / Kernel smoother
        • Nadaraya–Watson Kernel (Regression) estimator
        • Local Linear (Regression) estimator
        • Local Polynomial (Regression) estimator
        • Kernel Average smoother
        • Kernel Ridge Regression
        • Support Vector Regression
      • k-Nearest Neighbours smoother
      • Frequentist Smoothing Splines
    • Frequentist Risk (linked with variance and bias of an estimator)
    • Frequentist Risk function / Frequentist Expected Loss (i.e. Expectation Value of Loss Function)
      • Minimum MSE (MMSE) estimator / Squared error loss
      • Absolute error loss
  • Bayesian Parameter Estimation / Bayesian Point Estimation / Bayesian Methods
    • Bayesian Parametric Models
      • Bayesian Linear (Regression) Model
      • Bayesian Multivariate Linear (Regression) Model
    • Bayesian Nonparametric Models
    • Bayesian Smoothing Splines
    • Posterior Risk / Bayesian Risk
      • Posterior variance (when MSE is used)
    • Bayes Risk Function / Posterior Expected Loss (i.e. Posterior Expectation Value of Loss Function)
      • Posterior mean / Minimum MSE (MMSE) estimator / Bayes least squared error (BLSE) estimator / Squared error loss
      • Posterior median / Median-unbiased estimator / Absolute error loss
      • Posterior mode
    • Bayes estimator
      • MMSE / BLSE estimator
      • Median-unbiased estimator
      • Bayes estimator for conjugate priors (e.g. exponential family)
    • Generalized Bayes estimator (i.e. when prior is improper)
    • Bayesian Hierarchical Modelling / Hierarchical Bayes
      • Hyperparameter
      • Hyperprior
    • Empirical Bayes / Maximum marginal likelihood estimator (MMLE) / Evidence Approximation
      • Nonparametric Empirical Bayes (NPEB)
      • Parametric Empirical Bayes Point Estimation
    • Full Bayes
  • Recursive Bayesian Estimation / Bayes filter (generalization of the Kalman filter)
    • Kalman filter (generalization of the Wiener filter)
    • Wiener filter / Linear MMSE (LMMSE) estimator
  • Bayesian Information Theory
    • The Principle of Maximum Entropy
    • Bayesian Occam's Razor
    • Minimum Message Length (MML)
    • Minimum Description Length (MDL) principle
    • Bayesian Compression (in Deep Learning)
  • Methods for finding estimators
    • Minimum-variance unbiased estimator (MVUE)
    • Best linear unbiased estimator (BLUE)
    • Maximum entropy method (MEM)
    • Method of moments estimator (MME) / Empirical estimate of a moment
    • Maximum likelihood estimator (MLE)
    • Least squares estimator (LSE)
    • Bayesian Approaches / Bayes estimator
    • Maximum a posteriori (MAP) estimator
    • Pseudolikelihood
  • M-estimators
    • MLE
    • Steepest Descent / Gradient Descent
      • Stochastic Gradient Descent (SGD)
      • Generalized Normalised Gradient Descent (GNGD)
      • Hierarchical Gradient Descent (HGD)
      • Normalized Nonlinear Gradient Descent (NNGD)
      • Fully Adaptive NNGD (FANNGD) 
  • Model Order Selection / Model Comparison
    • Akaike Information Criterion (AIC)
    • Bayesian Information Criterion (BIC)
    • Deviance Information Criterion (DIC)
    • Bayesian Predictive Information Criterion (BPIC)
    • Focused Information Criterion (FIC)
    • Minimum Description Length (MDL)
    • Minimum Message Length (MML)
    • Akaike Final Prediction Error (FPE)
    • Parzen's Criterion Autoregressive Transfer Function (CAT)
    • Bayesian Model Selection / Bayesian Model Comparison
    • Cross-Validation
    • Statistical hypothesis testing (for Multilevel Models / Nested Models only)
      • Lagrange multiplier test / (Rao's) Score test
      • Likelihood-ratio test
      • Wald test
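
To make the density-estimation entries above concrete, the following is a minimal sketch of a Gaussian Parzen window / kernel density estimator in NumPy. The function name parzen_kde and the fixed bandwidth are assumptions made for illustration; bandwidth selection (e.g. by cross-validation) is a separate topic.

```python
import numpy as np

def parzen_kde(samples, query_points, bandwidth):
    """Gaussian-kernel density estimate:
    p_hat(x) = (1 / (n * h)) * sum_i phi((x - x_i) / h)."""
    samples = np.asarray(samples, dtype=float)
    query_points = np.asarray(query_points, dtype=float)
    diffs = (query_points[:, None] - samples[None, :]) / bandwidth
    kernel = np.exp(-0.5 * diffs ** 2) / np.sqrt(2.0 * np.pi)
    return kernel.mean(axis=1) / bandwidth

# Example: estimate the density of a bimodal sample on a grid.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2.0, 0.5, 500), rng.normal(1.0, 1.0, 500)])
grid = np.linspace(-5.0, 5.0, 200)
density = parzen_kde(data, grid, bandwidth=0.3)
```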

Adaptive Filtering & Optimal Filtering

See also Digital Signal Processing and Optimization

  • Linear Shift-Invariant (LSI) filter / LTI filter
  • Filtering Problem
  • Smoothing Problem
  • Adaptive filter
    • Time-Variant / Shift-Varying filter structure
    • Criterion of performance
    • Adaptive Algorithm
  • Four major configurations of an adaptive filter
    • System Identification
    • Noise Cancellation
    • Prediction
    • Inverse System Identification
  • Basic filter representations
    • Transversal (direct form) filter structure
    • Symmetric transversal filter structure
    • Lattice filter structure
    • Parallel form filter structure
    • State-space representation
    • Innovations representation / Innovations filter structure
  • FIR filter / Non-recursive adaptive filter
    • Gradient Descent / Steepest Descent
    • Generalized Normalised Gradient Descent (GNGD)
    • Normalized Nonlinear Gradient Descent (NNGD)
    • Fully Adaptive NNGD (FANNGD) 
    • Unnormalized Gradient / Least Mean Squares (LMS) filter
    • Normalized Gradient / Normalized LMS (NLMS) filter (see the code sketch after this list)
    • Leaky LMS filter
    • Gradient Adaptive Lattice (GAL) filter
    • Lattice LMS filter with Joint Process Estimation
    • FIR Wiener filter
  • IIR filter / Recursive adaptive filter
    • Recursive Least Squares (RLS) filter / Forgetting Factor
    • Kernel RLS (KRLS) filter
    • Sliding Window RLS filter
    • IIR Wiener filter
  • Polynomial Regression filters
  • Gaussian Regression filter
  • Online State Estimation / Recursive Estimation (i.e. Density Estimation recursively over time using incremental measurements of the incoming signal)
    • Bayesian Recursive Estimation / Bayes filter
    • Kalman filter (special case of the Bayes filter)
    • Extended Kalman filter (EKF)
    • Unscented Kalman filter (UKF)
    • Iterated EKF
    • Information filter
    • Interacting Multiple Models (IMM) Filter
    • Histogram filter
    • Monte Carlo Methods (Approximation to Bayesian Estimation)
      • Particle filter
  • Optimum filters
    • Eigenfilter
    • Kalman filter
    • Wiener filter
    • Linear Prediction
      • Forward Linear Prediction
      • Backward Linear Prediction
  • Square Root filter / Cholesky Decomposition-based Kalman Filter
    • Exponentially Weighted RLS filter
    • QR Decomposition-based RLS (QRD-RLS)
    • Extended QRD-RLS
    • Inverse QRD-RLS
  • Hierarchical filters
    • Hierarchical Gradient Descent (HGD)
    • Hierarchical LMS (HLMS) filter
  • Adaptive Linear Neuron (ADALINE) filter
  • Complex valued filters
    • Complex LMS (CLMS) filter
  • Iterative Methods in Optimization
    • Expectation-Maximization (EM) Algorithm
    • Gradient Descent / Steepest Descent
    • Levenberg–Marquardt Algorithm
    • Iteratively Reweighted Least Squares
    • Nonlinear Least Squares
    • Krylov Subspace Methods
      • Conjugate Gradient Method
    • Broyden–Fletcher–Goldfarb–Shanno (BFGS) Algorithm 
  • Risk Function / Expected Loss (i.e. Expectation Value of Loss Function)
    • Mean Squared Error (MSE) / Squared Error Loss
    • Absolute error loss
  • Criteria of performance
    • Mean squared error (MSE) / Minimum MSE estimator
    • Mean n-th order error
    • Sum of squared errors (SSE)
    • Mean absolute error
    • Signal-to-Noise Ratio (SNR)
  • Stochastic Optimization
    • Stochastic Approximation
  • Adaptive Control
    • Auto-tuning / Self-tuning
  • Adaptive Array Processing
    • Adaptive beamforming
    • Matched subspace filter / Matched subspace detector
    • Angle estimation
    • Space-time adaptive processing (STAP)
    • MVDR
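
As a worked example of the adaptive-filtering entries above, here is a minimal sketch of a normalized LMS (NLMS) FIR filter in the system-identification configuration. The function name nlms_filter, the step size and the simulated unknown system are illustrative assumptions, not code from the listed references.

```python
import numpy as np

def nlms_filter(x, d, num_taps, mu=0.5, eps=1e-6):
    """Adapt FIR weights w so that w^T x_n tracks the desired signal d[n].
    Returns the filter output y, the error e = d - y and the final weights."""
    n = len(x)
    w = np.zeros(num_taps)
    y = np.zeros(n)
    e = np.zeros(n)
    x_buf = np.zeros(num_taps)               # [x[n], x[n-1], ..., x[n-M+1]]
    for i in range(n):
        x_buf = np.roll(x_buf, 1)
        x_buf[0] = x[i]
        y[i] = np.dot(w, x_buf)
        e[i] = d[i] - y[i]
        # Normalized step: scale by the instantaneous input power (plus eps).
        w += mu * e[i] * x_buf / (np.dot(x_buf, x_buf) + eps)
    return y, e, w

# Example: identify an unknown FIR system h from its noisy output.
rng = np.random.default_rng(2)
h = np.array([0.8, -0.4, 0.2])
x = rng.standard_normal(5000)
d = np.convolve(x, h, mode="full")[:len(x)] + 0.01 * rng.standard_normal(len(x))
_, _, w_hat = nlms_filter(x, d, num_taps=3)   # w_hat should approach h
```

Choosing a smaller mu trades convergence speed for a lower steady-state error, which is the usual LMS/NLMS design trade-off.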

System Identification and Subspace Methods

See also Machine Learning

  • Subspace Identification
    • N4SID
    • MOESP
    • CVA / CCA
    • SSARX
    • Frequency-domain subspace identification
    • Time-domain subspace identification
  • Linear System Identification
    • Linear Grey Box Models / Linear Ordinary Differential Equations
    • Linear Black Box Models (see the ARX code sketch after this list)
      • Transfer Function Model
      • ARMAX Model
      • State-Space Model
      • Frequency-Response Model
      • Output Error Model
      • Box-Jenkins Model
  • Nonlinear Adaptive Filtering / Nonlinear System Identification
    • Blind Deconvolution
      • Unsupervised adaptive filter / Blind equalizer / Blind adaptive filter
      • Bussgang Algorithm
    • Artificial Neural Networks / Adaptive Neural filters
      • Feedforward Neural Network
      • Recurrent Neural Network
    • NARMAX Models
    • Volterra Series Models
    • Block Structured Models
      • Hammerstein systems
      • Wiener systems / Parallel cascade nonlinear systems
  • Subspace Methods / Subspace Learning / Dimensionality Reduction
    • Supervised Learning
      • Feature Selection
    • Unsupervised Learning
      • Singular Value Decomposition (SVD)
      • Principal Components Analysis (PCA)
    • Subspace Tracking
      • Grassmannian Rank-One Update Subspace Estimation (GROUSE)
      • Parallel Estimation and Tracking by REcursive Least Squares (PETRELS)
      • Multiscale Online Union of Subspaces Estimation (MOUSSE)
      • Grassmannian Robust Adaptive Subspace Tracking Algorithm (GRASTA)
      • Online Supervised Dimensionality Reduction (OSDR)
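
For the linear black-box entries above, here is a minimal sketch of identifying an ARX model (a simple special case of the ARMAX family) by ordinary least squares. The function fit_arx and the simulated first-order system are assumptions for illustration only.

```python
import numpy as np

def fit_arx(u, y, na, nb):
    """Estimate the ARX(na, nb) model
        y[n] + a_1 y[n-1] + ... + a_na y[n-na] = b_1 u[n-1] + ... + b_nb u[n-nb] + e[n]
    by stacking the regressors and solving a linear least-squares problem."""
    start = max(na, nb)
    rows = []
    for n in range(start, len(y)):
        past_y = [-y[n - k] for k in range(1, na + 1)]
        past_u = [u[n - k] for k in range(1, nb + 1)]
        rows.append(past_y + past_u)
    Phi = np.array(rows)
    theta, *_ = np.linalg.lstsq(Phi, y[start:], rcond=None)
    return theta[:na], theta[na:]             # (a-coefficients, b-coefficients)

# Example: recover y[n] = 0.7 y[n-1] + 0.5 u[n-1] + noise.
rng = np.random.default_rng(3)
u = rng.standard_normal(2000)
y = np.zeros(2000)
for n in range(1, 2000):
    y[n] = 0.7 * y[n - 1] + 0.5 * u[n - 1] + 0.01 * rng.standard_normal()
a_hat, b_hat = fit_arx(u, y, na=1, nb=1)   # a_hat ~ [-0.7], b_hat ~ [0.5]
```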

Online Courses

Video Lectures


Lecture Notes

Books and Book Chapters

  • Candy, J. V. (2016). Bayesian signal processing: Classical, modern and particle filtering methods. 2nd Ed. John Wiley & Sons.
  • Box, G. E., Jenkins, G. M., Reinsel, G. C., & Ljung, G. M. (2015). Time series analysis: forecasting and control. John Wiley & Sons.
  • Kumar, P. R., & Varaiya, P. (2015). Stochastic systems: Estimation, identification, and adaptive control. SIAM.
  • Haykin, S. (2014). Adaptive filter theory. 5th Ed. Prentice Hall.
  • Tangirala, A. K. (2014). Principles of System Identification: Theory and Practice. CRC Press.
  • Goodwin, G. C., & Sin, K. S. (2014). Adaptive filtering prediction and control. Dover Publications.
  • Åström, K. J., & Wittenmark, B. (2013). Adaptive control. Dover Publications.
  • Gentle, J. E. (2013). Theory of statistics. (link)
  • Van Trees, H. L. (2013). Detection Estimation and Modulation Theory, Part I: Detection, Estimation, and Filtering Theory. 2nd Ed. John Wiley & Sons.
  • Farhang-Boroujeny, B. (2013). Adaptive filters: theory and applications. John Wiley & Sons.
  • Diniz, P. S. (2013). Adaptive filtering. 4th Ed. Springer.
  • Söderström, T. (2013). Discrete-time stochastic systems: estimation and control. 2nd Ed. Springer Science & Business Media.
  • Aster, R. C., Borchers, B., & Thurber, C. (2012). Parameter estimation and inverse problems. Academic Press.
  • Kundu, D., & Nandi, S. (2012). Statistical Signal Processing: Frequency Estimation. Springer.
  • Sayed, A. H. (2011). Adaptive filters. John Wiley & Sons.
  • Adali, T., & Haykin, S. (2010). Adaptive signal processing: next generation solutions. John Wiley & Sons.
  • Hayes, M. H. (2009). Statistical digital signal processing and modeling. John Wiley & Sons.
  • Mandic, D. P., & Goh, V. S. L. (2009). Complex valued nonlinear adaptive filters: noncircularity, widely linear and neural models (Vol. 59). John Wiley & Sons.
  • Kulkarni, V. G. (2009). Modeling and analysis of stochastic systems. 2nd Ed. CRC Press.
  • Porat, B. (2008). Digital processing of random signals: theory and methods. Prentice Hall.
  • Vaseghi, S. V. (2008). Advanced digital signal processing and noise reduction. John Wiley & Sons.
  • Van den Bos, A. (2007). Parameter estimation for scientists and engineers. John Wiley & Sons.
  • Jazwinski, A. H. (2007). Stochastic processes and filtering theory. Dover Publications.
  • Poularikas, A. D., & Ramadan, Z. M. (2006). Adaptive filtering primer with MATLAB. CRC Press.
  • Candy, J. V. (2006). Model-based signal processing. John Wiley & Sons.
  • Haykin, S., Príncipe, J. C., Sejnowski, T. J., & McWhirter, J. (Eds.). (2006). New directions in statistical signal processing: from systems to brain. MIT Press.
  • Manolakis, D. G., Ingle, V. K., & Kogon, S. M. (2005). Statistical and adaptive signal processing: spectral estimation, signal modeling, adaptive filtering, and array processing. Artech House.
  • Anderson, B. D., & Moore, J. B. (2005). Optimal filtering. Dover Publications.
  • Gray, R. M., & Davisson, L. D. (2004). An introduction to statistical signal processing. Cambridge University Press.
  • Lehmann, E. L., & Casella, G. (2003). Theory of point estimation. Springer.
  • Casella, G., & Berger, R. L. (2002). Statistical inference. Cengage Learning.
  • Cichocki, A., & Amari, S. I. (2002). Adaptive blind signal and image processing: learning algorithms and applications. John Wiley & Sons.
  • Nelles, O. (2001). Nonlinear system identification: from classical approaches to neural networks and fuzzy models. Springer Science & Business Media.
  • Moon, T. K., &  Stirling, W. C. (2000). Mathematical methods and algorithms for signal processing. Pearson.
  • Mathews, V. J., & Sicuranza, G. L. (2000). Polynomial signal processing. John Wiley & Sons.
  • Ljung, L. (1999). System identification: theory for the user. PTR Prentice Hall.
  • West, M. (1999). Bayesian forecasting. John Wiley & Sons.
  • Bretthorst, G. L. (1998). Bayesian spectrum analysis and parameter estimation. Springer Science & Business Media.
  • Stoica, P., & Moses, R. L. (1997). Introduction to spectral analysis. Prentice hall.
  • Brown, R. G., & Hwang, P. Y. C. (1996). Introduction to random signals and applied Kalman filtering. 3rd Ed. John Wiley & Sons.
  • Van Overschee, P., & De Moor, B. L. (1996). Subspace identification for linear systems: Theory—Implementation—Applications. Springer.
  • van den Bosch, P. P., & van der Klauw, A. C. (1994). Modeling, identification and simulation of dynamical systems. CRC Press.
  • Poor, H. V. (1994). An introduction to signal detection and estimation. Springer Science & Business Media.
  • Kay, S. M. (1993). Fundamentals of Statistical Signal Processing, Vol I: Estimation Theory. Prentice Hall.
  • Therrien, C. W. (1992). Discrete random signals and statistical signal processing. Prentice Hall PTR.
  • Scharf, L. L. (1991). Statistical signal processing. Addison-Wesley.
  • Kay, S. M. (1988). Modern spectral estimation. Pearson Education.
  • Marple, S. L. (1987). Digital Spectral Analysis. Prentice Hall.
  • Widrow, B., & Stearns, S. D. (1985). Adaptive signal processing. Prentice Hall.

Scholarly Articles

  • Geering, H. P., Dondi, G., Herzog, F., & Keel, S. (2011). Stochastic systems. Course script. (link)
  • Vaidyanathan, P. P. (2007). The theory of linear prediction. Synthesis Lectures on Signal Processing, 2(1), 1-184.
  • de Jesús Rubio, J., & Yu, W. (2007). Nonlinear system identification with recurrent neural networks and dead-zone Kalman filter algorithm. Neurocomputing, 70(13), 2460-2466.
  • De Cock, K., & De Moor, B. (2003). Subspace identification methods. Control Systems, Robotics and Automation, Vol. 1 of 3, pp. 933-979.
  • Parlos, A. G., Menon, S. K., & Atiya, A. (2001). An algorithmic approach to adaptive state filtering using recurrent neural networks. IEEE Transactions on Neural Networks, 12(6), 1411-1432.
  • Griffiths, J. W. R. (1983). Adaptive array processing: A tutorial. IEE Proceedings F: Communications, Radar and Signal Processing, 130(1), 3.
  • Billings, S. A. (1980). Identification of nonlinear systems: a survey. In IEE Proceedings D-Control Theory and Applications (Vol. 127, No. 6, pp. 272-285). IET.
  • Palm, G., & Poggio, T. (1978). Stochastic identification methods for nonlinear systems: an extension of the Wiener theory. SIAM Journal on Applied Mathematics, 34(3), 524-535.

Tutorials

Software

See also

Other Resources