ECE 513 - Vector Space Signal Processing

Official Description

Mathematical tools in a vector space framework, including: finite and infinite dimensional vector spaces, Hilbert spaces, orthogonal projections, subspace techniques, least-squares methods, matrix decomposition, conditioning and regularizations, bases and frames, the Hilbert space of random variables, random processes, iterative methods; applications in signal processing, including inverse problems, filter design, sampling, interpolation, sensor array processing, and signal and spectral estimation. Course Information: Prerequisite: ECE 310, ECE 313, and MATH 415.

Prerequisites

Credit in ECE 313 or STAT 410
Credit in ECE 310
Credit in MATH 415

Subject Area

Signal Processing

Description

Fundamentals of linear least squares estimation of discrete-time signals and their spectra: minimum-norm least squares and total least squares solutions; singular value decomposition; Wiener and Kalman filtering; autoregressive spectral analysis; and the maximum entropy method.
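As a minimal illustration of the minimum-norm least squares idea above, the following NumPy sketch (with arbitrary illustrative data, not course material) computes the minimum-norm solution of an underdetermined system via the Moore-Penrose pseudoinverse:

```python
import numpy as np

# Underdetermined system Ax = b: infinitely many exact solutions exist,
# and the Moore-Penrose pseudoinverse (computed from the SVD) selects
# the one of minimum Euclidean norm.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])    # 2 equations, 3 unknowns
b = np.array([1.0, 2.0])

x_mn = np.linalg.pinv(A) @ b       # minimum-norm least squares solution

# A has full row rank, so x_mn solves Ax = b exactly.
assert np.allclose(A @ x_mn, b)

# Any other exact solution (x_mn plus a null-space component) is longer,
# since x_mn lies in the row space of A, orthogonal to the null space.
_, _, Vt = np.linalg.svd(A)
null_vec = Vt[-1]                  # basis vector for the null space of A
x_other = x_mn + 0.7 * null_vec
assert np.allclose(A @ x_other, b)
assert np.linalg.norm(x_mn) < np.linalg.norm(x_other)
```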

Topics

  • Matrix inversion: orthogonal projections; left and right inverses; minimum-norm least squares solutions; Moore-Penrose pseudoinverse; regularization; singular value decomposition; Eckart-Young theorem; total least squares; principal component analysis
  • Projections in Hilbert space: Hilbert space; projection theorem; normal equations; approximation and Fourier series; pseudoinverse operators; application to extrapolation of bandlimited sequences
  • Hilbert space of random variables: spectral representation of discrete-time stochastic processes; spectral factorization; linear minimum-variance estimation; discrete-time Wiener filter; innovations representation; Wold decomposition; Gauss-Markov theorem; sequential least squares; discrete-time Kalman filter
  • Power spectrum estimation: system identification; Prony's linear prediction method; Fourier and other nonparametric methods of spectrum estimation; resolution limits and model-based methods; autoregressive models and the maximum entropy method; Levinson's algorithm; lattice filters; harmonic retrieval by Pisarenko's method; direction finding with passive multi-sensor arrays
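The autoregressive-modeling thread in the last bullet (Levinson's algorithm, maximum entropy spectra) can be sketched in a few lines. This is an illustrative implementation of the Levinson-Durbin recursion, not the course's own code; it is checked against the exact autocorrelation of a synthetic AR(1) process:

```python
import numpy as np

def levinson_durbin(r, p):
    """Solve the AR(p) Yule-Walker normal equations by the
    Levinson-Durbin recursion, given autocorrelations r[0..p].
    Returns coefficients a (a[0] = 1, with the convention
    x[n] + a[1] x[n-1] + ... + a[p] x[n-p] = e[n]) and the
    final prediction-error power."""
    a = np.zeros(p + 1)
    a[0] = 1.0
    err = r[0]
    for m in range(1, p + 1):
        # Reflection coefficient from the current forward-prediction error.
        k = -(r[m] + np.dot(a[1:m], r[m-1:0:-1])) / err
        a_next = a.copy()
        a_next[1:m] = a[1:m] + k * a[m-1:0:-1]
        a_next[m] = k
        a = a_next
        err *= 1.0 - k * k
    return a, err

# Exact autocorrelation of the AR(1) process x[n] = 0.5 x[n-1] + e[n]
# with unit-variance white e: r[k] = (4/3) * 0.5**k.
r = (4.0 / 3.0) * 0.5 ** np.arange(3)
a, err = levinson_durbin(r, 2)
# The recursion recovers the true model: a = [1, -0.5, 0], error power 1.
assert np.allclose(a, [1.0, -0.5, 0.0])
assert np.isclose(err, 1.0)
```

The maximum entropy (AR) spectral estimate then follows by evaluating err / |A(e^{j*omega})|^2, where A is the polynomial with coefficients a.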

Texts

Class notes.

Recommended:
B. Porat, Digital Processing of Random Signals, Prentice-Hall, 1994.

Last updated

2/13/2013