Wiener–Khinchin theorem
The Wiener–Khinchin theorem (also known as the Wiener–Khintchine theorem and sometimes as the Wiener–Khinchin–Einstein theorem or the Khinchin–Kolmogorov theorem) states that the power spectral density of a wide-sense-stationary random process is the Fourier transform of the corresponding autocorrelation function.[1][2][3][4]
History
Norbert Wiener first published this theorem in 1930, and Aleksandr Khinchin did so independently in 1934. Albert Einstein had probably anticipated the idea in a brief two-page memo in 1914.[5]
Continuous case
For the continuous case, the power spectral density of <math>x(t)</math> is:

:<math>S_{xx}(f) = \int_{-\infty}^{\infty} r_{xx}(\tau) \, e^{-i 2\pi f \tau} \, d\tau</math>

where

:<math>r_{xx}(\tau) = \operatorname{E}\big[\, x(t)x^*(t-\tau) \, \big]</math>

is the autocorrelation function (sometimes called autocovariance) defined in terms of statistical expected value. Note that the autocorrelation function is defined in terms of the expected value of a product, and that the Fourier transform of <math>x(t)</math> does not exist in general, because stationary random functions are not generally square integrable.
The asterisk denotes complex conjugate, and can be omitted if the random process is real-valued.
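Because the relation is a Fourier-transform pair, the autocorrelation function can in turn be recovered from the spectral density by the inverse transform. This converse direction is not spelled out in this revision, but it follows directly:

:<math>r_{xx}(\tau) = \int_{-\infty}^{\infty} S_{xx}(f) \, e^{i 2\pi f \tau} \, df</math>

so that in particular <math>r_{xx}(0) = \operatorname{E}\big[\,|x(t)|^2\,\big] = \int_{-\infty}^{\infty} S_{xx}(f) \, df</math>: the area under the power spectral density is the total average power of the process.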
Discrete case
For the discrete case, the power spectral density of the function with discrete values <math>x[n]</math> is:

:<math>S_{xx}(f) = \sum_{k=-\infty}^{\infty} r_{xx}[k] \, e^{-i 2\pi k f}</math>

where

:<math>r_{xx}[k] = \operatorname{E}\big[\, x[n]x^*[n-k] \, \big]</math>

is the discrete autocorrelation function of <math>x[n]</math>. Because <math>x[n]</math> is a sampled, discrete-time sequence, the spectral density is periodic in the frequency domain.
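As a concrete illustration of the discrete form (not part of the article), the sketch below checks the finite-sample counterpart of this relation: the periodogram of a single realization equals the discrete Fourier transform of its biased sample autocorrelation. The example process, filter coefficients, and lengths are arbitrary illustrative choices.

<syntaxhighlight lang="python">
# Minimal numerical sketch (not from the article): the periodogram of a finite
# realization equals the Fourier transform of its biased sample autocorrelation,
# the finite-sample counterpart of the discrete Wiener-Khinchin relation.
import numpy as np

rng = np.random.default_rng(0)
N = 256
# Illustrative WSS process (an arbitrary choice): white noise through a short FIR filter.
x = np.convolve(rng.standard_normal(N + 2), [1.0, 0.5, 0.25], mode="valid")

# Biased sample autocorrelation r[k], k = 0..N-1, estimating E[x[n] x[n-k]].
r = np.correlate(x, x, mode="full")[N - 1:] / N

# Periodogram of x on an M-point frequency grid, M = 2N - 1 (zero-padded FFT).
M = 2 * N - 1
periodogram = np.abs(np.fft.fft(x, M)) ** 2 / N

# Arrange the two-sided autocorrelation (lags 0..N-1, then -(N-1)..-1) and transform.
r_two_sided = np.concatenate([r, r[:0:-1]])
psd_from_acf = np.fft.fft(r_two_sided).real

print(np.allclose(periodogram, psd_from_acf))  # True, up to rounding error
</syntaxhighlight>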
Application
The theorem is useful for analyzing linear time-invariant (LTI) systems when the inputs and outputs are not square integrable, so their Fourier transforms do not exist. A corollary is that the Fourier transform of the autocorrelation function of the output of an LTI system is equal to the product of the Fourier transform of the autocorrelation function of the input of the system and the squared magnitude of the Fourier transform of the system impulse response.[6] This works even when the Fourier transforms of the input and output signals do not exist because these signals are not square integrable, so the system inputs and outputs cannot be directly related by the Fourier transform of the impulse response.
Since the Fourier transform of the autocorrelation function of a signal is the power spectrum of the signal, this corollary is equivalent to saying that the power spectrum of the output is equal to the power spectrum of the input times the power transfer function.
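In symbols (the notation is introduced here and is not used elsewhere in this revision): writing <math>S_{xx}</math> and <math>S_{yy}</math> for the input and output power spectra and <math>H</math> for the Fourier transform of the impulse response, the corollary reads

:<math>S_{yy}(f) = |H(f)|^2 \, S_{xx}(f).</math>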
This corollary is used in the parametric method for power spectrum estimation.
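The article does not name a specific parametric method, but one common choice is the autoregressive (Yule–Walker) estimator, sketched below as an illustration only: an AR model is white noise driven through an all-pole LTI filter, so its power spectrum is the driving-noise variance times the squared magnitude of the filter's frequency response, exactly the corollary above. The function name and parameters are hypothetical.

<syntaxhighlight lang="python">
# Hedged sketch of one parametric PSD estimator (autoregressive / Yule-Walker);
# the function name and defaults are illustrative choices, not from the article.
import numpy as np
from scipy.linalg import solve_toeplitz

def yule_walker_psd(x, order, nfreq=512):
    """Fit an AR(order) model to the sample autocorrelation and return its PSD."""
    n = len(x)
    # Biased sample autocorrelation at lags 0..order.
    r = np.array([np.dot(x[: n - k], x[k:]) / n for k in range(order + 1)])
    # Yule-Walker equations: Toeplitz(r[0..order-1]) a = r[1..order].
    a = solve_toeplitz(r[:-1], r[1:])
    sigma2 = r[0] - np.dot(a, r[1:])        # variance of the driving white noise
    freqs = np.linspace(0.0, 0.5, nfreq)    # normalized frequency, cycles/sample
    # A(f) = 1 - sum_k a_k e^{-i 2 pi f k}; the AR spectrum is sigma2 / |A(f)|^2,
    # i.e. the power transfer function of the all-pole filter times the (flat)
    # noise spectrum.
    k = np.arange(1, order + 1)
    A = 1.0 - np.exp(-2j * np.pi * np.outer(freqs, k)) @ a
    return freqs, sigma2 / np.abs(A) ** 2
</syntaxhighlight>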
Discrepancy of definition
By the definitions involving infinite integrals in the articles on spectral density and autocorrelation, the Wiener–Khinchin theorem is a simple Fourier transform pair, trivially provable for any square integrable function, i.e. for functions whose Fourier transforms exist. More usefully, and historically, the theorem applies to wide-sense-stationary random processes, signals whose Fourier transforms do not exist, using the definition of autocorrelation function in terms of expected value rather than an infinite integral. This trivialization of the Wiener–Khinchin theorem is commonplace in modern technical literature, and obscures the contributions of Aleksandr Yakovlevich Khinchin, Norbert Wiener, and Andrey Kolmogorov.
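To make the contrast explicit (the formulas below merely restate, in this article's notation, the two definitions referred to above): for a square-integrable function with Fourier transform <math>X(f)</math>, the pair is simply

:<math>r_{xx}(\tau) = \int_{-\infty}^{\infty} x(t) x^*(t-\tau) \, dt \quad\Longleftrightarrow\quad S_{xx}(f) = |X(f)|^2,</math>

which is the elementary Fourier pair described as trivial above, whereas the Wiener–Khinchin theorem proper uses the ensemble average <math>r_{xx}(\tau) = \operatorname{E}\big[\, x(t)x^*(t-\tau) \, \big]</math>, which exists for wide-sense-stationary processes even when <math>X(f)</math> does not.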
Notes
- ^ Dennis Ward Ricker (2003). Echo Signal Processing. Springer. ISBN 1-4020-7395-X.
- ^ Leon W. Couch II (2001). Digital and Analog Communications Systems (sixth ed.). Prentice Hall, New Jersey. pp. 406–409. ISBN 0-13-522583-3.
- ^ Krzysztof Iniewski (2007). Wireless Technologies: Circuits, Systems, and Devices. CRC Press. ISBN 0-8493-7996-2.
- ^ Joseph W. Goodman (1985). Statistical Optics. Wiley-Interscience. ISBN 0-471-01502-4.
- ^ Jerison, David; Singer, Isadore Manuel; Stroock, Daniel W. (1997). The Legacy of Norbert Wiener: A Centennial Symposium (Proceedings of Symposia in Pure Mathematics). American Mathematical Society. p. 95. ISBN 0-8218-0415-4.
- ^ Shlomo Engelberg (2007). Random signals and noise: a mathematical introduction. CRC Press. p. 130. ISBN 978-0-8493-7554-5.