Dirichlet–Jordan test
In mathematics, the Dirichlet–Jordan test gives sufficient conditions for a real-valued, periodic function f to be equal to the sum of its Fourier series at a point of continuity. Moreover, the behavior of the Fourier series at points of discontinuity is determined as well: there the series converges to the midpoint of the left and right limits of the function. It is one of many conditions for the convergence of Fourier series.
The original test was established by Peter Gustav Lejeune Dirichlet in 1829,[1] for monotone functions. It was extended in 1894 by Camille Jordan to functions of bounded variation (any function of bounded variation is the difference of two increasing functions).[2]
Dirichlet–Jordan test for Fourier series
The Dirichlet–Jordan test states[3] that if a periodic function $f$ is of bounded variation on a period, then the Fourier series $S_n f$ converges, as $n \to \infty$, at each point $x$ of the domain to
$$\lim_{n\to\infty} S_n f(x) = \frac{1}{2}\bigl(f(x+0) + f(x-0)\bigr).$$
In particular, if $f$ is continuous at $x$, then the Fourier series converges to $f(x)$. Moreover, if $f$ is continuous everywhere, then the convergence is uniform.
Stated in terms of a periodic function $f$ of period $2\pi$, the Fourier series coefficients are defined as
$$\hat f(k) = \frac{1}{2\pi} \int_{-\pi}^{\pi} f(x)\, e^{-ikx} \, dx,$$
and the partial sums of the Fourier series are
$$S_n f(x) = \sum_{k=-n}^{n} \hat f(k)\, e^{ikx}.$$
The analogous statement holds irrespective of what the period of f is, or which version of the Fourier series is chosen.
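As a numerical illustration (not from the source; the square-wave example and function names below are our own), here is a minimal Python sketch of the theorem. The $2\pi$-periodic square wave $\operatorname{sign}(\sin x)$ has bounded variation on a period, and its Fourier series is $(4/\pi)\sum_{k \text{ odd}} \sin(kx)/k$, which follows from the coefficient formula above; the partial sums converge to the function at points of continuity and to the midpoint of the jump at discontinuities.

```python
import numpy as np

def fourier_partial_sum(x, n):
    """Partial sum S_n f(x) for the square wave sign(sin x).

    Its Fourier series is (4/pi) * sum over odd k of sin(k x)/k,
    so S_n keeps the odd harmonics up to n.
    """
    k = np.arange(1, n + 1, 2)  # odd harmonics only
    return (4 / np.pi) * np.sum(np.sin(np.outer(x, k)) / k, axis=-1)

x = np.array([1.0, 0.0])  # a point of continuity, and the jump at 0
for n in (10, 100, 1000):
    print(n, fourier_partial_sum(x, n))
# At x = 1 the sums approach f(1) = 1; at the jump x = 0 every partial
# sum equals 0, the midpoint of the one-sided limits -1 and +1.
```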
There is also a pointwise version of the test:[4] if $f$ is a periodic function in $L^1$, and is of bounded variation in a neighborhood of $x$, then the Fourier series at $x$ converges to the limit as above,
$$\lim_{n\to\infty} S_n f(x) = \frac{1}{2}\bigl(f(x+0) + f(x-0)\bigr).$$
Jordan test for Fourier integrals
For the Fourier transform on the real line, there is a version of the test as well.[5] Suppose that $f$ is in $L^1(\mathbb{R})$ and of bounded variation in a neighborhood of the point $x$. Then
$$\frac{1}{\pi} \lim_{M\to\infty} \int_0^M \int_{-\infty}^{\infty} f(y) \cos\bigl(t(x-y)\bigr) \, dy \, dt = \frac{1}{2}\bigl(f(x+0) + f(x-0)\bigr).$$
If $f$ is continuous in an open interval, then the integral on the left-hand side converges uniformly in the interval, and the limit on the right-hand side is $f(x)$.
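A hedged numerical sketch of this statement, again in Python (the indicator-function example and the closed-form inner integral are our own choices, not from the source): take $f$ to be the indicator function of $[-1, 1]$, which is in $L^1(\mathbb{R})$ and of bounded variation everywhere. The inner integral evaluates by hand to $(\sin(t(x+1)) - \sin(t(x-1)))/t$, so the truncated inversion integral reduces to sine integrals.

```python
import numpy as np
from scipy.special import sici  # sine integral Si(z) = int_0^z sin(u)/u du

def truncated_inversion(x, M):
    """(1/pi) * int_0^M int f(y) cos(t(x-y)) dy dt for f = 1 on [-1, 1].

    The inner integral is (sin(t(x+1)) - sin(t(x-1))) / t, so the outer
    integral is Si(M(x+1)) - Si(M(x-1)) up to the 1/pi factor.
    """
    si_plus, _ = sici(M * (x + 1))
    si_minus, _ = sici(M * (x - 1))
    return (si_plus - si_minus) / np.pi

for M in (10, 100, 1000):
    print(M, truncated_inversion(0.0, M), truncated_inversion(1.0, M))
# As M grows, the value at x = 0 tends to f(0) = 1, while at the jump
# x = 1 it tends to the midpoint (f(1+0) + f(1-0)) / 2 = 1/2.
```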
Dirichlet conditions in signal processing
In signal processing, the test is often formulated as a pair of conditions on a signal, sometimes called the Dirichlet conditions. First, the signal should be absolutely integrable (that is, $\int |f(t)| \, dt < \infty$), which guarantees the existence of the Fourier series coefficients; and second, the signal should be bounded, with only finitely many local extrema and finitely many discontinuities in any one period.[6] Such a signal is easily shown to be of bounded variation (the converse is not true). Any signal that can be physically produced in a laboratory satisfies these conditions.[7]
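To make the connection to bounded variation concrete, here is a small Python sketch (our own illustration; the helper name and the test signals are assumptions, not from the source). It estimates the total variation of two sampled signals: a square wave, which has finitely many extrema and jumps per period, and the classic counterexample $\sin(1/t)$, which is bounded and continuous on $(0, 1]$ but not of bounded variation.

```python
import numpy as np

def total_variation(samples):
    """Discrete total variation: sum of |x[k+1] - x[k]| over the samples.

    For a signal with finitely many extrema and discontinuities this
    stays bounded as the sampling grid is refined.
    """
    return np.sum(np.abs(np.diff(samples)))

t = np.linspace(1e-4, 1.0, 200_000)
for samples, name in [(np.sign(np.sin(2 * np.pi * 5 * t)), "square wave"),
                      (np.sin(1.0 / t), "sin(1/t)")]:
    print(name, total_variation(samples))
# The square wave's variation stays bounded (about 2 per jump), while the
# estimate for sin(1/t) grows without bound as the grid is refined,
# reflecting its infinitely many oscillations near t = 0.
```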
References
- ^ Dirichlet (1829), "Sur la convergence des séries trigonométriques qui servent à représenter une fonction arbitraire entre des limites données", J. Reine Angew. Math., 4: 157–169
- ^ Camille Jordan (1894), Cours d'analyse de l'École Polytechnique, t. 2: Calcul intégral, Gauthier-Villars, Paris
- ^ Antoni Zygmund (1952), Trigonometric series, Cambridge University Press, p. 57
- ^ R. E. Edwards (1967), Fourier series: a modern introduction, Springer, p. 156.
- ^ E. C. Titchmarsh (1948), Introduction to the theory of Fourier integrals, Oxford Clarendon Press, p. 13.
- ^ Alan V. Oppenheim; Alan S. Willsky; Syed Hamid Nawab (1997), Signals & Systems, Prentice Hall, p. 198, ISBN 9780136511755
- ^ B. P. Lathi (2000), Signal Processing and Linear Systems, Oxford University Press