Spectral density
In statistical signal processing and physics, the spectral density (SD), power spectral density (PSD), and energy spectral density (ESD) are positive real functions of a frequency variable associated with a stationary stochastic process or with a deterministic function of time. They have dimensions of, respectively, (unit²) per Hz, power per Hz, and energy per Hz. The spectral density is often called simply the spectrum of the signal. Intuitively, the spectral density captures the frequency content of a stochastic process and helps identify periodicities and noise processes.
Explanation and examples
In physics, typical signals s(t) for which a spectral density (SD) is measured are: a voltage v(t) in volts, an electric current i(t) in amperes, the frequency deviation ν(t) of an oscillator with respect to another in hertz, or the relative phase φ(t) between two oscillators in radians (a non-exhaustive list). These examples give rise to SDs, respectively, S_v(f) in V²/Hz, S_i(f) in A²/Hz, S_ν(f) in Hz²/Hz, and S_φ(f) in rad²/Hz.
For a signal representing a wave, such as an electromagnetic wave, a random vibration, or an acoustic wave, the spectral density of the wave, when multiplied by an appropriate factor, gives the power carried by the wave per unit frequency; this is known as the power spectral density (PSD) of the signal. For example, a voltage signal v(t) applied to a 50 ohm resistor has an associated PSD given by its SD divided by 50 ohms. Power spectral density is commonly expressed in watts per hertz (W/Hz)[1] or in dBm/Hz.
For voltage signals, it is customary to use units of V² Hz⁻¹ for PSD, and V² s Hz⁻¹ for ESD[2], or dBμV/Hz.
Although it is not necessary to assign physical dimensions to the signal or its argument, in the following discussion the terms used will assume that the signal varies in time.
Definition
Spectral density
The spectral density (SD) S_s(f) of a (real) signal s(t) is the square of the modulus of its Fourier transform.
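Written out (a sketch assuming the Fourier transform of s(t) exists, at least in the distributional sense, and using the same cycle-frequency convention as the Wiener–Khinchin formula below):

<math>
S_s(f) = \left| \int_{-\infty}^{\infty} s(t)\, e^{-2\pi i f t}\, dt \right|^2.
</math>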
If the signal s(t) is in a given unit u, its SD S_s(f) is in units of u²/Hz, often expressed in logarithmic units dB(u²/Hz).
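As a worked example of the logarithmic form (not from the original article): a voltage-noise PSD of 10⁻¹² V²/Hz corresponds to 10 log₁₀(10⁻¹²) = -120 dB(V²/Hz); the factor 10 (rather than 20) is used because the quantity is already a squared, power-like one.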
From the Wiener–Khinchin theorem, the SD is also the Fourier transform of the autocorrelation function R(τ) of the signal, if the signal can be treated as a wide-sense stationary random process.[3]
This results in the formula

<math>
S_s(f) = \int_{-\infty}^{\infty} R(\tau)\, e^{-2\pi i f \tau}\, d\tau.
</math>
Since the signal s(t) is real, its autocorrelation R(τ) is even and real. The (two-sided) SD S_s(f) is therefore an even (and real) function of f. Without loss of information, a widely used concept known as the single-sided PSD is defined for f > 0 as

<math>
S_s^{SS}(f) = S_s(f) + S_s(-f) = 2\, S_s(f).
</math>

The power of the signal in a given frequency band [F₁, F₂] can be calculated by integrating over positive and negative frequencies,

<math>
P = \int_{F_1}^{F_2} S_s(f)\, df + \int_{-F_2}^{-F_1} S_s(f)\, df = \int_{F_1}^{F_2} S_s^{SS}(f)\, df.
</math>
The spectral density is often (loosely) called the power spectral density, since in practice it is usually measured with a physical device (spectrum analyzer, FFT analyzer, optical spectrum analyzer, etc.) that effectively measures the integrated power in a given band of the spectrum.
Energy spectral density
The energy spectral density describes how the energy (or variance) of a signal or a time series is distributed with frequency. If f(t) is a finite-energy (square-integrable) signal, the spectral density of the signal is the square of the magnitude of the continuous Fourier transform of the signal (here energy is taken as the integral of the square of the signal, which is the same as physical energy if the signal is a voltage applied to a 1-ohm load).
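One common way to write this (a sketch; the symbols Φ(ω), f(t), F(ω) and the 1/(2π) factor follow one particular normalization convention, as discussed below):

<math>
\Phi(\omega) = \left| \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} f(t)\, e^{-i\omega t}\, dt \right|^2 = \frac{F(\omega)\, F^*(\omega)}{2\pi},
</math>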
where ω is the angular frequency (2π times the cycle frequency), F(ω) is the continuous Fourier transform of f(t), and F*(ω) is its complex conjugate.
If the signal is discrete with values f_n, over an infinite number of elements, we still have an energy spectral density:
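With the same convention as in the continuous case (again a sketch, not a unique choice of normalization):

<math>
\Phi(\omega) = \left| \frac{1}{\sqrt{2\pi}} \sum_{n=-\infty}^{\infty} f_n\, e^{-i\omega n} \right|^2 = \frac{F(\omega)\, F^*(\omega)}{2\pi},
</math>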
where F(ω) is the discrete-time Fourier transform of f_n.
If the number of defined values is finite, the sequence does not have an energy spectral density per se, but the sequence can be treated as periodic, using a discrete Fourier transform to make a discrete spectrum, or it can be extended with zeros and a spectral density can be computed as in the infinite-sequence case.
The continuous and discrete spectral densities are often denoted with the same symbols, as above, though their dimensions and units differ; the continuous case has a time-squared factor that the discrete case does not have. They can be made to have equal dimensions and units by measuring time in units of sample intervals or by scaling the discrete case to the desired time units.
As is always the case, the multiplicative factor of 1/(2π) is not absolute, but rather depends on the particular normalizing constants used in the definition of the various Fourier transforms.
Power spectral density
The above definitions of energy spectral density require that the signal has finite energy, that is, that the signal is square-integrable or square-summable. An often more useful alternative is the power spectral density (PSD), which describes how the power of a signal or time series is distributed with frequency. Here power can be the actual physical power, or more often, for convenience with abstract signals, can be defined as the squared value of the signal, that is, as the actual power dissipated if the signal were a voltage applied to a 1-ohm load. This instantaneous power (the mean or expected value of which is the average power) is then given by:
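In symbols, restating the convention just described (squared signal value, i.e. the power delivered to a 1-ohm load):

<math>
P(t) = s(t)^2.
</math>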
In the context of distributions (mathematical objects generalizing the concept of a function), even non-square-integrable functions have well-defined Fourier transforms. The spectral density (SD) S_s(f) of a (real) signal s(t) is then still the square of the modulus of its Fourier transform.
If the signal s(t) is in a given unit u, its PSD S_s(f) is in units of u²/Hz, often expressed in logarithmic units dB(u²/Hz).
The power spectral density of a signal exists if and only if the signal is a wide-sense stationary process. If the signal is not stationary, then the autocorrelation function must be a function of two variables, so no SD exists, but similar techniques may be used to estimate a time-varying spectral density.
The power spectrum is defined as[4]
Estimation
The goal of spectral density estimation is to estimate the spectral density of a random signal from a sequence of time samples. Depending on what is known about the signal, estimation techniques can involve parametric or non-parametric approaches, and may be based on time-domain or frequency-domain analysis. For example, a common parametric technique involves fitting the observations to an autoregressive model. A common non-parametric technique is the periodogram.
The spectral density is usually estimated using Fourier transform methods, but other techniques such as Welch's method and the maximum entropy method can also be used.
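For a concrete illustration of the non-parametric techniques mentioned above, the sketch below (not part of the original article; it assumes the NumPy and SciPy libraries) estimates the single-sided PSD of a noisy sine wave with both a periodogram and Welch's method:

<syntaxhighlight lang="python">
# A minimal sketch: estimating the PSD of a 50 Hz tone buried in white noise.
import numpy as np
from scipy import signal

fs = 1000.0                                  # sampling frequency, Hz
t = np.arange(0, 10, 1 / fs)                 # 10 s of samples
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.randn(t.size)

# Periodogram: squared magnitude of the Fourier transform, scaled to V^2/Hz.
f_per, Pxx_per = signal.periodogram(x, fs=fs, scaling='density')

# Welch's method: average the periodograms of overlapping, windowed segments
# to reduce the variance of the estimate at the cost of frequency resolution.
f_w, Pxx_w = signal.welch(x, fs=fs, nperseg=1024, scaling='density')

print("Periodogram peak near %.1f Hz" % f_per[np.argmax(Pxx_per)])
print("Welch estimate peak near %.1f Hz" % f_w[np.argmax(Pxx_w)])
</syntaxhighlight>

Both routines return the single-sided density for real input, so integrating the estimate over frequency approximates the mean square value of the signal.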
Properties
- The spectral density of a signal and the autocorrelation of that signal form a Fourier transform pair (for PSD versus ESD, different definitions of the autocorrelation function are used).
- One of the results of Fourier analysis is Parseval's theorem, which states that the area under the energy spectral density curve is equal to the area under the square of the magnitude of the signal, i.e. the total energy (see the formula following this list).
- The above theorem holds true in the discrete cases as well. A similar result holds for the total power in a power spectral density being equal to the corresponding mean total signal power, which is the autocorrelation function at zero lag.
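In symbols, Parseval's theorem for a finite-energy signal f(t) with energy spectral density Φ(ω) as defined above reads (a sketch; the exact form depends on the Fourier normalization chosen):

<math>
\int_{-\infty}^{\infty} |f(t)|^2\, dt = \int_{-\infty}^{\infty} \Phi(\omega)\, d\omega.
</math>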
Related concepts
- Most "frequency" graphs really display only the spectral density. Sometimes the complete frequency spectrum is graphed in 2 parts, "amplitude" versus frequency (which is the spectral density) and "phase" versus frequency (which contains the rest of the information from the frequency spectrum). The signal can be recovered from complete frequency spectrum. Note that the signal cannot be recovered from the spectral density part alone — the "temporal information" is lost.
- The spectral centroid of a signal is the midpoint of its spectral density function, i.e. the frequency that divides the distribution into two equal parts.
- The spectral edge frequency of a signal is an extension of the previous concept to any proportion instead of two equal parts.
- Spectral density is a function of frequency, not a function of time. However, the spectral density of small "windows" of a longer signal may be calculated, and plotted versus time associated with the window. Such a graph is called a spectrogram. This is the basis of a number of spectral analysis techniques such as the short-time Fourier transform and wavelets.
Applications
Electronics engineering
The concept and use of the power spectrum of a signal is fundamental in electronic engineering, especially in electronic communication systems (radio & microwave communications, radars, and related systems). Much effort has been made and millions of dollars spent on developing and producing electronic instruments called "spectrum analyzers" for aiding electronics engineers, technologists, and technicians in observing and measuring the power spectrum of electronic signals. The cost of a spectrum analyzer varies according to its bandwidth and its accuracy. The top quality instruments cost over $100,000.
A spectrum analyzer essentially measures the magnitude of the short-time Fourier transform (STFT) of an input signal. If the signal being analyzed is stationary, the STFT is a good smoothed estimate of its power spectral density.
Colorimetry
The spectrum of a light source is a measure of the power carried by each frequency or "color" in a light source. The light spectrum is usually measured at points (often 31) along the visible spectrum, in wavelength space instead of frequency space, which makes it not strictly a spectral density. Some spectrophotometers can measure increments as fine as 1 or 2 nanometers. Values are used to calculate other specifications and then plotted to demonstrate the spectral attributes of the source. This can be a helpful tool in analyzing the color characteristics of a particular source.
See also
- Spectral efficiency
- Noise spectral density
- Colors of noise
- Spectral leakage
- Window function
- Frequency domain
- Frequency spectrum
- Bispectrum
- Spectral density estimation
References
- ^ Gérard Maral (2003). VSAT Networks. John Wiley and Sons.
- ^ Michael Peter Norton and Denis G. Karczub (2003). Fundamentals of Noise and Vibration Analysis for Engineers. Cambridge University Press. ISBN 0521499135.
- ^ Dennis Ward Ricker (2003). Echo Signal Processing. Springer. ISBN 140207395X.
- ^ Wilbur B. Davenport and William L. Root (1987). An Introduction to the Theory of Random Signals and Noise. IEEE Press, New York. ISBN 0-87942-235-1.