Eaton's inequality

From Wikipedia, the free encyclopedia
In [[probability theory]], '''Eaton's inequality''' is a bound on the largest values of a linear combination of bounded [[random variables]]. This inequality was described in 1974 by Morris L. Eaton.<ref name=Eaton1974>Eaton, Morris L. (1974) "A probability inequality for linear combinations of bounded random variables." ''Annals of Statistics'' 2(3) 609–614</ref>


==Statement of the inequality==


Let {''X''<sub>''i''</sub>} be a set of real independent random variables, each with an [[expected value]] of zero and bounded in absolute value by 1 (|''X''<sub>''i''</sub>| ≤ 1, for 1 ≤ ''i'' ≤ ''n''). The variates do not have to be identically or symmetrically distributed. Let {''a''<sub>''i''</sub>} be a set of ''n'' fixed real numbers with


: <math> \sum_{ i = 1 }^n a_i^2 = 1 .</math>

Eaton showed that

: <math> P \left( \left| \sum_{ i = 1 }^n a_i X_i \right| \ge k \right) \le 2 \inf_{ 0 \le c < k } \int_c^\infty \left( \frac{ z - c }{ k - c } \right)^3 \phi( z ) \, dz = 2 B_E( k ) , </math>

where ''φ''(''x'') is the [[probability density function]] of the [[standard normal distribution]].

A related bound is Edelman's{{citation needed|date=April 2013}}

: <math> P \left( \left| \sum_{ i = 1 }^n a_i X_i \right| \ge k \right) \le 2 \left( 1 - \Phi\left( k - \frac{ 1.5 }{ k } \right) \right) , </math>

where Φ(''x'') is the [[cumulative distribution function]] of the standard normal distribution.

Pinelis has shown that Eaton's bound can be sharpened:<ref name=Pinelis1994>Pinelis, I. (1994) "Extremal probabilistic problems and Hotelling's ''T''<sup>2</sup> test under a symmetry condition." ''Annals of Statistics'' 22(1), 357–368</ref>


: <math> B_{ EP } = \min\{ 1, k^{ -2 }, 2 B_E \} </math>


A set of critical values for Eaton's bound has been determined.<ref name=Dufour1993>Dufour, J-M; Hallin, M (1993) "Improved Eaton bounds for linear combinations of bounded random variables, with statistical applications", ''Journal of the American Statistical Association'', 88(243) 1026–1033</ref>
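The Eaton and Pinelis bounds can be evaluated numerically. The sketch below assumes the integral form ''B''<sub>''E''</sub>(''k'') = inf<sub>0≤''c''<''k''</sub> ∫<sub>''c''</sub><sup>∞</sup> ((''z'' − ''c'')/(''k'' − ''c''))<sup>3</sup> ''φ''(''z'') d''z'' from Eaton's 1974 paper; the function names and numerical parameters are illustrative, not from any source.

```python
import math

def phi(z):
    """Density of the standard normal distribution."""
    return math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi)

def eaton_B_E(k, c_grid=200, z_steps=400, z_max=10.0):
    """Approximate B_E(k) = inf over 0 <= c < k of
    integral_c^infinity ((z - c)/(k - c))^3 phi(z) dz,
    via a grid search over c and the trapezoidal rule in z."""
    best = float("inf")
    for i in range(c_grid):
        c = k * i / c_grid          # candidate truncation point, 0 <= c < k
        h = (z_max - c) / z_steps
        total = 0.0
        for j in range(z_steps + 1):
            z = c + j * h
            weight = 0.5 if j in (0, z_steps) else 1.0
            total += weight * ((z - c) / (k - c)) ** 3 * phi(z)
        best = min(best, total * h)
    return best

def pinelis_bound(k):
    """Pinelis's sharpened bound B_EP = min{1, k^-2, 2 B_E}."""
    return min(1.0, k ** -2, 2.0 * eaton_B_E(k))
```

For small ''k'' the trivial bound 1 dominates, while for larger ''k'' the ''k''<sup>−2</sup> and 2''B''<sub>''E''</sub> terms do the real work.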


==Related inequalities==


Let {''a''<sub>''i''</sub>} be a set of independent [[Rademacher distribution|Rademacher random variables]], with ''P''( ''a''<sub>''i''</sub> = 1 ) = ''P''( ''a''<sub>''i''</sub> = −1 ) = 1/2. Let ''Z'' be a normally distributed variate with [[mean]] 0 and [[variance]] 1. Let {''b''<sub>''i''</sub>} be a set of ''n'' fixed real numbers such that


: <math> \sum_{ i = 1 }^n b_i^2 = 1 .</math>


This last condition is required by the [[Riesz–Fischer theorem]], which states that


:<math> a_1 b_1 + \cdots + a_n b_n </math>


will converge if and only if

: <math> \sum b_i^2 </math>

is finite.
Then


: <math> E f( a_1 b_1 + \cdots + a_n b_n ) \le E f( Z ) </math>


for ''f''(''x'') = |''x''|<sup>''p''</sup>. The case ''p'' ≥ 3 was proved by Whittle<ref name=Whittle1960>Whittle P (1960) Bounds for the moments of linear and quadratic forms in independent variables. Teor Verojatnost i Primenen 5: 331–335 MR0133849</ref> and the case ''p'' ≥ 2 by Haagerup.<ref name=Haagerup1982>Haagerup U (1982) The best constants in the Khinchine inequality. Studia Math 70: 231–283 MR0654838</ref>
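For small ''n'' the moment comparison can be verified exactly by enumerating all 2<sup>''n''</sup> sign patterns, using the standard closed form ''E''|''Z''|<sup>''p''</sup> = 2<sup>''p''/2</sup> Γ((''p''+1)/2)/√π. This is a sketch; the function names are illustrative.

```python
import itertools
import math

def abs_moment_rademacher(b, p):
    """E|a_1 b_1 + ... + a_n b_n|^p for independent Rademacher signs a_i,
    computed exactly by enumerating all 2^n sign patterns."""
    n = len(b)
    total = 0.0
    for signs in itertools.product((-1.0, 1.0), repeat=n):
        s = sum(a * bi for a, bi in zip(signs, b))
        total += abs(s) ** p
    return total / 2 ** n

def abs_moment_normal(p):
    """E|Z|^p for standard normal Z: 2^(p/2) * Gamma((p+1)/2) / sqrt(pi)."""
    return 2 ** (p / 2.0) * math.gamma((p + 1) / 2.0) / math.sqrt(math.pi)

b = [0.6, 0.8]                       # satisfies b_1^2 + b_2^2 = 1
lhs = abs_moment_rademacher(b, 3)    # (1.4^3 + 0.2^3) / 2 = 1.376
rhs = abs_moment_normal(3)           # 2^(3/2) / sqrt(pi), about 1.5958
```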
If ''f''(''x'') = ''e''<sup>''λx''</sup> with ''λ'' ≥ 0 then


:<math> P( a_1 b_1 + \cdots + a_n b_n \ge x ) \le \inf_{ \lambda \ge 0 } \left[ \frac{ E ( e^{ \lambda Z } ) }{ e^{ \lambda x } } \right] = e^{ -x^2 / 2 } </math>


where ''inf'' is the [[infimum]].<ref name=Hoeffding1963>Hoeffding W (1963) Probability inequalities for sums of bounded random variables. J Amer Statist Assoc 58: 13–30 MR144363</ref>
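The closed form on the right follows from the normal [[moment-generating function]]: since ''E''(''e''<sup>''λZ''</sup>) = ''e''<sup>''λ''²/2</sup>,

: <math> \frac{ E ( e^{ \lambda Z } ) }{ e^{ \lambda x } } = e^{ \lambda^2 / 2 - \lambda x } , </math>

which is minimised at ''λ'' = ''x'', giving the value ''e''<sup>−''x''²/2</sup>.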


Let

:<math> S_n = a_1 b_1 + \cdots + a_n b_n </math>


Then<ref name=Pinelis1994a>Pinelis I (1994) Optimum bounds for the distributions of martingales in Banach spaces. Ann Probab 22(4):1679–1706</ref>

:<math> P( S_n \ge x ) \le \frac{ 2e^3 }{ 9 } P( Z \ge x ) </math>

The constant in the last inequality is approximately 4.4634.


An alternative bound is also known:<ref name=delaPena2009>de la Pena, VH, Lai TL, Shao Q (2009) Self normalized processes. Springer-Verlag, New York</ref>

:<math> P( S_n \ge x ) \le e^{ -x^2 / 2 } </math>

This last bound is related to [[Hoeffding's inequality]].
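The two tail bounds are complementary, which can be checked numerically using the standard normal tail ''P''(''Z'' ≥ ''x'') = erfc(''x''/√2)/2. This is a sketch; the function names are illustrative.

```python
import math

def normal_tail(x):
    """P(Z >= x) for standard normal Z, via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def pinelis_tail_bound(x):
    """(2 e^3 / 9) * P(Z >= x); the constant is about 4.4634."""
    return (2.0 * math.e ** 3 / 9.0) * normal_tail(x)

def subgaussian_bound(x):
    """The alternative bound exp(-x^2 / 2)."""
    return math.exp(-x * x / 2.0)
```

For moderate ''x'' the ''e''<sup>−''x''²/2</sup> bound is smaller, while for large ''x'' the Gaussian-tail bound wins, since the normal tail decays faster than ''e''<sup>−''x''²/2</sup>.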


In the uniform case, where all the ''b''<sub>''i''</sub> = ''n''<sup>−1/2</sup>, the maximum value of ''S''<sub>''n''</sub> is ''n''<sup>1/2</sup>. In this case van Zuijlen has shown that<ref name=vanZuijlen2011>van Zuijlen Martien CA (2011) On a conjecture concerning the sum of independent Rademacher random variables. https://arxiv.org/abs/1112.4988</ref>

: <math> P( | S_n - \mu | \le \sigma ) \ge 0.5 \, </math>

where ''μ'' is the [[mean]] and ''σ'' is the [[standard deviation]] of the sum.
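In this uniform case the probability that the sum lies within one standard deviation of its mean can be computed exactly from the binomial distribution, confirming the bound for small ''n''. This is a sketch; the function name is illustrative.

```python
from math import comb, sqrt

def prob_within_one_sd(n):
    """P(|S_n| <= 1) for S_n = (a_1 + ... + a_n)/sqrt(n) with Rademacher a_i.
    If k of the n signs are +1, then S_n = (2k - n)/sqrt(n), so we sum the
    binomial weights of all k with |2k - n| <= sqrt(n)."""
    hits = sum(comb(n, k) for k in range(n + 1) if abs(2 * k - n) <= sqrt(n))
    return hits / 2 ** n
```

Here ''μ'' = 0 and ''σ'' = 1 for ''S''<sub>''n''</sub>, since the ''b''<sub>''i''</sub> satisfy ∑ ''b''<sub>''i''</sub>² = 1.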


==References==
{{reflist}}

Latest revision as of 10:33, 19 September 2021