In classical statistical decision theory, where we are faced with the problem of estimating a deterministic parameter (vector) <math>\mathbf{x}</math> from observations <math>y</math>, an estimator (estimation rule) <math>\delta^M</math> is called minimax if its maximal risk is minimal among all estimators of <math>\mathbf{x}</math>. In a sense this is an estimator which performs best in the worst possible case allowed in the problem.
Problem Definition
Consider the problem of estimating a deterministic (not Bayesian) parameter <math>\mathbf{x}</math> belonging to some set <math>\Theta</math>, based on observations <math>y</math>. An estimator <math>\delta^M(y)</math> is called minimax (or minmax) with respect to a risk function <math>R(\mathbf{x},\delta)</math>, usually specified as the integral of a loss function, if its maximal risk is smallest among all estimators. In this framework, <math>\delta^M</math> is called minimax if it satisfies
- <math>\sup_{\mathbf{x} \in \Theta} R(\mathbf{x},\delta^M) = \inf_{\delta} \sup_{\mathbf{x} \in \Theta} R(\mathbf{x},\delta).</math>
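The minimax criterion can be checked numerically in simple cases. The following sketch (an illustration, not part of the article; the choice n = 10 and the parameter grid are assumptions) estimates a binomial proportion under squared-error loss. It compares the worst-case risk of the maximum-likelihood rule x/n with that of the classical minimax rule (x + √n/2)/(n + √n), whose risk is constant over the parameter set.

```python
from math import comb, sqrt

n = 10  # number of Bernoulli trials (illustrative choice)

def risk(theta, estimator):
    """Squared-error risk R(theta, delta) = E[(delta(X) - theta)^2],
    X ~ Binomial(n, theta), computed exactly by summing over all outcomes."""
    return sum(comb(n, x) * theta**x * (1 - theta)**(n - x)
               * (estimator(x) - theta)**2
               for x in range(n + 1))

mle = lambda x: x / n                                   # maximum-likelihood rule
minimax = lambda x: (x + sqrt(n) / 2) / (n + sqrt(n))   # classical minimax rule

grid = [i / 200 for i in range(201)]                    # grid over Theta = [0, 1]
sup_risk_mle = max(risk(t, mle) for t in grid)          # attained at theta = 1/2
sup_risk_minimax = max(risk(t, minimax) for t in grid)  # constant in theta

print(f"sup risk, MLE:     {sup_risk_mle:.5f}")
print(f"sup risk, minimax: {sup_risk_minimax:.5f}")
```

The maximal risk of the MLE is 1/(4n), while the minimax rule trades a little extra risk near the endpoints of [0, 1] for a strictly smaller worst case.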
An alternative criterion in the decision theoretic framework is the Bayes estimator in the presence of a prior distribution <math>\pi</math>. An estimator <math>\delta</math> is Bayes if it minimizes the average risk
- <math>r(\pi,\delta) = \mathbb{E}_\pi\{R(\mathbf{x},\delta)\} = \int_\Theta R(\mathbf{x},\delta)\,d\pi(\mathbf{x}).</math>
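The average-risk criterion can also be illustrated numerically. In the binomial setting above, with squared-error loss, the Bayes estimator is the posterior mean; the sketch below (an illustration with an assumed Beta(√n/2, √n/2) prior, n = 10, and a midpoint-rule integration) approximates the average risk of the posterior mean and of the MLE, showing that the Bayes rule attains the smaller average risk.

```python
from math import comb, sqrt, gamma

n = 10
a = b = sqrt(n) / 2                    # Beta(a, b) prior (illustrative choice)

def risk(theta, estimator):
    """Squared-error risk under X ~ Binomial(n, theta)."""
    return sum(comb(n, x) * theta**x * (1 - theta)**(n - x)
               * (estimator(x) - theta)**2 for x in range(n + 1))

def prior_pdf(t):
    """Beta(a, b) density on (0, 1)."""
    const = gamma(a + b) / (gamma(a) * gamma(b))
    return const * t**(a - 1) * (1 - t)**(b - 1)

def average_risk(estimator, steps=2000):
    """Midpoint-rule approximation of r(pi, delta) = int R(t, delta) pi(t) dt."""
    h = 1.0 / steps
    return sum(risk((i + 0.5) * h, estimator) * prior_pdf((i + 0.5) * h) * h
               for i in range(steps))

posterior_mean = lambda x: (x + a) / (n + a + b)  # Bayes rule under squared error
mle = lambda x: x / n

print(f"average risk, posterior mean: {average_risk(posterior_mean):.5f}")
print(f"average risk, MLE:            {average_risk(mle):.5f}")
```

For this particular prior the posterior mean coincides with the minimax rule (x + √n/2)/(n + √n), a numerical hint at the general connection between minimax estimators and Bayes estimators under a least favorable prior.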