Isotropic position

{{Short description|Theory}}
In the fields of [[machine learning]], the [[theory of computation]], and [[random matrix theory]], a probability distribution over vectors is said to be in '''isotropic position''' if its [[covariance matrix]] is equal to the [[identity matrix]].
Revision as of 09:33, 10 July 2022

== Formal definitions ==
Let <math>D</math> be a distribution over vectors in the vector space <math>\mathbb{R}^n</math>. Then <math>D</math> is in isotropic position if, for a vector <math>v</math> sampled from the distribution, <math>\operatorname{E}\, vv^\mathsf{T} = \mathrm{Id}.</math>
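As a concrete illustration (a minimal numpy sketch, not part of the article): a sample from any full-rank distribution can be placed into approximate isotropic position by applying <math>M^{-1/2}</math>, where <math>M</math> is its empirical second-moment matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw samples v = A z with z standard normal, so E[v v^T] = A A^T,
# which is far from the identity (an anisotropic distribution).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
X = rng.standard_normal((100_000, 2)) @ A.T  # one sample vector per row

# Empirical second-moment matrix E[v v^T].
M = X.T @ X / len(X)

# Apply M^{-1/2} (via the eigendecomposition of the symmetric matrix M):
# the transformed vectors are in isotropic position by construction.
eigvals, eigvecs = np.linalg.eigh(M)
M_inv_sqrt = eigvecs @ np.diag(eigvals ** -0.5) @ eigvecs.T
Y = X @ M_inv_sqrt.T

M_iso = Y.T @ Y / len(Y)
print(np.round(M_iso, 3))  # ≈ identity matrix
```

Since <math>M^{-1/2}</math> is symmetric, the transformed second moment is exactly <math>M^{-1/2} M M^{-1/2} = \mathrm{Id}</math> up to floating-point error.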
A set of vectors is said to be in isotropic position if the uniform distribution over that set is in isotropic position. In particular, an orthonormal basis of <math>\mathbb{R}^n</math> is isotropic up to normalization: the uniform distribution over its <math>n</math> vectors satisfies <math>\operatorname{E}\, vv^\mathsf{T} = \tfrac{1}{n}\,\mathrm{Id}</math>.
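A small numpy check of the orthonormal case (an illustrative sketch, not from the article): for an orthonormal basis, the sum of the outer products <math>vv^\mathsf{T}</math> is exactly the identity, so the uniform average over the set is <math>\mathrm{Id}/n</math>.

```python
import numpy as np

# An orthonormal set: the standard basis of R^3.
basis = np.eye(3)

# The sum of outer products v v^T over an orthonormal basis is the identity.
S = sum(np.outer(v, v) for v in basis)
print(S)  # identity matrix

# Under the uniform distribution over the set, E[v v^T] = S / n = Id / n,
# i.e. the basis is isotropic up to the normalization factor n.
M = S / len(basis)
print(M)  # Id / 3
```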
As a related definition, a convex body <math>K</math> in <math>\mathbb{R}^n</math> is called isotropic if it has volume <math>|K| = 1</math>, center of mass at the origin, and there is a constant <math>\alpha > 0</math> such that <math>\int_K \langle x, y \rangle^2 \, dx = \alpha^2 |y|^2</math> for all vectors <math>y</math> in <math>\mathbb{R}^n</math>; here <math>|\cdot|</math> stands for the standard Euclidean norm.
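The constant <math>\alpha</math> can be checked numerically for the simplest isotropic body, the unit-volume cube <math>[-\tfrac{1}{2}, \tfrac{1}{2}]^n</math> (a Monte Carlo sketch, not from the article): since each coordinate is uniform on <math>[-\tfrac{1}{2}, \tfrac{1}{2}]</math> with <math>\operatorname{E}[x_i^2] = \tfrac{1}{12}</math>, the integral equals <math>|y|^2/12</math>, i.e. <math>\alpha^2 = \tfrac{1}{12}</math>.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3

# Uniform samples from the unit-volume cube [-1/2, 1/2]^n,
# which is centered at the origin.
x = rng.uniform(-0.5, 0.5, size=(1_000_000, n))

# Monte Carlo estimate of the integral of <x, y>^2 over the cube,
# divided by |y|^2; for this body it should be alpha^2 = 1/12 for every y.
for y in (np.array([1.0, 0.0, 0.0]), np.array([1.0, 2.0, 2.0])):
    estimate = np.mean((x @ y) ** 2)
    print(estimate / np.dot(y, y))  # ≈ 1/12 ≈ 0.0833
```

The key point the check illustrates is that the ratio is the same constant for every direction <math>y</math>, which is exactly what the definition requires.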
== See also ==
== References ==
* Rudelson, M. (1999). "Random Vectors in the Isotropic Position". ''Journal of Functional Analysis''. 164 (1): 60–72. arXiv:math/9608208. doi:10.1006/jfan.1998.3384.