Partial least squares regression
Partial least squares regression (PLS regression) is a statistical method that bears some relation to principal components regression; instead of finding hyperplanes of maximum variance between the response and independent variables, it finds a linear regression model by projecting the predicted variables and the observable variables to a new space. Because both the X and Y data are projected to new spaces, the PLS family of methods are known as bilinear factor models. Partial least squares discriminant analysis (PLS-DA) is a variant used when the Y is binary.
PLS is used to find the fundamental relations between two matrices (X and Y), i.e. a latent variable approach to modeling the covariance structures in these two spaces. A PLS model will try to find the multidimensional direction in the X space that explains the maximum multidimensional variance direction in the Y space. PLS regression is particularly suited when the matrix of predictors has more variables than observations, and when there is multicollinearity among X values. By contrast, standard regression will fail in these cases.
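As a minimal illustration of this point, the sketch below fits a PLS regression with more predictors than observations, a case in which ordinary least squares has no unique solution. It uses scikit-learn's PLSRegression, an implementation not named in this article; the synthetic data and the choice of two components are illustrative assumptions.

```python
# A minimal sketch (illustrative, not from the article): PLS regression
# where the predictor matrix has more variables (100) than observations
# (20), a case in which ordinary least squares has no unique solution.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n, m = 20, 100
latent = rng.normal(size=(n, 2))                  # two hidden factors drive the data
X = latent @ rng.normal(size=(2, m)) + 0.1 * rng.normal(size=(n, m))
y = latent @ np.array([1.0, -2.0]) + 0.1 * rng.normal(size=n)

pls = PLSRegression(n_components=2).fit(X, y)     # project onto 2 latent components
print("R^2 on training data:", pls.score(X, y))   # well defined despite m > n
```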
The PLS algorithm is employed in PLS path modelling,[1][2] a method of modeling a "causal" network of latent variables; the word 'causal' has to be put in quotes because causes obviously cannot be determined without experimental or quasi-experimental methods. This technique is a form of structural equation modeling, distinguished from the classical method by being component-based rather than covariance-based.[3]
Partial least squares was introduced by the Swedish statistician Herman Wold, who then developed it with his son, Svante Wold. An alternative term for PLS (and more correct according to Svante Wold[4]) is projection to latent structures, but the term partial least squares is still dominant in many areas. Although the original applications were in the social sciences, PLS regression is today most widely used in chemometrics and related areas. It is also used in bioinformatics, sensometrics, neuroscience and anthropology. In contrast, PLS path modeling is most often used in social sciences, econometrics, marketing and strategic management.
Underlying model
The general underlying model of multivariate PLS is

$$X = T P^\mathrm{T} + E$$
$$Y = U Q^\mathrm{T} + F$$

where $X$ is an $n \times m$ matrix of predictors, $Y$ is an $n \times p$ matrix of responses; $T$ and $U$ are $n \times \ell$ matrices that are, respectively, projections of $X$ (the $X$ score, component or factor matrix) and projections of $Y$ (the $Y$ scores); $P$ and $Q$ are, respectively, $m \times \ell$ and $p \times \ell$ orthogonal loading matrices; and matrices $E$ and $F$ are the error terms, assumed to be i.i.d. normal. The decompositions of $X$ and $Y$ are made so as to maximise the covariance of $T$ and $U$.
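To make the roles of these matrices concrete, the following hedged sketch fits a PLS model with scikit-learn (an implementation choice, not part of the original exposition) and reads off the fitted score and loading matrices; the attributes x_scores_, y_scores_, x_loadings_ and y_loadings_ correspond to $T$, $U$, $P$ and $Q$, and the synthetic data is an illustrative assumption.

```python
# A sketch of the bilinear decomposition X ~ T P^T, Y ~ U Q^T using
# scikit-learn's PLSRegression on synthetic data (illustrative only).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 10))                                            # n=50, m=10
Y = X[:, :3] @ rng.normal(size=(3, 2)) + 0.1 * rng.normal(size=(50, 2))  # p=2

pls = PLSRegression(n_components=3, scale=False).fit(X, Y)  # scale=False: center only
T, U = pls.x_scores_, pls.y_scores_        # n x l score matrices
P, Q = pls.x_loadings_, pls.y_loadings_    # m x l and p x l loading matrices

# E is the part of the (centered) X not captured by the 3 components;
# the relative residual shrinks as more components are added.
Xc = X - X.mean(axis=0)
print(np.linalg.norm(Xc - T @ P.T) / np.linalg.norm(Xc))
```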
Algorithms
A number of variants of PLS exist for estimating the factor and loading matrices $T$, $U$, $P$ and $Q$. Most of them construct estimates of the linear regression between $X$ and $Y$ as $Y = X \tilde{B} + \tilde{B}_0$. Some PLS algorithms are only appropriate for the case where $Y$ is a column vector, while others deal with the general case of a matrix $Y$. Algorithms also differ on whether they estimate the factor matrix $T$ as an orthogonal, an orthonormal matrix or not.[5][6][7][8][9][10] The final prediction will be the same for all these varieties of PLS, but the components will differ.
PLS1
PLS1 is a widely used algorithm appropriate for the case where $Y$ is a vector. It estimates $T$ as an orthonormal matrix. In pseudocode it may be expressed as:
function PLS1($X$, $y$, $\ell$)
    $X^{(0)} \leftarrow X$
    $w^{(0)} \leftarrow X^\mathrm{T} y / \| X^\mathrm{T} y \|$, an initial estimate of $w$
    $t^{(0)} \leftarrow X w^{(0)}$
    for $k = 0$ to $\ell - 1$
        $t_k \leftarrow {t^{(k)}}^\mathrm{T} t^{(k)}$ (note this is a scalar)
        $t^{(k)} \leftarrow t^{(k)} / t_k$
        $p^{(k)} \leftarrow {X^{(k)}}^\mathrm{T} t^{(k)}$
        $q_k \leftarrow y^\mathrm{T} t^{(k)}$ (note this is a scalar)
        if $q_k = 0$
            $\ell \leftarrow k$, break the for loop
        if $k < \ell - 1$
            $X^{(k+1)} \leftarrow X^{(k)} - t_k t^{(k)} {p^{(k)}}^\mathrm{T}$
            $w^{(k+1)} \leftarrow {X^{(k+1)}}^\mathrm{T} y$
            $t^{(k+1)} \leftarrow X^{(k+1)} w^{(k+1)}$
    end for
    define $W$ to be the matrix with columns $w^{(0)}, \ldots, w^{(\ell-1)}$; similarly define the matrix $P$ and the vector $q$
    $B \leftarrow W (P^\mathrm{T} W)^{-1} q$
    $B_0 \leftarrow q_0 - {p^{(0)}}^\mathrm{T} B$
    return $B$, $B_0$
This form of the algorithm does not require centering of the input $X$ and $y$, as this is performed implicitly by the algorithm. This algorithm features 'deflation' of the matrix $X$ (subtraction of $t_k t^{(k)} {p^{(k)}}^\mathrm{T}$), but deflation of the vector $y$ is not performed, as it is not necessary (it can be proved that deflating $y$ yields the same results as not deflating). The user-supplied variable $\ell$ is the limit on the number of latent factors in the regression; if it equals the rank of the matrix $X$, the algorithm will yield the least squares regression estimates for $B$ and $B_0$.
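The following is a direct NumPy transcription of the pseudocode above, offered as a sketch rather than a vetted implementation; the function name pls1 and the handling of the early-exit edge case are my own choices.

```python
# A NumPy transcription of the PLS1 pseudocode above (a sketch, not a
# vetted implementation). X: n x m predictors, y: length-n response,
# l: user-supplied limit on the number of latent factors. Assumes the
# first component is informative (q_0 != 0).
import numpy as np

def pls1(X, y, l):
    Xk = X.copy()                      # X^(0)
    w = X.T @ y
    w = w / np.linalg.norm(w)          # w^(0), initial estimate of w
    t = X @ w                          # t^(0)
    W, P, q = [], [], []
    for k in range(l):
        tk = t @ t                     # scalar
        t = t / tk
        p = Xk.T @ t
        qk = y @ t                     # scalar
        if qk == 0:                    # no covariance with y left; the
            break                      # list lengths play the role of l <- k
        W.append(w); P.append(p); q.append(qk)
        if k < l - 1:
            Xk = Xk - tk * np.outer(t, p)   # deflate X
            w = Xk.T @ y
            t = Xk @ w
    W, P, q = np.column_stack(W), np.column_stack(P), np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)
    B0 = q[0] - P[:, 0] @ B            # q_0 - p^(0)T B
    return B, B0
```

Given fitted $B$ and $B_0$, predictions then take the form $\hat{y} = X B + B_0$.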
Extensions
In 2002 a new preprocessing method was published called orthogonal projections to latent structures (OPLS). In OPLS, systematic variation in the X matrix that is not correlated to Y is removed. This improves the interpretability (but not the predictive performance) of PLS models.[11] L-PLS extends PLS regression to three connected data blocks.[12]
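As a rough sketch of the OPLS idea under simplifying assumptions (a single centered response and one orthogonal component; this is a compressed reading of the method in [11], not the full algorithm, and the function name is mine):

```python
# A rough sketch of one OPLS-style filtering step: find a component of X
# whose score carries no information about y and subtract it from X.
# Simplifying assumptions: X and y centered, one orthogonal component.
import numpy as np

def remove_one_orthogonal_component(X, y):
    w = X.T @ y
    w = w / np.linalg.norm(w)          # predictive weight direction
    t = X @ w                          # predictive score
    p = X.T @ t / (t @ t)              # loading of X on that score
    w_o = p - (w @ p) * w              # part of the loading orthogonal to w
    w_o = w_o / np.linalg.norm(w_o)
    t_o = X @ w_o                      # Y-orthogonal score
    p_o = X.T @ t_o / (t_o @ t_o)
    return X - np.outer(t_o, p_o)      # filtered X, to be passed to ordinary PLS
```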
Software implementation
Most major statistical software packages offer PLS regression, including SAS, JMP, Minitab, SPSS and Statistica. The following PLS path modeling software tools implement PLS regression or variations of it: PLS-Graph, SmartPLS, WarpPLS and XLSTAT. There is a PLS regression module in XLSTAT, an add-in for Microsoft Excel. SIMCA-P+ from Umetrics and The Unscrambler are popular amongst chemometricians and sensometricians. PLS regression routines are available for MATLAB, the popular open source language R and open source data mining software tools Weka and Orange.
References
- ^ Tenenhaus, M., Esposito Vinzi, V., Chatelin, Y-M., Lauro, C. (2005), PLS path modeling, Computational Statistics & Data Analysis, 48, 159–205 http://www.stat.uni-muenchen.de/institut/ag/leisch/teaching/msl0910/PLS_path_modeling.pdf
- ^ Vinzi, V., Chin, W.W., Henseler, J., Wang, H. (eds) (2010). Handbook of Partial Least Squares. ISBN 978-3-540-32825-4
- ^ Tenenhaus, M. (2008), Component-based structural equation modelling http://www.hec.edu/var/fre/storage/original/application/888adcb77e8c8378551e689551ecc0a1.pdf
- ^ Wold, S, Sjöström, M., Eriksson, L. (2001). PLS-regression: a basic tool of chemometrics, Chemometrics and Intelligent Laboratory Systems, 58, 109–130.
- ^ Lindgren F, Geladi P, Wold S (1993) The kernel algorithm for PLS. J. Chemometrics 7:45–59
- ^ de Jong, S. and ter Braak, C. J. F. (1994) Comments on the PLS kernel algorithm. J. Chemometrics 8:169–174.
- ^ Dayal, B.S. & MacGregor, J.F. (1997) Improved PLS algorithms. J. Chemometrics 11:73–85.
- ^ de Jong, S. (1993) SIMPLS: an alternative approach to partial least squares regression. Chemometrics and Intelligent Laboratory Systems, 18:251–263
- ^ Rannar, S., Lindgren, F., Geladi, P. and Wold, S. (1994) A PLS Kernel Algorithm for Data Sets with Many Variables and Fewer Objects. Part 1: Theory and Algorithm. Journal of Chemometrics 8:111–125
- ^ Abdi, H. (2010). "Partial least squares regression and projection on latent structure regression (PLS-Regression)". Wiley Interdisciplinary Reviews: Computational Statistics, 2, 97–106 http://dx.doi.org/10.1002/wics.51
- ^ Trygg, J; Wold, S (2002). "Orthogonal Projections to Latent Structures". Journal of Chemometrics. 16 (3): 119–128. doi:10.1002/cem.695.
- ^ Sæbø, S., Almøy, T., Flatberg, A., Aastveit, A.H. and Martens, H. (2008), LPLS-regression: a method for prediction and classification under the influence of background information on predictor variables, Chemometrics and Intelligent Laboratory Systems, 91 (2), 121–132
Further reading
- R. Kramer, Chemometric Techniques for Quantitative Analysis, (1998). Marcel-Dekker, ISBN 0-8247-0198-4.
- Frank, Ildiko and Jerome Friedman (1993). "A Statistical View of Some Chemometrics Regression Tools", Technometrics, 35(2), pp 109–148.
- Haenlein, Michael and Andreas M. Kaplan (2004). "A Beginner's Guide to Partial Least Squares Analysis", Understanding Statistics, 3(4), 283–297.
- Henseler, Joerg and Georg Fassott (2005). "Testing Moderating Effects in PLS Path Models. An Illustration of Available Procedures".
- Lingjærde, Ole-Christian and Nils Christophersen (2000). "Shrinkage Structure of Partial Least Squares", Scandinavian Journal of Statistics, 27(3), pp 459–473.
- Tenenhaus, Michel (1998). La Régression PLS: Théorie et Pratique. Paris: Technip.
- Rosipal, Roman and Nicole Kramer (2006). "Overview and Recent Advances in Partial Least Squares", in Subspace, Latent Structure and Feature Selection Techniques, pp 34–51.
- Helland, Inge S. (1990). "PLS regression and statistical models", Scandinavian Journal of Statistics, 17, 97–114.
- Wold, Herman. (1966). Estimation of principal components and related models by iterative least squares. In P.R. Krishnaiah (Ed.), Multivariate Analysis (pp. 391–420). New York: Academic Press.
- Wold, Herman. (1981). The fix-point approach to interdependent systems. Amsterdam: North Holland.
- Wold, Herman. (1985). Partial least squares, pp. 581–591 in Samuel Kotz and Norman L. Johnson, eds., Encyclopedia of statistical sciences, Vol. 6, New York: Wiley, 1985.
- Svante Wold, Axel Ruhe, Herman Wold, and W.J. Dunn. "The collinearity problem in linear regression. The partial least squares (PLS) approach to generalized inverses." SIAM J. Sci. Stat. Comp., 5:735–743, 1984.
- Garthwaite, Paul H. (1994) "An Interpretation of Partial Least Squares", Journal of the American Statistical Association, 89, 122–127 JSTOR 2291207
- Stone, M. and Brooks, R. J. (1990) "Continuum Regression: Cross-Validated Sequentially Constructed Prediction embracing Ordinary Least Squares, Partial Least Squares and Principal Components Regression", Journal of the Royal Statistical Society, Series B, 52 (2), 237–269 JSTOR 2345437
External links
- imDEV free Excel add-in for PLS and PLS-DA
- PLS and regression tutorial
- PLS in Brain Imaging
- on-line PLS regression (PLSR) at Virtual Computational Chemistry Laboratory
- Uncertainty estimation for PLS
- A short introduction to PLS regression and its history