Nonlinear regression
Nonlinear regression is a form of regression analysis in which observational data are modeled by a function which is a nonlinear combination of the model parameters and depends on one or more independent variables. The data are fitted by a method of successive approximations.
General
The data consist of m values taken from observations (the dependent or response variable), y. The dependent variable is subject to error. This error is assumed to be a random variable, with a mean of zero. Systematic error may be present, but its treatment is outside the scope of regression analysis. The independent variable (explanatory variable), x, is error-free. If this is not so, modeling should be done using errors-in-variables model techniques. The independent variables are also called regressors, exogenous variables, input variables and predictor variables. A nonlinear model is one in which the calculated value, $f(x, \boldsymbol\beta)$, is a nonlinear function of the parameters, $\boldsymbol\beta$. For example, the Michaelis–Menten model for enzyme kinetics

$$v = \frac{V_{\max}\,[\mathrm{S}]}{K_m + [\mathrm{S}]}$$

can be written as

$$f(x, \boldsymbol\beta) = \frac{\beta_1 x}{\beta_2 + x},$$

where $\beta_1$ is the parameter $V_{\max}$, $\beta_2$ is the parameter $K_m$ and $[\mathrm{S}]$ is the independent variable, $x$. This function is nonlinear because the parameters do not occur as a linear combination. Other examples of nonlinear functions include exponential functions, logarithmic functions, trigonometric functions, power functions, Gaussian functions, and Lorentzian curves.
In general, there is no closed-form expression for the best-fitting parameters, as there is in linear regression. Usually numerical optimization algorithms are applied to determine the best-fitting parameters. Again in contrast to linear regression, there may be many local minima of the function to be optimized. In practice, initial guesses for the parameters are used, in conjunction with the optimization algorithm, to attempt to find the global minimum of a sum of squares.
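As a concrete illustration, the Michaelis–Menten model above can be fitted by iterative least squares in a few lines of Python. This is a minimal sketch, assuming NumPy and SciPy are available; the data values and starting guesses are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

# Michaelis-Menten model: f(x, beta) = beta1 * x / (beta2 + x),
# where beta1 plays the role of Vmax and beta2 the role of Km.
def michaelis_menten(x, beta1, beta2):
    return beta1 * x / (beta2 + x)

# Invented substrate concentrations [S] and measured rates v.
S = np.array([0.038, 0.194, 0.425, 0.626, 1.253, 2.500, 3.740])
v = np.array([0.050, 0.127, 0.094, 0.212, 0.273, 0.267, 0.332])

# Starting guesses for (beta1, beta2); the optimizer refines them by
# successive approximations (Levenberg-Marquardt by default).
popt, pcov = curve_fit(michaelis_menten, S, v, p0=[0.4, 0.5])
print("Vmax = %.3f, Km = %.3f" % tuple(popt))
```

Because the sum of squares may have several local minima, it is common to rerun such a fit from several different starting points and keep the best result.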
For details concerning nonlinear data modeling, see least squares and non-linear least squares.
Regression statistics
The assumption underlying this procedure is that the model can be approximated by a linear function, namely a first-order Taylor series:

$$f(x_i, \boldsymbol\beta) \approx f(x_i, \boldsymbol\beta^0) + \sum_j J_{ij}\,(\beta_j - \beta_j^0),$$

where $J_{ij} = \partial f(x_i, \boldsymbol\beta)/\partial \beta_j$ are the elements of the Jacobian matrix $\mathbf{J}$. It follows from this that the least squares estimators are given by

$$\hat{\boldsymbol\beta} \approx (\mathbf{J}^\mathsf{T}\mathbf{J})^{-1}\mathbf{J}^\mathsf{T}\mathbf{y}.$$
The regression statistics are computed and used as in linear regression, but with $\mathbf{J}$ in place of $\mathbf{X}$ in the formulas. The linear approximation introduces bias into the statistics. Therefore, more caution than usual is required in interpreting statistics derived from a nonlinear model.
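To make the role of $\mathbf{J}$ concrete, the sketch below (reusing the invented Michaelis–Menten data from the earlier sketch; the jacobian helper is hypothetical, not a library routine) approximates the Jacobian by finite differences at the fitted parameters and forms the linear-approximation covariance estimate $s^2(\mathbf{J}^\mathsf{T}\mathbf{J})^{-1}$.

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(x, beta1, beta2):
    return beta1 * x / (beta2 + x)

# Same invented data as in the earlier sketch.
S = np.array([0.038, 0.194, 0.425, 0.626, 1.253, 2.500, 3.740])
v = np.array([0.050, 0.127, 0.094, 0.212, 0.273, 0.267, 0.332])
popt, _ = curve_fit(michaelis_menten, S, v, p0=[0.4, 0.5])

def jacobian(f, x, beta, eps=1e-6):
    """Central-difference approximation of J[i, j] = df(x_i, beta)/dbeta_j."""
    beta = np.asarray(beta, dtype=float)
    J = np.empty((len(x), len(beta)))
    for j in range(len(beta)):
        step = np.zeros_like(beta)
        step[j] = eps
        J[:, j] = (f(x, *(beta + step)) - f(x, *(beta - step))) / (2 * eps)
    return J

J = jacobian(michaelis_menten, S, popt)   # J plays the role of X
res = v - michaelis_menten(S, *popt)      # residuals at the fitted parameters
s2 = res @ res / (len(S) - len(popt))     # residual variance estimate
cov = s2 * np.linalg.inv(J.T @ J)         # approximate covariance of the estimates
print("standard errors:", np.sqrt(np.diag(cov)))
```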
Ordinary and weighted least squares
The best-fit curve is often assumed to be that which minimizes the sum of squared residuals. This is the ordinary least squares (OLS) approach. However, in cases where the dependent variable does not have constant variance, a sum of weighted squared residuals may be minimized instead; see weighted least squares. Ideally, each weight should be equal to the reciprocal of the variance of the observation; the weights may also be recomputed on each iteration, in an iteratively reweighted least squares algorithm.
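A minimal sketch of both variants follows; the model, data and 5% error level are invented for illustration. Passing per-observation standard deviations to curve_fit via its sigma argument is equivalent to weighting each squared residual by the reciprocal of that observation's variance.

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(b * x)

rng = np.random.default_rng(0)
x = np.linspace(0.1, 2.0, 10)
# Multiplicative error, so the variance grows with the signal.
y = model(x, 2.0, 0.8) * (1 + 0.05 * rng.standard_normal(x.size))

# Weighted least squares: curve_fit minimizes sum(((y - model)/sigma)**2).
sigma = 0.05 * y
popt, pcov = curve_fit(model, x, y, p0=[1.0, 1.0], sigma=sigma,
                       absolute_sigma=True)

# Iteratively reweighted variant: recompute the weights from the
# current fitted values on each pass.
for _ in range(3):
    sigma = 0.05 * model(x, *popt)
    popt, pcov = curve_fit(model, x, y, p0=popt, sigma=sigma,
                           absolute_sigma=True)
print("a = %.3f, b = %.3f" % tuple(popt))
```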
Linearization
Some nonlinear regression problems can be linearized by a suitable transformation of the model formulation.
For example, consider the nonlinear regression problem (ignoring the error):

$$y = a e^{b x}.$$

If we take a logarithm of both sides, it becomes

$$\ln(y) = \ln(a) + b x,$$
suggesting estimation of the unknown parameters by a linear regression of ln(y) on x, a computation that does not require iterative optimization. However, use of a linearization requires caution: the influences of the data values will change, as will the error structure of the model and the interpretation of any inferential results. These may not be desired effects. On the other hand, depending on what the largest source of error is, linearization may distribute the errors in a normal fashion, so the choice to perform a linearization must be informed by modeling considerations.
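As a sketch of the transformed computation (the data are invented to roughly follow $y = a e^{b x}$ with small multiplicative error), the fit reduces to an ordinary linear regression of ln(y) on x, e.g. with numpy.polyfit:

```python
import numpy as np

# Invented data roughly following y = a * exp(b * x) with a = 1, b = 1.
x = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
y = np.array([1.61, 2.78, 4.26, 7.57, 12.10, 19.80])

# Linear regression of ln(y) on x: the slope estimates b,
# the intercept estimates ln(a).
b, ln_a = np.polyfit(x, np.log(y), 1)
a = np.exp(ln_a)
print("a = %.3f, b = %.3f" % (a, b))
```

Note that the log transform changes the influence of each observation in exactly the way described above, so these estimates will generally differ from those of a direct nonlinear fit.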
For Michaelis–Menten kinetics, the linear Lineweaver–Burk plot

$$\frac{1}{v} = \frac{1}{V_{\max}} + \frac{K_m}{V_{\max}\,[\mathrm{S}]}$$

of $1/v$ against $1/[\mathrm{S}]$ has been much used. However, since it is very sensitive to data error and strongly biased toward fitting the data in a particular range of the independent variable, $[\mathrm{S}]$, its use is strongly discouraged.
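The distortion is easy to demonstrate by simulation (all values below are invented). With a small additive error, the double-reciprocal fit and a direct nonlinear fit can give noticeably different estimates, because taking reciprocals lets the smallest rates dominate the regression:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
Vmax, Km = 1.0, 0.5   # invented "true" parameters
S = np.array([0.05, 0.1, 0.25, 0.5, 1.0, 2.0, 4.0])
v = Vmax * S / (Km + S) + 0.02 * rng.standard_normal(S.size)

# Lineweaver-Burk: regress 1/v on 1/[S]; slope = Km/Vmax, intercept = 1/Vmax.
slope, intercept = np.polyfit(1 / S, 1 / v, 1)
print("Lineweaver-Burk: Vmax = %.3f, Km = %.3f"
      % (1 / intercept, slope / intercept))

# Direct nonlinear least squares on the original scale.
popt, _ = curve_fit(lambda s, vm, km: vm * s / (km + s), S, v, p0=[1.0, 1.0])
print("Nonlinear fit:   Vmax = %.3f, Km = %.3f" % tuple(popt))
```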
"Linearization" as used here is not to be confused with the local linearization involved in standard algorithms such as the Gauss-Newton algorithm. Similarly, the methodology of generalized linear models does not involve linearization for parameter estimation.
External links
- NLINLS, Nonlinear least squares by differential evolution method of global optimization: a Fortran program
- ISAT, Nonlinear regression with explicit error control
- Zunzun.com, Online curve and surface fitting application
- NLREG, a proprietary program
- MATLAB Statistics Toolbox
- GeneXproTools - Software for Nonlinear regression
- simplemax.net, online optimization service
- xuru.org Online nonlinear regression and regression tools
- Matlab SUrrogate MOdeling Toolbox - SUMO Toolbox - Matlab code for Active Learning + Model Selection + Surrogate Model Regression