Distributed lag
In statistics, a distributed lag model is a model for time series data in which a regression equation is used to predict current values of a dependent variable based on both the current and lagged (past-period) values of an explanatory variable.[1]
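For example, with a finite lag length $q$, such a model can be written as

$$ y_t = \alpha + \beta_0 x_t + \beta_1 x_{t-1} + \cdots + \beta_q x_{t-q} + \varepsilon_t, $$

where the lag coefficients $\beta_0, \dots, \beta_q$ describe how the effect of a change in $x$ is distributed over subsequent periods.

As a minimal illustration (not part of the cited source), a finite distributed lag model of this kind can be estimated by ordinary least squares after constructing lagged copies of the explanatory variable. The sketch below assumes pandas and statsmodels, with simulated data, a lag length of two, and purely illustrative coefficient values.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated data: y depends on the current and two lagged values of x
# (the coefficients 0.5, 0.3, 0.1 are illustrative assumptions).
rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y = (1.0 + 0.5 * x + 0.3 * np.roll(x, 1) + 0.1 * np.roll(x, 2)
     + rng.normal(scale=0.2, size=n))

df = pd.DataFrame({"y": y, "x": x})
for k in (1, 2):
    df[f"x_lag{k}"] = df["x"].shift(k)  # lagged regressors x_{t-k}
df = df.dropna()                        # drop rows without a full lag history

# Ordinary least squares on the current and lagged explanatory variable
X = sm.add_constant(df[["x", "x_lag1", "x_lag2"]])
fit = sm.OLS(df["y"], X).fit()
print(fit.params)  # estimates of alpha, beta_0, beta_1, beta_2
```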
References
- ^ Cromwell, Jeff B., et al. (1994). Multivariate Tests for Time Series Models. SAGE Publications. ISBN 0-8039-5440-9.