Distributed lag
In statistics, a distributed lag model is a regression model for time-series data in which the regression equation includes both the current value and the lagged (past-period) values of an explanatory variable.
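For illustration, a finite distributed lag model with n lags can be sketched as follows (the symbols used here are one common notation, not the only one):

y_t = a + w_0 x_t + w_1 x_{t-1} + w_2 x_{t-2} + \cdots + w_n x_{t-n} + \varepsilon_t,

where y_t is the dependent variable, x_t is the explanatory variable, w_0, \dots, w_n are the lag coefficients, and \varepsilon_t is an error term.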