
Distributed lag

From Wikipedia, the free encyclopedia


In statistics, a distributed lag model is a regression equation for time-series data that includes both the current value of an explanatory variable and lagged (past-period) values of that same variable.[1]
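
For illustration, a finite distributed lag model with n lagged periods can be written in its standard textbook form (this general notation is a common formulation, not taken from the cited reference) as

    y_t = \alpha + w_0 x_t + w_1 x_{t-1} + \cdots + w_n x_{t-n} + \varepsilon_t,

where y_t is the dependent variable, x_t is the explanatory variable, the lag weights w_0, …, w_n are coefficients to be estimated, and \varepsilon_t is an error term. The distributed lag structure thus allows a change in x to affect y over several subsequent periods rather than only contemporaneously.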

References

  1. ^ Cromwell, Jeff B., et al. (1994). Multivariate Tests for Time Series Models. SAGE Publications. ISBN 0-8039-5440-9.