Distributed lag

From Wikipedia, the free encyclopedia

In statistics, a distributed lag model is a regression equation, estimated from time-series data, in which the regressors include both the current value of an explanatory variable and its lagged (past-period) values.[1]
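
A minimal sketch of the general form such a model can take, with the notation (dependent variable y_t, explanatory variable x_t, lag weights w_i, lag length n, and error term ε_t) chosen here for illustration rather than taken from the cited source:

y_t = a + w_0 x_t + w_1 x_{t-1} + w_2 x_{t-2} + \cdots + w_n x_{t-n} + \varepsilon_t

Here each coefficient w_i measures the effect on y_t of the value the explanatory variable took i periods earlier.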

References

  1. ^ Cromwell, Jeff B., et al. (1994). Multivariate Tests for Time Series Models. SAGE Publications, Inc. ISBN 0-8039-5440-9.