Distributed lag

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by Addbot (talk | contribs) at 16:31, 11 March 2009 (Bot: Adding missing references section and tag (Report Errors)). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

In statistics, a distributed lag model explains a time series by a series of lags of the same variable. In general,

    y_t = a + w_1 y_{t-1} + w_2 y_{t-2} + … + w_n y_{t-n} + ε_t,

where y_t is the time series and ε_t is the error.[1]
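The model above can be estimated by ordinary least squares once the lagged values are arranged into a design matrix. The following is a minimal sketch, assuming the self-lagged form described above; the variable names (`n_lags`, `true_w`, and so on) and the simulated data are illustrative, not from the source.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a series whose true lag weights are known, following
# y_t = a + w_1*y_{t-1} + w_2*y_{t-2} + e_t.
true_w = [0.5, 0.3]          # weights on lags 1 and 2
a = 1.0                      # intercept
T = 2000
y = np.zeros(T)
for t in range(2, T):
    y[t] = a + true_w[0] * y[t - 1] + true_w[1] * y[t - 2] \
           + rng.normal(scale=0.1)

# Build the lagged design matrix: a column of ones for the intercept,
# then one column per lag (y_{t-1}, y_{t-2}).
n_lags = 2
Y = y[n_lags:]
X = np.column_stack(
    [np.ones(T - n_lags)]
    + [y[n_lags - k : T - k] for k in range(1, n_lags + 1)]
)

# Least-squares estimates of (a, w_1, w_2); with low noise these
# should land near the true values [1.0, 0.5, 0.3].
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(coef)
```

With a stationary series and a long enough sample, the recovered coefficients approximate the true intercept and lag weights; in practice one would also choose the number of lags by an information criterion rather than fixing it in advance.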

References

  1. ^ Jeff B. Cromwell et al. (1994). Multivariate Tests For Time Series Models. SAGE Publications, Inc. ISBN 0-8039-5440-9