Talk:Markov property
WikiProject Statistics (Unassessed)
WikiProject Mathematics (Stub-class, High-priority)
Brand new article
The old article called Markov property didn't even correctly define the property, confusing it with the process. That was a mess. So I created a new article. linas (talk) 22:29, 30 August 2008 (UTC)
Doubts
Does someone have a reference that does use the term "Markov property" in the sense supposedly used here? The reference to Feller doesn't do so, I think ... The chapter/section mentioned seems to be about Markov processes and, while "Markov property" is in the index, it points to a definition on other pages in connection with the "memoryless" property of the exponential distribution.
If the definition here is relevant to anything, what is the role of the index j? For a "Markov property" to hold, does the stated condition have to hold for every j, or only for one j? Presumably N_j must exclude at least one other member besides j, but is it enough for this to apply for only one j?
Melcombe (talk) 17:58, 22 January 2009 (UTC)
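For context, the kind of condition presumably at issue is the local Markov property used for Markov random fields, where <math>N_j</math> is read as the set of neighbours of the index <math>j</math> and the condition is required to hold for every <math>j</math>. This is a sketch of one standard formulation, not necessarily what the article intended:

<math>P(X_j = x_j \mid X_i = x_i,\ i \neq j) = P(X_j = x_j \mid X_i = x_i,\ i \in N_j).</math>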
Mangled sentence
- if a stochastic process of random variables determining a set of probabilities which can be factored in such a way that the Markov property is obtained, then [...]
The part between "if" and "then" is not a sentence. What was it intended to say? Michael Hardy (talk) 21:11, 21 October 2009 (UTC)
This article also says:
- In a broader sense, if a stochastic process of random variables determining a set of probabilities which can be factored in such a way that the Markov property is obtained, then that process is said to have the Markov-type property; this is defined in detail below.
But it's not defined below.
Again, what was intended here? Michael Hardy (talk) 21:15, 21 October 2009 (UTC)
I'm not a probabilist, but all my books on probability define a sequence of random variables as having the Markov property if <math>P(X_i=x|X_0=x_0,\ldots,X_{i-1}=x_{i-1})=P(X_i=x|X_{i-1}=x_{i-1})</math>, not going any further back. I can see why you might want to extend the definition to the one given here, but has anyone else found any citations for it?
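For illustration, here is a minimal numerical sketch of that standard first-order definition; the two-state chain and its transition matrix are made up for the example, and the point is only that conditioning on history beyond <math>X_{i-1}</math> leaves the (empirical) conditional distribution essentially unchanged:

<syntaxhighlight lang="python">
import numpy as np

# Sketch: simulate a two-state Markov chain and check empirically that
# P(X_i = 1 | X_{i-1} = 0, X_{i-2} = 1) is about the same as
# P(X_i = 1 | X_{i-1} = 0), i.e. the extra history does not matter.
# The transition matrix below is made up for this example.
rng = np.random.default_rng(0)
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])          # P[a, b] = Pr(X_i = b | X_{i-1} = a)

n = 200_000
x = np.empty(n, dtype=int)
x[0] = 0
for i in range(1, n):
    x[i] = rng.choice(2, p=P[x[i - 1]])

# Empirical Pr(X_i = 1 | X_{i-1} = 0), ignoring X_{i-2}
mask_prev = x[1:-1] == 0
p_given_prev = (x[2:][mask_prev] == 1).mean()

# Empirical Pr(X_i = 1 | X_{i-1} = 0, X_{i-2} = 1), conditioning on more history
mask_two = (x[1:-1] == 0) & (x[:-2] == 1)
p_given_two = (x[2:][mask_two] == 1).mean()

print(p_given_prev, p_given_two)   # both should be close to P[0, 1] = 0.1
</syntaxhighlight>

Both printed values should be close to 0.1; they would differ if the chain "remembered" <math>X_{i-2}</math> in addition to <math>X_{i-1}</math>.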