Answer:
False. See the explanation and counterexample below.
Step-by-step explanation:
For this case we need to find:
For [tex] X_n[/tex] to be a Markov chain we would need [tex] P(X_{n+1} =j | X_n =i, X_{n-1}=i') =P(X_{n+1}=j |X_n =i)[/tex] for all [tex] i,i',j[/tex]. If we prove this equality for every choice of [tex] i,i',j[/tex], then we have a Markov chain.
For example, if we take [tex] j=2, i=1, i'=0[/tex], then we have:
[tex]P(X_{n+1} =j | X_n =i, X_{n-1}=i') =\frac{1}{2}[/tex]
Because, from the given definition [tex] X_n = Y_n + Y_{n-1}[/tex], the event [tex] X_{n+1}=2, X_n=1, X_{n-1}=0[/tex] can only occur if:
[tex] Y_{n+1}=1 , Y_n= 1, Y_{n-1}=0, Y_{n-2}=0[/tex]
Indeed, with [tex] i'=0[/tex] we get [tex] Y_{n-1}=Y_{n-2}=0[/tex], and then [tex] i=1[/tex] forces [tex] Y_n =1[/tex].
Given these conditions, [tex] Y_{n+1}[/tex] equals 1 with probability 1/2 by the definition of the [tex] Y_n[/tex], so the conditional probability above is 1/2.
If we find a counterexample where this equality fails, we have proved that we don't have a Markov chain.
Now assume [tex] j=2, i=1, i'=2[/tex]. In this case [tex] X_{n-1}=2[/tex] forces [tex] Y_{n-1}=Y_{n-2}=1[/tex], and then [tex] X_n=1[/tex] forces [tex] Y_n =0[/tex].
But then [tex] X_{n+1}=Y_{n+1}+Y_n = Y_{n+1} \leq 1[/tex], so [tex] X_{n+1}\neq 2[/tex] and [tex] P(X_{n+1}=2| X_n =1, X_{n-1}=2)= 0[/tex]. This gives the counterexample:
[tex] P(X_{n+1} =2| X_n =1, X_{n-1}=0) = \frac{1}{2} \neq 0 = P(X_{n+1} =2| X_n =1, X_{n-1}=2)[/tex], so the transition probability depends on [tex] X_{n-1}[/tex] and not only on [tex] X_n[/tex]. We can conclude that we don't have a Markov chain in this case.
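The two conditional probabilities above can be checked numerically. Here is a minimal Monte Carlo sketch, assuming the [tex] Y_n[/tex] are i.i.d. Bernoulli(1/2) (fair coin flips) and [tex] X_n = Y_n + Y_{n-1}[/tex]; the variable names are illustrative only:

```python
import random

# Assumption: Y_n are i.i.d. Bernoulli(1/2); X_n = Y_n + Y_{n-1}.
random.seed(0)
N = 200_000
Y = [random.randint(0, 1) for _ in range(N)]
X = [Y[k] + Y[k - 1] for k in range(1, N)]  # X[m] = Y[m+1] + Y[m]

# Count transitions to X_{n+1} = 2, conditioned on the two histories.
hits_10 = total_10 = 0   # history (X_{n-1} = 0, X_n = 1)
hits_12 = total_12 = 0   # history (X_{n-1} = 2, X_n = 1)
for n in range(1, len(X) - 1):
    if X[n] == 1 and X[n - 1] == 0:
        total_10 += 1
        hits_10 += (X[n + 1] == 2)
    if X[n] == 1 and X[n - 1] == 2:
        total_12 += 1
        hits_12 += (X[n + 1] == 2)

p_given_0 = hits_10 / total_10   # close to 1/2
p_given_2 = hits_12 / total_12   # exactly 0: X_{n+1} = 2 is impossible here
print(p_given_0, p_given_2)
```

The estimate conditioned on [tex] i'=0[/tex] hovers near 1/2, while the one conditioned on [tex] i'=2[/tex] is exactly 0, matching the argument above.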