Two Critical Properties: Convergence and Uniqueness
Convergence: no matter where we start, we eventually sample from the target distribution
Uniqueness: there is only one stationary distribution our chain can converge to
Convergence
Starting distribution doesn't matter after sufficient iterations
Like tourists from anywhere in Ohio all becoming proper Columbus residents given enough time
Mathematically:
\[ \lim_{n\to\infty} (T^n \tilde{p})(\mathbf{x}) = p(\mathbf{x}), \quad \text{for all } \tilde{p}(\mathbf{x}) \]
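A minimal numerical sketch of this limit. The 3-state transition matrix below is made up for illustration (the states standing in for hypothetical neighborhoods); the point is that two very different starting distributions are driven to the same stationary distribution under repeated application of \(T\).

```python
import numpy as np

# A toy 3-state "travel guide": T[i, j] = probability of moving
# from state j to state i, so each column sums to 1.
# The numbers here are invented for illustration.
T = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.1],
              [0.2, 0.2, 0.6]])

def iterate(T, p, n):
    """Apply the transition operator n times: T^n p."""
    for _ in range(n):
        p = T @ p
    return p

# Two very different starting distributions...
p1 = np.array([1.0, 0.0, 0.0])
p2 = np.array([0.1, 0.1, 0.8])

# ...converge to (numerically) the same stationary distribution.
a = iterate(T, p1, 100)
b = iterate(T, p2, 100)
print(np.allclose(a, b))  # True: the start didn't matter
```

Because every entry of this \(T\) is positive, the chain is irreducible and aperiodic, which is exactly what rules out the counterexamples discussed below.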
Uniqueness:
Uniqueness means if \((T p_1)(\mathbf{x}) = p_1(\mathbf{x})\) and \((T p_2)(\mathbf{x}) = p_2(\mathbf{x})\), then \(p_1(\mathbf{x}) = p_2(\mathbf{x})\)
Our travel guide must ultimately lead everyone to sample from one and the same distribution
Making sure our guide doesn't turn us into Cincinnati residents!
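The uniqueness condition \(T p = p\) can be checked directly: stationary distributions are eigenvectors of \(T\) with eigenvalue 1. A sketch, using the same kind of made-up 3-state transition matrix as above:

```python
import numpy as np

# A hypothetical 3-state transition matrix (columns sum to 1).
T = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.1],
              [0.2, 0.2, 0.6]])

# Stationary distributions satisfy T p = p, i.e. they are
# eigenvectors of T with eigenvalue 1.
vals, vecs = np.linalg.eig(T)
ones = np.isclose(vals, 1.0)

# For this chain exactly one eigenvalue equals 1, so the
# stationary distribution is unique.
print(int(ones.sum()))  # 1

p = np.real(vecs[:, ones][:, 0])
p = p / p.sum()  # normalize to a probability distribution
assert np.allclose(T @ p, p)
```

If two distinct distributions \(p_1 \neq p_2\) both satisfied \(T p = p\), the eigenvalue 1 would have multiplicity greater than one; counting it is a concrete way to test uniqueness.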
Neither convergence nor uniqueness is guaranteed for all Markov chains
Counterexample 1: Uniqueness Failure
Consider the "stay where you are" travel guide: \(T = \mathbf{Id}\)
All starting distributions are stationary: no one ever moves
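The identity guide makes the uniqueness failure easy to verify: for \(T = \mathbf{Id}\), every distribution \(p\) trivially satisfies \(T p = p\), so there are infinitely many stationary distributions. A sketch for a 3-state chain:

```python
import numpy as np

# The "stay where you are" travel guide: the identity operator.
T = np.eye(3)

# Every starting distribution is stationary: T p = p for all p.
for p in (np.array([1.0, 0.0, 0.0]),
          np.array([0.2, 0.3, 0.5])):
    assert np.allclose(T @ p, p)  # no one ever moves
```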
Counterexample 2: Convergence Failure
Imagine Columbus with only two states: Short North and German Village
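The text does not spell out the transition rule here, but the standard convergence-failure counterexample for a two-state chain is a deterministic swap: everyone in Short North moves to German Village and vice versa. A sketch under that assumption:

```python
import numpy as np

# Two states: 0 = Short North, 1 = German Village.
# Assumed transition rule (the classic counterexample):
# everyone deterministically swaps neighborhoods each step.
T = np.array([[0.0, 1.0],
              [1.0, 0.0]])

p = np.array([1.0, 0.0])  # start in Short North with certainty

# The chain oscillates with period 2 and never converges:
p1 = T @ p        # one step: all mass in German Village
p2 = T @ p1       # two steps: back in Short North
print(p1, p2)     # T^n p never settles to a single limit
```

The unique stationary distribution here is the 50/50 split, yet no point-mass start ever converges to it: periodicity breaks convergence even when uniqueness holds.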