Ergodicity and mixing

Things that probably happen eventually on average

October 17, 2011 — February 13, 2022

dynamical systems
Markov processes
statmech
time series

🏗

The relevance of ergodicity and mixing to actual stochastic processes and dynamical systems, especially linear and nonlinear system identification.

Keywords to look up:

Not much material here yet, but please see learning theory for dependent data for some interesting categorisations of mixing, and for results that transcend miscellaneous mixing conditions for statistical estimators.

My main interest is the following 4-stages-of-grief kind of setup.

  1. Often I can prove that I can learn a thing from my data if the data are stationary.
  2. But I rarely have stationarity, so it might be more useful to show that the estimator still works for merely ergodic data, which would follow from appropriate mixing conditions that do not necessarily assume stationarity (see the toy simulation after this list).
  3. Except that such mixing conditions are often hard to prove or to estimate, or require knowing the very parameters in question, so perhaps some kind of partial identifiability result is closer to what I need.
  4. Furthermore, I would usually prefer a finite-sample result to an asymptotic guarantee. Sometimes I can get one from learning theory for dependent data.

That last one is TBC.
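To make stages 1 and 2 concrete, here is a toy simulation (my own illustration, not drawn from any of the references below). An ergodic AR(1) process has a time average that recovers the ensemble mean; a stationary-but-non-ergodic mixture has a time average stuck forever with a coin flip made at time zero.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
a = 0.9

# Ergodic case: AR(1), x_{t+1} = a x_t + e_t with |a| < 1.
x, xs = 0.0, np.empty(n)
for t in range(n):
    x = a * x + rng.standard_normal()
    xs[t] = x
print(xs.mean())  # ≈ 0: the time average recovers the ensemble mean

# Non-ergodic case: the level mu is drawn once, at time zero.
# Marginally the process is a symmetric mixture with ensemble mean 0,
# but each realised time average converges to +5 or -5 instead.
mu = 5.0 * rng.choice([-1.0, 1.0])
y, ys = mu, np.empty(n)
for t in range(n):
    y = mu + a * (y - mu) + rng.standard_normal()
    ys[t] = y
print(ys.mean())  # ≈ +5 or -5, depending on the coin flip at time zero
```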

1 Coupling from the past

Dan Piponi explains coupling from the past for Markov chains via functional programming.
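For concreteness, here is a minimal sketch of monotone coupling from the past in the style of Propp and Wilson (1996), applied to a lazy reflecting random walk on $\{0, \dots, n\}$; the particular chain and its parameters are illustrative assumptions, not anything from Piponi's post.

```python
import random

def update(x, u, n=10, p=0.5):
    """One monotone random update of a lazy reflecting walk on {0, ..., n}.

    The same uniform u drives every state, and x -> update(x, u) is
    monotone in x, so running the chain from 0 and from n sandwiches
    every other start state.
    """
    if u < p / 2:
        return min(x + 1, n)   # everyone steps up (clamped at n)
    elif u < p:
        return max(x - 1, 0)   # everyone steps down (clamped at 0)
    return x                   # lazy: stay put

def cftp(n=10, seed=None):
    """Propp-Wilson monotone CFTP: an exact draw from the stationary
    distribution (uniform on {0, ..., n} for this symmetric walk)."""
    rng = random.Random(seed)
    us = []   # randomness for times -1, -2, ...; us[k] drives time -(k+1)
    T = 1
    while True:
        while len(us) < T:
            us.append(rng.random())     # extend further into the past
        lo, hi = 0, n                   # bottom and top chains at time -T
        for t in range(T - 1, -1, -1):  # apply updates at times -T, ..., -1
            lo, hi = update(lo, us[t], n), update(hi, us[t], n)
        if lo == hi:                    # coalesced: value at time 0 is exact
            return lo
        T *= 2                          # restart further back, REUSING old us

print([cftp(seed=s) for s in range(10)])
```

The essential trick is that the random input for each past time step is drawn once and reused as the start time recedes further into the past; only then is the coalesced value at time 0 an exact draw from the stationary distribution.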

2 Mixing zoo

A recommended partial overview is Bradley (2005). 🏗

2.1 β-mixing

🏗
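While this section is under construction, one common formulation (following Bradley 2005): writing $\mathcal{F}_{-\infty}^{t}=\sigma(X_s : s \le t)$ and $\mathcal{F}_{t+k}^{\infty}=\sigma(X_s : s \ge t+k)$ for the past and the $k$-step-removed future of the process,

$$\beta(k) = \sup_{t}\, \mathbb{E}\left[ \sup_{B \in \mathcal{F}_{t+k}^{\infty}} \bigl| \mathbb{P}(B \mid \mathcal{F}_{-\infty}^{t}) - \mathbb{P}(B) \bigr| \right],$$

and the process is β-mixing (absolutely regular) when $\beta(k) \to 0$ as $k \to \infty$.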

2.2 ϕ-mixing

🏗
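Again as a stopgap, the standard definition in the same notation:

$$\phi(k) = \sup_{t}\, \sup \bigl\{ \bigl| \mathbb{P}(B \mid A) - \mathbb{P}(B) \bigr| : A \in \mathcal{F}_{-\infty}^{t},\, \mathbb{P}(A) > 0,\, B \in \mathcal{F}_{t+k}^{\infty} \bigr\},$$

and the process is ϕ-mixing when $\phi(k) \to 0$. ϕ-mixing implies β-mixing, which in turn implies α-(strong) mixing; see Bradley (2005) for the full hierarchy and the counterexamples separating its levels.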

2.3 Sequential Rademacher complexity

🏗
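Not a mixing condition at all, but a route around them. If I recall the definition of Rakhlin, Sridharan, and Tewari correctly, the sequential Rademacher complexity of a function class $\mathcal{F}$ is

$$\mathfrak{R}_n(\mathcal{F}) = \sup_{\mathbf{z}} \mathbb{E}_{\epsilon} \left[ \sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{t=1}^{n} \epsilon_t\, f\bigl(\mathbf{z}_t(\epsilon_{1:t-1})\bigr) \right],$$

where the supremum ranges over complete binary trees $\mathbf{z}$ of depth $n$ with values in the data space and $\epsilon_1, \dots, \epsilon_n$ are i.i.d. Rademacher signs. It plays the role that ordinary Rademacher complexity plays in the i.i.d. setting, but for dependent, adaptively generated data.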

3 Lyapunov exponents

John D. Cook says:

> Chaotic systems are unpredictable. Or rather chaotic systems are not deterministically predictable in the long run. You can make predictions if you weaken one of these requirements. You can make deterministic predictions in the short run, or statistical predictions in the long run. Lyapunov exponents are a way to measure how quickly the short run turns into the long run.
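To make that quantitative: for a one-dimensional map $x_{t+1} = f(x_t)$, the Lyapunov exponent is the time average of $\log |f'(x_t)|$, and for the logistic map at $r = 4$ it is known to equal $\log 2$. A minimal sketch of the estimate (my own illustration, not from Cook's post):

```python
import math

def logistic_lyapunov(r=4.0, x0=0.3, n=100_000, burn=1_000):
    """Estimate the Lyapunov exponent of x -> r x (1 - x) as the
    time average of log |f'(x_t)|, with f'(x) = r (1 - 2x)."""
    x = x0
    for _ in range(burn):              # discard the transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return total / n

print(logistic_lyapunov())  # ≈ log 2 ≈ 0.693 for r = 4: chaotic
```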

4 References

Berry, Giannakis, and Harlim. 2020. “Bridging Data Science and Dynamical Systems Theory.” arXiv:2002.07928 [Physics, Stat].
Bradley. 2005. “Basic Properties of Strong Mixing Conditions. A Survey and Some Open Questions.” Probability Surveys.
Brémaud, and Massoulié. 2001. “Hawkes Branching Point Processes Without Ancestors.” Journal of Applied Probability.
Diaconis, and Freedman. 1999. “Iterated Random Functions.” SIAM Review.
Gray. 1987. Probability, Random Processes, and Ergodic Properties.
Keane, and Petersen. 2006. “Easy and Nearly Simultaneous Proofs of the Ergodic Theorem and Maximal Ergodic Theorem.” IMS Lecture Notes–Monograph Series: Dynamics & Stochastics.
Kuznetsov, and Mohri. 2014. “Generalization Bounds for Time Series Prediction with Non-Stationary Processes.” In Algorithmic Learning Theory. Lecture Notes in Computer Science.
———. 2016. “Generalization Bounds for Non-Stationary Mixing Processes.” Machine Learning.
Livan, Inoue, and Scalas. 2012. “On the Non-Stationarity of Financial Time Series: Impact on Optimal Portfolio Selection.” Journal of Statistical Mechanics: Theory and Experiment.
McDonald, Shalizi, and Schervish. 2011. “Risk Bounds for Time Series Without Strong Mixing.” arXiv:1106.0730 [Cs, Stat].
Mohri, and Rostamizadeh. 2009. “Stability Bounds for Stationary ϕ-Mixing and β-Mixing Processes.” Journal of Machine Learning Research.
Morvai, Yakowitz, and Györfi. 1996. “Nonparametric Inference for Ergodic, Stationary Time Series.” The Annals of Statistics.
Palmer. 1982. “Broken Ergodicity.” Advances in Physics.
Propp, and Wilson. 1996. “Exact Sampling with Coupled Markov Chains and Applications to Statistical Mechanics.” Random Structures & Algorithms.
———. 1998. “Coupling from the Past: A User’s Guide.” In Microsurveys in Discrete Probability. DIMACS Series in Discrete Mathematics and Theoretical Computer Science.
Rosenblatt. 1984. “Asymptotic Normality, Strong Mixing and Spectral Density Estimates.” The Annals of Probability.
Ryabko, and Ryabko. 2010. “Nonparametric Statistical Inference for Ergodic Processes.” IEEE Transactions on Information Theory.
Shao, and Wu. 2007. “Asymptotic Spectral Theory for Nonlinear Time Series.” The Annals of Statistics.
Shields. 1998. “The Interactions Between Ergodic Theory and Information Theory.” IEEE Transactions on Information Theory.
Steif. 1997. “Consistent Estimation of Joint Distributions for Sufficiently Mixing Random Fields.” The Annals of Statistics.
Stein, and Newman. 1995. “Broken Ergodicity and the Geometry of Rugged Landscapes.” Physical Review E.
Thouvenot, and Weiss. 2012. “Limit Laws for Ergodic Processes.” Stochastics and Dynamics.
van Delft, and Eichler. 2016. “Locally Stationary Functional Time Series.” arXiv:1602.05125 [Math, Stat].