Sequential Monte Carlo
Particle filters, even if the objective is not necessarily a filter
July 25, 2014 — October 18, 2024
Bayes
Monte Carlo
particle
probabilistic algorithms
probability
sciml
signal processing
state space models
statistics
swarm
time series
A Monte Carlo method that updates a population of samples through a sequence of nested reweight-and-resample steps, incorporating successively more information about the estimand at each stage. This mildly generalises particle filters, although sometimes the difference is only in the interpretation of the maths, since the machinery works most naturally when the problem can be cast as a state-space model.
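To make the nesting concrete, here is a minimal sketch of the bootstrap flavour in NumPy: propagate, reweight, resample. The `init_sample`, `transition_sample` and `log_likelihood` callables are hypothetical stand-ins for whatever state-space model is at hand, and the sketch resamples naively at every step.

```python
import numpy as np


def bootstrap_particle_filter(ys, n_particles, init_sample, transition_sample,
                              log_likelihood, rng=None):
    """Minimal bootstrap particle filter for a generic state-space model.

    ys                        : iterable of observations y_1, ..., y_T
    init_sample(rng, n)       : (n, d) array of draws from the prior p(x_0)
    transition_sample(rng, x) : particles propagated through p(x_t | x_{t-1})
    log_likelihood(y, x)      : (n,) array of log p(y_t | x_t)
    Returns the final particle cloud and an estimate of log p(y_{1:T}).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = init_sample(rng, n_particles)
    log_evidence = 0.0
    for y in ys:
        # Mutation: propagate each particle through the state transition.
        x = transition_sample(rng, x)
        # Reweighting: score the new particles against the observation.
        logw = log_likelihood(y, x)
        shift = logw.max()
        w = np.exp(logw - shift)
        log_evidence += shift + np.log(w.mean())  # running log p(y_{1:t})
        w /= w.sum()
        # Selection: multinomial resampling to combat weight degeneracy.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        x = x[idx]
    return x, log_evidence


# Toy usage on a 1-d random-walk-plus-noise model (hypothetical example).
T, sigma_x, sigma_y = 50, 1.0, 0.5
rng = np.random.default_rng(0)
true_x = np.cumsum(rng.normal(0, sigma_x, T))
ys = true_x + rng.normal(0, sigma_y, T)

particles, log_Z = bootstrap_particle_filter(
    ys, 1000,
    init_sample=lambda r, n: np.zeros((n, 1)),
    transition_sample=lambda r, x: x + r.normal(0, sigma_x, size=x.shape),
    log_likelihood=lambda y, x: (
        -0.5 * ((y - x[:, 0]) / sigma_y) ** 2
        - np.log(sigma_y * np.sqrt(2 * np.pi))
    ),
)
```

In practice one usually resamples only when the effective sample size drops below a threshold, and replaces plain multinomial draws with lower-variance schemes such as systematic or residual resampling; Chopin and Papaspiliopoulos (2020) cover the standard refinements.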
1 Introductions
- Pierre Jacob’s Particle methods for statistics reading list
- The lineage and reasoning are well explained by Cappé, Godsill, and Moulines (2007).
- Chopin and Papaspiliopoulos (2020)
2 Model selection within
🚧TODO🚧 clarify
3 Feynman-Kac formulae
See Feynman-Kac.
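For orientation, a sketch of the standard construction, with notation roughly following Chopin and Papaspiliopoulos (2020): take a Markov chain with initial law $M_0$ and transition kernels $M_t$, together with nonnegative potential functions $G_t$. The associated Feynman-Kac path measures are

$$
\mathbb{Q}_t(\mathrm{d} x_{0:t})
= \frac{1}{L_t}\, G_0(x_0) \prod_{s=1}^{t} G_s(x_{s-1}, x_s)\,
  M_0(\mathrm{d} x_0) \prod_{s=1}^{t} M_s(x_{s-1}, \mathrm{d} x_s),
$$

where $L_t$ is the normalising constant. Choosing $M_t$ to be the state transition of a state-space model and $G_t(x_{t-1}, x_t) = p(y_t \mid x_t)$ recovers the bootstrap filter above: $\mathbb{Q}_t$ is then the smoothing distribution $p(x_{0:t} \mid y_{1:t})$ and $L_t$ the marginal likelihood $p(y_{1:t})$. Other choices of $M_t$ and $G_t$ give SMC samplers, rare-event estimators and so on.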
3.1 On weird graphs
SMC is not confined to chain-structured models; Naesseth, Lindsten, and Schön (2014) extend it to general graphical models.
4 References
Andrieu, Doucet, and Holenstein. 2010. “Particle Markov Chain Monte Carlo Methods.” Journal of the Royal Statistical Society: Series B (Statistical Methodology).
Arulampalam, Maskell, Gordon, et al. 2002. “A Tutorial on Particle Filters for Online Nonlinear/Non-Gaussian Bayesian Tracking.” IEEE Transactions on Signal Processing.
Bon, Lee, and Drovandi. 2021. “Accelerating Sequential Monte Carlo with Surrogate Likelihoods.”
Cappé, Godsill, and Moulines. 2007. “An Overview of Existing Methods and Recent Advances in Sequential Monte Carlo.” Proceedings of the IEEE.
Cérou, and Guyader. 2016. “Fluctuation Analysis of Adaptive Multilevel Splitting.” The Annals of Applied Probability.
Cérou, Del Moral, Furon, et al. 2011. “Sequential Monte Carlo for Rare Event Estimation.” Statistics and Computing.
Chopin, and Papaspiliopoulos. 2020. An Introduction to Sequential Monte Carlo. Springer Series in Statistics.
Davies, Ley-Cooper, Sutton, et al. 2021. “Bayesian Detectability of Induced Polarisation in Airborne Electromagnetic Data Using Reversible Jump Sequential Monte Carlo.”
Del Moral, Doucet, and Jasra. 2006. “Sequential Monte Carlo Samplers.” Journal of the Royal Statistical Society: Series B (Statistical Methodology).
———. 2011. “An Adaptive Sequential Monte Carlo Method for Approximate Bayesian Computation.” Statistics and Computing.
Doucet, Freitas, and Gordon. 2001a. Sequential Monte Carlo Methods in Practice.
———. 2001b. “An Introduction to Sequential Monte Carlo Methods.” In Sequential Monte Carlo Methods in Practice. Statistics for Engineering and Information Science.
Doucet, Godsill, and Andrieu. 2000. “On Sequential Monte Carlo Sampling Methods for Bayesian Filtering.” Statistics and Computing.
Gu, Ghahramani, and Turner. 2015. “Neural Adaptive Sequential Monte Carlo.” In Advances in Neural Information Processing Systems 28.
Gunawan, Dang, Quiroz, et al. 2018. “Subsampling Sequential Monte Carlo for Static Bayesian Models.”
Johansen, Del Moral, and Doucet. 2006. “Sequential Monte Carlo Samplers for Rare Events.” In Proceedings of the 6th International Workshop on Rare Event Simulation.
Naesseth, Lindsten, and Schön. 2014. “Sequential Monte Carlo for Graphical Models.” In Advances in Neural Information Processing Systems.
———. 2022. “Elements of Sequential Monte Carlo.” arXiv:1903.04797 [Cs, Stat].
Neal. 1998. “Annealed Importance Sampling.”
Rubinstein, and Kroese. 2016. Simulation and the Monte Carlo Method. Wiley series in probability and statistics.
Rubinstein, Ridder, and Vaisman. 2014. Fast Sequential Monte Carlo Methods for Counting and Optimization. Wiley Series in Probability and Statistics.
Salomone, South, Drovandi, et al. 2018. “Unbiased and Consistent Nested Sampling via Sequential Monte Carlo.”
Sisson, Fan, and Tanaka. 2007. “Sequential Monte Carlo Without Likelihoods.” Proceedings of the National Academy of Sciences.
Vergé, Dubarry, Del Moral, et al. 2013. “On Parallel Implementation of Sequential Monte Carlo Methods: The Island Particle Model.” Statistics and Computing.
Zhao, Mair, Schön, et al. 2024. “On Feynman-Kac Training of Partial Bayesian Neural Networks.” In Proceedings of The 27th International Conference on Artificial Intelligence and Statistics.