Ambrogioni, Güçlü, Güçlütürk, et al. 2018. “Wasserstein Variational Inference.” In Proceedings of the 32nd International Conference on Neural Information Processing Systems. NIPS’18.
Arjovsky, Chintala, and Bottou. 2017. “Wasserstein Generative Adversarial Networks.” In International Conference on Machine Learning.
Bissiri, Holmes, and Walker. 2016. “A General Framework for Updating Belief Distributions.” Journal of the Royal Statistical Society: Series B (Statistical Methodology).
Campbell and Broderick. 2017. “Automated Scalable Bayesian Inference via Hilbert Coresets.” arXiv:1710.05053 [cs, stat].
Cherief-Abdellatif and Alquier. 2020. “MMD-Bayes: Robust Bayesian Estimation via Maximum Mean Discrepancy.” In Proceedings of the 2nd Symposium on Advances in Approximate Bayesian Inference.
Fernholz. 1983. Von Mises Calculus for Statistical Functionals. Lecture Notes in Statistics 19.
———. 2014. “Statistical Functionals.” In Wiley StatsRef: Statistics Reference Online.
Frogner, Zhang, Mobahi, et al. 2015. “Learning with a Wasserstein Loss.” In Advances in Neural Information Processing Systems 28.
Gibbs and Su. 2002. “On Choosing and Bounding Probability Metrics.” International Statistical Review.
Gulrajani, Ahmed, Arjovsky, et al. 2017. “Improved Training of Wasserstein GANs.” arXiv:1704.00028 [cs, stat].
Guo, Hong, Lin, et al. 2017. “Relaxed Wasserstein with Applications to GANs.” arXiv:1705.07164 [cs, stat].
Knoblauch, Jewson, and Damoulas. 2022. “An Optimization-Centric View on Bayes’ Rule: Reviewing and Generalizing Variational Inference.” Journal of Machine Learning Research.
Liu, Huidong, Gu, and Samaras. 2018. “A Two-Step Computation of the Exact GAN Wasserstein Distance.” In International Conference on Machine Learning.
Liu, Qiang, Lee, and Jordan. 2016. “A Kernelized Stein Discrepancy for Goodness-of-Fit Tests.” In Proceedings of the 33rd International Conference on Machine Learning.
Lyddon, Walker, and Holmes. 2018. “Nonparametric Learning from Bayesian Models with Randomized Objective Functions.” In Proceedings of the 32nd International Conference on Neural Information Processing Systems. NIPS’18.
Mahdian, Blanchet, and Glynn. 2019. “Optimal Transport Relaxations with Application to Wasserstein GANs.” arXiv:1906.03317 [cs, math, stat].
Matsubara, Knoblauch, Briol, et al. 2022. “Robust Generalised Bayesian Inference for Intractable Likelihoods.” Journal of the Royal Statistical Society Series B: Statistical Methodology.
Ostrovski, Dabney, and Munos. 2018. “Autoregressive Quantile Networks for Generative Modeling.” In International Conference on Machine Learning.
Panaretos and Zemel. 2019. “Statistical Aspects of Wasserstein Distances.” Annual Review of Statistics and Its Application.
Ranganath, Tran, Altosaar, et al. 2016. “Operator Variational Inference.” In Advances in Neural Information Processing Systems 29.
Santambrogio. 2015. Optimal Transport for Applied Mathematicians. Progress in Nonlinear Differential Equations and Their Applications.
Schmon, Cannon, and Knoblauch. 2021. “Generalized Posteriors in Approximate Bayesian Computation.” arXiv:2011.08644 [stat].
Wang, Prince Zizhuang, and Wang. 2019. “Riemannian Normalizing Flow on Variational Wasserstein Autoencoder for Text Modeling.” In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers).
Zhang, Walder, Bonilla, et al. 2020. “Quantile Propagation for Wasserstein-Approximate Gaussian Processes.” In Advances in Neural Information Processing Systems 33.