Abbasnejad, Dick, and Hengel. 2016.
“Infinite Variational Autoencoder for Semi-Supervised Learning.” In
Advances in Neural Information Processing Systems 29.
Alexos, Boyd, and Mandt. 2022.
“Structured Stochastic Gradient MCMC.” In
Proceedings of the 39th International Conference on Machine Learning.
Alquier. 2021.
“User-Friendly Introduction to PAC-Bayes Bounds.” arXiv:2110.11216 [Cs, Math, Stat].
Archer, Park, Buesing, et al. 2015.
“Black Box Variational Inference for State Space Models.” arXiv:1511.07367 [Stat].
Bazzani, Torresani, and Larochelle. 2017. “Recurrent Mixture Density Network for Spatiotemporal Visual Attention.”
Bishop, Christopher. 1994.
“Mixture Density Networks.” Microsoft Research.
Bishop, Christopher M. 2006. Pattern Recognition and Machine Learning. Information Science and Statistics.
Blundell, Cornebise, Kavukcuoglu, et al. 2015.
“Weight Uncertainty in Neural Networks.” In
Proceedings of the 32nd International Conference on Machine Learning - Volume 37. ICML’15.
Bora, Jalal, Price, et al. 2017.
“Compressed Sensing Using Generative Models.” In
International Conference on Machine Learning.
Breslow, and Clayton. 1993.
“Approximate Inference in Generalized Linear Mixed Models.” Journal of the American Statistical Association.
Bui, Ravi, and Ramavajjala. 2017.
“Neural Graph Machines: Learning Neural Networks Using Graphs.” arXiv:1703.04818 [Cs].
Chen, Tian Qi, Rubanova, Bettencourt, et al. 2018.
“Neural Ordinary Differential Equations.” In
Advances in Neural Information Processing Systems 31.
Chen, Wilson Ye, Mackey, Gorham, et al. 2018.
“Stein Points.” In
Proceedings of the 35th International Conference on Machine Learning.
Cutajar, Bonilla, Michiardi, et al. 2017.
“Random Feature Expansions for Deep Gaussian Processes.” In
Proceedings of the 34th International Conference on Machine Learning. PMLR.
Damianou, and Lawrence. 2013.
“Deep Gaussian Processes.” In
Artificial Intelligence and Statistics.
Dandekar, Chung, Dixit, et al. 2021.
“Bayesian Neural Ordinary Differential Equations.” arXiv:2012.07244 [Cs].
Daxberger, Kristiadi, Immer, et al. 2021.
“Laplace Redux — Effortless Bayesian Deep Learning.” arXiv:2106.14806 [Cs, Stat].
de Castro, and Dorigo. 2019.
“INFERNO: Inference-Aware Neural Optimisation.” Computer Physics Communications.
Dezfouli, and Bonilla. 2015.
“Scalable Inference for Gaussian Process Models with Black-Box Likelihoods.” In
Advances in Neural Information Processing Systems 28. NIPS’15.
Doerr, Daniel, Schiegg, et al. 2018.
“Probabilistic Recurrent State-Space Models.” arXiv:1801.10395 [Stat].
Dunlop, Girolami, Stuart, et al. 2018.
“How Deep Are Deep Gaussian Processes?” Journal of Machine Learning Research.
Dupont, Doucet, and Teh. 2019.
“Augmented Neural ODEs.” arXiv:1904.01681 [Cs, Stat].
Durasov, Bagautdinov, Baque, et al. 2021.
“Masksembles for Uncertainty Estimation.” In
2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
Dutordoir, Hensman, van der Wilk, et al. 2021.
“Deep Neural Networks as Point Estimates for Deep Gaussian Processes.” arXiv:2105.04504 [Cs, Stat].
Eleftheriadis, Nicholson, Deisenroth, et al. 2017.
“Identification of Gaussian Process State Space Models.” In
Advances in Neural Information Processing Systems 30.
Fabius, and van Amersfoort. 2014.
“Variational Recurrent Auto-Encoders.” In
Proceedings of ICLR.
Figurnov, Mohamed, and Mnih. 2018.
“Implicit Reparameterization Gradients.” In
Advances in Neural Information Processing Systems 31.
Flaxman, Wilson, Neill, et al. 2015. “Fast Kronecker Inference in Gaussian Processes with Non-Gaussian Likelihoods.”
Foong, Li, Hernández-Lobato, et al. 2019.
“‘In-Between’ Uncertainty in Bayesian Neural Networks.” arXiv:1906.11537 [Cs, Stat].
Gal. 2015. “Rapid Prototyping of Probabilistic Models: Emerging Challenges in Variational Inference.” In Advances in Approximate Bayesian Inference Workshop, NIPS.
———. 2016. “Uncertainty in Deep Learning.”
Gal, and Ghahramani. 2015a. “On Modern Deep Learning and Variational Inference.” In Advances in Approximate Bayesian Inference Workshop, NIPS.
———. 2015b.
“Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning.” In
Proceedings of the 33rd International Conference on Machine Learning (ICML-16).
———. 2016b.
“Bayesian Convolutional Neural Networks with Bernoulli Approximate Variational Inference.” In
4th International Conference on Learning Representations (ICLR) Workshop Track.
Gal, Hron, and Kendall. 2017.
“Concrete Dropout.” arXiv:1705.07832 [Stat].
Garnelo, Rosenbaum, Maddison, et al. 2018.
“Conditional Neural Processes.” arXiv:1807.01613 [Cs, Stat].
Garnelo, Schwarz, Rosenbaum, et al. 2018.
“Neural Processes.”
Gorad, Zhao, and Särkkä. 2020. “Parameter Estimation in Non-Linear State-Space Models by Automatic Differentiation of Non-Linear Kalman Filters.”
Graves. 2011.
“Practical Variational Inference for Neural Networks.” In
Proceedings of the 24th International Conference on Neural Information Processing Systems. NIPS’11.
Graves, Mohamed, and Hinton. 2013.
“Speech Recognition with Deep Recurrent Neural Networks.” In
2013 IEEE International Conference on Acoustics, Speech and Signal Processing.
Gregor, Danihelka, Graves, et al. 2015.
“DRAW: A Recurrent Neural Network For Image Generation.” arXiv:1502.04623 [Cs].
Gu, Ghahramani, and Turner. 2015.
“Neural Adaptive Sequential Monte Carlo.” In
Advances in Neural Information Processing Systems 28.
Gu, Levine, Sutskever, et al. 2016.
“MuProp: Unbiased Backpropagation for Stochastic Neural Networks.” In
Proceedings of ICLR.
He, Lakshminarayanan, and Teh. 2020.
“Bayesian Deep Ensembles via the Neural Tangent Kernel.” In
Advances in Neural Information Processing Systems.
Hu, Yang, Salakhutdinov, et al. 2018.
“On Unifying Deep Generative Models.” arXiv:1706.00550 [Cs, Stat].
Immer, Bauer, Fortuin, et al. 2021.
“Scalable Marginal Likelihood Estimation for Model Selection in Deep Learning.” In
Proceedings of the 38th International Conference on Machine Learning.
Immer, Korzepa, and Bauer. 2021.
“Improving Predictions of Bayesian Neural Nets via Local Linearization.” In
International Conference on Artificial Intelligence and Statistics.
Ingebrigtsen, Lindgren, and Steinsland. 2014.
“Spatial Models with Explanatory Variables in the Dependence Structure.” Spatial Statistics.
Izmailov, Maddox, Kirichenko, et al. 2020.
“Subspace Inference for Bayesian Deep Learning.” In
Proceedings of The 35th Uncertainty in Artificial Intelligence Conference.
Jospin, Buntine, Boussaid, et al. 2022.
“Hands-on Bayesian Neural Networks — a Tutorial for Deep Learning Users.” arXiv:2007.06823 [Cs, Stat].
Khan, Immer, Abedi, et al. 2020.
“Approximate Inference Turns Deep Networks into Gaussian Processes.” arXiv:1906.01930 [Cs, Stat].
Kingma, Diederik P., Salimans, Jozefowicz, et al. 2016.
“Improving Variational Inference with Inverse Autoregressive Flow.” In
Advances in Neural Information Processing Systems 29.
Kingma, Diederik P., and Welling. 2014.
“Auto-Encoding Variational Bayes.” In
ICLR 2014 Conference.
Krishnan, Shalit, and Sontag. 2015.
“Deep Kalman Filters.” arXiv:1511.05121.
Kristiadi, Hein, and Hennig. 2021.
“Learnable Uncertainty Under Laplace Approximations.” In
Uncertainty in Artificial Intelligence.
Larsen, Sønderby, Larochelle, et al. 2015.
“Autoencoding Beyond Pixels Using a Learned Similarity Metric.” arXiv:1512.09300 [Cs, Stat].
Le, Baydin, and Wood. 2017.
“Inference Compilation and Universal Probabilistic Programming.” In
Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS). Proceedings of Machine Learning Research.
Lee, Jaehoon, Bahri, Novak, et al. 2018.
“Deep Neural Networks as Gaussian Processes.” In
ICLR.
Le, Igl, Jin, et al. 2017.
“Auto-Encoding Sequential Monte Carlo.” arXiv:1705.10306.
Lindgren, and Rue. 2015.
“Bayesian Spatial Modelling with R-INLA.” Journal of Statistical Software.
Liu, Qiang, and Wang. 2019.
“Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm.” In
Advances in Neural Information Processing Systems.
Lobacheva, Chirkova, and Vetrov. 2017.
“Bayesian Sparsification of Recurrent Neural Networks.” In
Workshop on Learning to Generate Natural Language.
Louizos, Shalit, Mooij, et al. 2017.
“Causal Effect Inference with Deep Latent-Variable Models.” In
Advances in Neural Information Processing Systems 30.
MacKay. 2002. Information Theory, Inference & Learning Algorithms.
Maddison, Lawson, Tucker, et al. 2017.
“Filtering Variational Objectives.” arXiv:1705.09279.
Martens, and Grosse. 2015.
“Optimizing Neural Networks with Kronecker-Factored Approximate Curvature.” In
Proceedings of the 32nd International Conference on Machine Learning.
Matthews, Rowland, Hron, et al. 2018.
“Gaussian Process Behaviour in Wide Deep Neural Networks.” arXiv:1804.11271 [Cs, Stat].
Matthews, van der Wilk, Nickson, et al. 2016.
“GPflow: A Gaussian Process Library Using TensorFlow.” arXiv:1610.08733 [Stat].
Molchanov, Ashukha, and Vetrov. 2017.
“Variational Dropout Sparsifies Deep Neural Networks.” In
Proceedings of ICML.
Ngiam, Chen, Koh, et al. 2011.
“Learning Deep Energy Models.” In
Proceedings of the 28th International Conference on Machine Learning (ICML-11).
Ovadia, Fertig, Ren, et al. 2019.
“Can You Trust Your Model’s Uncertainty? Evaluating Predictive Uncertainty Under Dataset Shift.” In
Proceedings of the 33rd International Conference on Neural Information Processing Systems.
Pan, Kuo, Rilee, et al. 2021.
“Assessing Deep Neural Networks as Probability Estimators.” arXiv:2111.08239 [Cs, Stat].
Papadopoulos, Edwards, and Murray. 2001.
“Confidence Estimation Methods for Neural Networks: A Practical Comparison.” IEEE Transactions on Neural Networks.
Papamakarios, Murray, and Pavlakou. 2017.
“Masked Autoregressive Flow for Density Estimation.” In
Advances in Neural Information Processing Systems 30.
Partee, Ringenburg, Robbins, et al. 2019. “Model Parameter Optimization: ML-Guided Trans-Resolution Tuning of Physical Models.”
Peluchetti, and Favaro. 2020.
“Infinitely Deep Neural Networks as Diffusion Processes.” In
International Conference on Artificial Intelligence and Statistics.
Petersen, and Pedersen. 2012.
“The Matrix Cookbook.”
Piterbarg, and Fatalov. 1995.
“The Laplace Method for Probability Measures in Banach Spaces.” Russian Mathematical Surveys.
Rasmussen, and Williams. 2006.
Gaussian Processes for Machine Learning. Adaptive Computation and Machine Learning.
Rezende, Danilo Jimenez, and Mohamed. 2015.
“Variational Inference with Normalizing Flows.” In
International Conference on Machine Learning. ICML’15.
Rezende, Danilo J, Racanière, Higgins, et al. 2019. “Equivariant Hamiltonian Flows.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS).
Rue, Riebler, Sørbye, et al. 2016.
“Bayesian Computing with INLA: A Review.” arXiv:1604.00860 [Stat].
Ruiz, Titsias, and Blei. 2016.
“The Generalized Reparameterization Gradient.” In
Advances in Neural Information Processing Systems.
Sanchez-Gonzalez, Bapst, Battaglia, et al. 2019. “Hamiltonian Graph Networks with ODE Integrators.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS).
Saumard, and Wellner. 2014.
“Log-Concavity and Strong Log-Concavity: A Review.” arXiv:1404.5886 [Math, Stat].
Sigrist, Künsch, and Stahel. 2015.
“Stochastic Partial Differential Equation Based Modelling of Large Space-Time Data Sets.” Journal of the Royal Statistical Society: Series B (Statistical Methodology).
Snoek, Rippel, Swersky, et al. 2015.
“Scalable Bayesian Optimization Using Deep Neural Networks.” In
Proceedings of the 32nd International Conference on Machine Learning.
Tran, Dustin, Dusenberry, van der Wilk, et al. 2018.
“Bayesian Layers: A Module for Neural Network Uncertainty.”
Tran, Dustin, Hoffman, Saurous, et al. 2017.
“Deep Probabilistic Programming.” In
ICLR.
Tran, Dustin, Kucukelbir, Dieng, et al. 2016.
“Edward: A Library for Probabilistic Modeling, Inference, and Criticism.” arXiv:1610.09787 [Cs, Stat].
Tran, M.-N., Nguyen, Nott, et al. 2019.
“Bayesian Deep Net GLM and GLMM.” Journal of Computational and Graphical Statistics.
Tran, Ba-Hien, Rossi, Milios, et al. 2021.
“Model Selection for Bayesian Autoencoders.” In
Advances in Neural Information Processing Systems.
Tran, Ba-Hien, Rossi, Milios, et al. 2022.
“All You Need Is a Good Functional Prior for Bayesian Deep Learning.” Journal of Machine Learning Research.
van den Berg, Hasenclever, Tomczak, et al. 2018.
“Sylvester Normalizing Flows for Variational Inference.” In
Proceedings of the 34th Conference on Uncertainty in Artificial Intelligence (UAI 2018).
Wainwright, and Jordan. 2005. “A Variational Principle for Graphical Models.” In New Directions in Statistical Signal Processing.
Watson, Lin, Klink, et al. 2020. “Neural Linear Models with Functional Gaussian Process Priors.”
Weber, Starc, Mittal, et al. 2018.
“Optimizing over a Bayesian Last Layer.” In
NeurIPS Workshop on Bayesian Deep Learning.
Wenzel, Roth, Veeling, et al. 2020.
“How Good Is the Bayes Posterior in Deep Neural Networks Really?” In
Proceedings of the 37th International Conference on Machine Learning.
Zeevi, and Meir. 1997.
“Density Estimation Through Convex Combinations of Densities: Approximation and Estimation Bounds.” Neural Networks: The Official Journal of the International Neural Network Society.