Generative flow nets

GFlowNets

November 11, 2021 — February 13, 2023

approximation
Bayes
generative
likelihood free
Monte Carlo
neural nets
optimization
probabilistic algorithms
probability
statistics
unsupervised

Placeholder.

A concept that Yoshua Bengio is excited about. Its name includes several keywords with which I am familiar (generative, flow), but I believe the concept itself is distinct from the mere concatenation of those terms.

Y. Bengio et al. (2022):

Generative Flow Networks (GFlowNets) have been introduced as a method to sample a diverse set of candidates in an active learning context, with a training objective that makes them approximately sample in proportion to a given reward function. In this paper, we show a number of additional theoretical properties of GFlowNets. They can be used to estimate joint probability distributions and the corresponding marginal distributions where some variables are unspecified and, of particular interest, can represent distributions over composite objects like sets and graphs. GFlowNets amortize the work typically done by computationally expensive MCMC methods in a single but trained generative pass. They could also be used to estimate partition functions and free energies, conditional probabilities of supersets (supergraphs) given a subset (subgraph), as well as marginal distributions over all supersets (supergraphs) of a given set (graph). We introduce variations enabling the estimation of entropy and mutual information, sampling from a Pareto frontier, connections to reward-maximising policies, and extensions to stochastic environments, continuous actions and modular energy functions.
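To make the "approximately sample in proportion to a given reward function" claim concrete, here is a minimal PyTorch sketch using the trajectory-balance objective, one of several GFlowNet training losses (distinct from the flow-matching loss of the 2021 paper). The toy environment, the reward, and every identifier here are my own illustrations, not anything from the papers above.

```python
import math

import torch
import torch.nn as nn

# Toy domain: compositional objects are binary strings of length N,
# built one bit at a time from the empty prefix.  With this
# prefix-building action space every state has a unique parent, so the
# backward-policy term of the objective vanishes (log P_B = 0).

N = 4

def reward(x):
    # Hypothetical reward: favour strings with more ones.
    return 1.0 + sum(x)

class ForwardPolicy(nn.Module):
    """Forward policy P_F(a | s) plus a learned log-partition log Z."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(N, 32), nn.ReLU(), nn.Linear(32, 2))
        self.log_Z = nn.Parameter(torch.zeros(()))

    def forward(self, prefix):
        # Encode the current prefix, padding unfilled positions with -1.
        pad = torch.tensor(prefix + [-1] * (N - len(prefix)), dtype=torch.float)
        return self.net(pad).log_softmax(-1)  # log-probs over actions {0, 1}

model = ForwardPolicy()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for step in range(2000):
    x, log_pf = [], torch.zeros(())
    for _ in range(N):  # roll out one construction trajectory
        log_probs = model(x)
        a = torch.distributions.Categorical(logits=log_probs).sample()
        log_pf = log_pf + log_probs[a]
        x.append(int(a))
    # Trajectory balance: (log Z + log P_F(tau) - log R(x) - log P_B(tau))^2,
    # with log P_B = 0 here because every state has a unique parent.
    loss = (model.log_Z + log_pf - math.log(reward(x))) ** 2
    opt.zero_grad()
    loss.backward()
    opt.step()

# After training, a forward rollout samples x with probability ≈ R(x) / Z:
# one generative pass amortises what an MCMC chain over the 2^N strings
# would otherwise have to do.
```

Note the design choice hiding in the comment: because each prefix has exactly one parent here, the backward policy drops out of the loss. On richer DAG-structured spaces such as sets or graphs, where objects have many construction orders, P_B must be fixed heuristically or learned alongside P_F.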

References

Bengio, Emmanuel, Jain, Korablyov, et al. 2021. “Flow Network Based Generative Models for Non-Iterative Diverse Candidate Generation.” In Advances in Neural Information Processing Systems.
Bengio, Yoshua, Lahlou, Deleu, et al. 2022. “GFlowNet Foundations.”
Jain, Deleu, Hartford, et al. 2023. “GFlowNets for AI-Driven Scientific Discovery.” Digital Discovery.