A Theory of Non-acyclic Generative Flow Networks
DOI:
https://doi.org/10.1609/aaai.v38i10.28989

Keywords:
ML: Deep Generative Models & Autoencoders, ML: Deep Learning Theory, ML: Graph-based Machine Learning, ML: Multimodal Learning, ML: Online Learning & Bandits, ML: Optimization, ML: Reinforcement Learning, SO: Heuristic Search, SO: Mixed Discrete/Continuous Search, SO: Non-convex Optimization, SO: Other Foundations of Search & Optimization

Abstract
GFlowNets are a novel flow-based method for learning a stochastic policy that generates objects through a sequence of actions, with probability proportional to a given positive reward. We contribute to relaxing the hypotheses that limit the application range of GFlowNets, in particular acyclicity. To this end, we extend the theory of GFlowNets to measurable spaces, which includes continuous state spaces without cycle restrictions, and we generalize the notion of cycles to this setting. We show that the losses used so far push flows to get stuck in cycles, and we define a family of losses that solves this issue. Experiments on graphs and on continuous tasks validate these principles.
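As background for the acyclicity hypothesis the abstract refers to, the sketch below recalls the standard flow-matching constraints of the original acyclic GFlowNet setting, under which terminal objects are sampled with probability proportional to the reward. The notation (edge flow F, reward R, sink state s_f, forward policy P_F) is assumed here for illustration and does not appear on this page.

```latex
% Background sketch: standard acyclic GFlowNet constraints (assumed notation,
% not taken from this page). F is an edge flow, R a positive reward,
% s_f the sink state, and P_F the forward sampling policy induced by F.
\begin{align*}
  \sum_{s' :\, s' \rightarrow s} F(s' \rightarrow s)
    &= \sum_{s'' :\, s \rightarrow s''} F(s \rightarrow s'')
    && \text{flow matching at every interior state } s,\\
  F(x \rightarrow s_f) &= R(x)
    && \text{reward matching at every terminal object } x,\\
  P_F(s'' \mid s)
    &= \frac{F(s \rightarrow s'')}{\sum_{\tilde{s} :\, s \rightarrow \tilde{s}} F(s \rightarrow \tilde{s})}
    && \text{induced forward policy.}
\end{align*}
```

On an acyclic state graph these constraints make the probability of terminating at x proportional to R(x); the paper studies how this theory, and the losses enforcing these constraints, must change when cycles are allowed.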
Published
2024-03-24
How to Cite
Brunswic, L., Li, Y., Xu, Y., Feng, Y., Jui, S., & Ma, L. (2024). A Theory of Non-acyclic Generative Flow Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 38(10), 11124-11131. https://doi.org/10.1609/aaai.v38i10.28989
Section
AAAI Technical Track on Machine Learning I