A Hybrid Gradient Method to Designing Bayesian Experiments for Implicit Models. (arXiv:2103.08594v1 [cs.LG])

Bayesian experimental design (BED) aims at designing an experiment that
maximizes the information gathered from the collected data. The optimal design
is usually obtained by maximizing the mutual information (MI) between the data
and the model parameters. When the analytical expression of the MI is
unavailable, e.g., for implicit models with intractable data distributions,
a neural-network-based lower bound on the MI was recently proposed, and
gradient ascent was used to maximize it. However, the approach in
Kleinegesse et al., 2020 requires a pathwise (differentiable) sampling path to
compute the gradient of the MI lower bound with respect to the design
variables, and such a sampling path is usually inaccessible for implicit
models. In this work, we propose a hybrid gradient approach that leverages
recent advances in variational MI estimators and evolution strategies (ES),
combined with black-box stochastic gradient ascent (SGA), to maximize the
MI lower bound. This allows the design process to be carried out through a
unified, scalable procedure for implicit models without sampling-path
gradients. Several experiments demonstrate that our approach significantly
improves the scalability of BED for implicit models in high-dimensional
design spaces.
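
For context, the design objective is the MI between the parameters theta and the
data y at design d. A commonly used neural lower bound is the NWJ/MINE-f form
shown below; the abstract does not name the specific bound, so this particular
choice is an assumption:

\[
I(d) \;=\; \mathbb{E}_{p(y,\theta\mid d)}\!\left[\log\frac{p(y\mid\theta,d)}{p(y\mid d)}\right]
\;\ge\; \mathbb{E}_{p(y,\theta\mid d)}\!\left[T_\psi(y,\theta)\right]
\;-\; e^{-1}\,\mathbb{E}_{p(\theta)\,p(y\mid d)}\!\left[e^{T_\psi(y,\theta)}\right],
\]

where T_\psi is a neural network (critic). The hybrid scheme maximizes the
right-hand side jointly: ordinary SGA updates \psi, while an ES estimator
supplies the gradient with respect to d, so no differentiable sampling path
through the simulator is required.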

Source: https://arxiv.org/abs/2103.08594
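
Below is a minimal sketch of one hybrid-gradient step in plain NumPy, under
stated assumptions: simulate, critic, and prior are hypothetical placeholders
for the implicit simulator, the neural critic T_psi, and the prior sampler, and
the bound and ES estimator follow standard NWJ and antithetic-perturbation
forms rather than the paper's exact implementation.

import numpy as np

# Hypothetical placeholders (assumptions, not taken from the paper):
#   prior(n, rng)           -> (n, p) array of parameter samples theta ~ p(theta)
#   simulate(theta, d, rng) -> data y from the implicit (black-box) simulator
#   critic(psi, y, theta)   -> scalar T_psi(y, theta) from the neural critic

def nwj_lower_bound(psi, d, prior, simulate, critic, n=64, rng=None):
    # Monte Carlo estimate of an NWJ-style MI lower bound at design d.
    rng = np.random.default_rng() if rng is None else rng
    theta = prior(n, rng)                        # samples from the joint
    y = [simulate(t, d, rng) for t in theta]
    joint = np.mean([critic(psi, yi, ti) for yi, ti in zip(y, theta)])
    theta_shuf = prior(n, rng)                   # product-of-marginals samples
    marg = np.mean([np.exp(critic(psi, yi, ti)) for yi, ti in zip(y, theta_shuf)])
    return joint - marg / np.e

def es_gradient_d(objective, d, sigma=0.1, pop=16, rng=None):
    # Black-box ES gradient of the bound w.r.t. the design d, using antithetic
    # Gaussian perturbations -- only function evaluations, no sampling-path gradient.
    rng = np.random.default_rng() if rng is None else rng
    grad = np.zeros_like(d)
    for _ in range(pop):
        eps = rng.standard_normal(d.shape)
        grad += (objective(d + sigma * eps) - objective(d - sigma * eps)) * eps
    return grad / (2.0 * sigma * pop)

# One outer iteration, conceptually: update the critic parameters psi by
# ordinary SGA on the bound (autodiff in practice), then update the design:
#   d = d + lr_d * es_gradient_d(lambda dd: nwj_lower_bound(psi, dd, prior, simulate, critic), d)

The split mirrors the hybrid idea in the abstract: the critic is differentiable,
so its parameters can use ordinary gradients, while the simulator is a black
box, so the design gradient is estimated purely from function evaluations.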
