An NNAISENSE research team introduces a novel class of generative models known as Bayesian Flow Networks (BFNs). BFNs combine the power of Bayesian inference with neural networks in an iterative modelling process, applying successfully to continuous, discretized and discrete data while maintaining competitive performance.
In a new paper MAPTree: Beating “Optimal” Decision Trees with Bayesian Decision Trees, a Stanford University research team introduces MAPTree, an algorithm that recovers the maximum a posteriori tree of the Bayesian Classification and Regression Trees (BCART) posterior, achieving comparable or better performance with significantly smaller and faster trees.
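The BCART posterior that MAPTree searches scores candidate trees by how well each leaf explains its labels once the leaf's class probability is integrated out. As a minimal sketch (not the paper's implementation), the closed-form log marginal likelihood of a single leaf under a Beta-Bernoulli model looks like this; the function name and default Beta(1, 1) prior are illustrative assumptions:

```python
import math

def log_leaf_marginal(n_pos, n_neg, alpha=1.0, beta=1.0):
    """Log marginal likelihood of a leaf's binary labels under a
    Beta(alpha, beta)-Bernoulli model, the standard building block of
    BCART-style tree posteriors. The leaf's unknown class probability
    is integrated out in closed form via Beta-function ratios."""
    return (math.lgamma(alpha + beta) - math.lgamma(alpha) - math.lgamma(beta)
            + math.lgamma(alpha + n_pos) + math.lgamma(beta + n_neg)
            - math.lgamma(alpha + beta + n_pos + n_neg))

# A purer leaf scores higher than a mixed leaf of the same size,
# which is what steers the posterior toward informative splits:
print(log_leaf_marginal(9, 1) > log_leaf_marginal(5, 5))  # True
```

Summing such leaf scores (plus a prior over tree structures) gives the posterior that MAPTree maximizes exactly, rather than greedily as in classic CART.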
In a new paper Bayesian Flow Networks, the NNAISENSE research team presents Bayesian Flow Networks (BFNs), a novel family of generative models that operates on the parameters of the data distribution rather than on noisy data itself, which provides an effective way to handle discrete data.
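Operating on distribution parameters means that each noisy observation updates a mean and precision in closed form rather than producing a corrupted sample. A minimal sketch of this idea for continuous data, using the standard conjugate Gaussian update (the function name and the toy loop are illustrative, not the paper's code):

```python
import numpy as np

def bayesian_update(mu, rho, y, alpha):
    """Closed-form Bayesian update of a Gaussian input distribution
    N(x | mu, 1/rho) after observing a noisy sample y ~ N(x, 1/alpha).
    The new mean is a precision-weighted average of prior mean and
    observation; precisions simply add."""
    rho_new = rho + alpha
    mu_new = (rho * mu + alpha * y) / rho_new
    return mu_new, rho_new

# Repeated noisy observations of x = 1.0 pull the mean toward the data
# while the precision grows deterministically:
rng = np.random.default_rng(0)
mu, rho, alpha, x = 0.0, 1.0, 4.0, 1.0
for _ in range(50):
    y = x + rng.normal(scale=alpha ** -0.5)
    mu, rho = bayesian_update(mu, rho, y, alpha)
print(mu, rho)  # rho is exactly 1 + 50 * alpha = 201.0; mu settles near 1.0
```

Because the parameters evolve smoothly even when the underlying data is discrete, a network conditioned on them receives a continuous, fully differentiable input, which is what makes the approach attractive for discrete domains.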
In the new paper Laplace Redux — Effortless Bayesian Deep Learning, a research team from the University of Cambridge, University of Tübingen, ETH Zurich and DeepMind conducts extensive experiments demonstrating that the Laplace approximation (LA) is a simple and cost-efficient yet competitive method for approximate posterior inference in Bayesian deep learning.
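The idea behind the Laplace approximation is to fit a Gaussian to the posterior at its mode: find the MAP parameters, then use the curvature (Hessian) of the negative log posterior there as the Gaussian's precision. A minimal one-dimensional sketch under toy assumptions (a Bernoulli likelihood with a Gaussian prior standing in for a network's loss surface; function names and finite-difference derivatives are illustrative, not the paper's library):

```python
import numpy as np

def neg_log_posterior(theta, k, n, prior_var=10.0):
    """Negative log posterior for a Bernoulli probability parameterized
    by a logit theta (k successes in n trials), with a Gaussian prior."""
    p = 1.0 / (1.0 + np.exp(-theta))
    nll = -(k * np.log(p) + (n - k) * np.log(1.0 - p))
    return nll + theta ** 2 / (2.0 * prior_var)

def laplace_approximation(f, theta0, steps=200, lr=0.1, eps=1e-4):
    """Fit a Gaussian N(theta_map, H^{-1}) to exp(-f): locate the MAP by
    gradient descent with finite-difference gradients, then take the
    second derivative at the mode as the Gaussian's precision."""
    theta = theta0
    for _ in range(steps):
        grad = (f(theta + eps) - f(theta - eps)) / (2 * eps)
        theta -= lr * grad
    hess = (f(theta + eps) - 2 * f(theta) + f(theta - eps)) / eps ** 2
    return theta, 1.0 / hess  # mean and variance of the Gaussian fit

# 7 successes in 10 trials: the fitted Gaussian sits near logit(0.7),
# with variance shrunk by both the data and the prior.
mean, var = laplace_approximation(lambda t: neg_log_posterior(t, k=7, n=10), 0.0)
```

The appeal demonstrated in the paper is that this post-hoc step reuses an already-trained (MAP) model, so uncertainty estimates come at little extra cost compared with methods that change training itself.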