PuSH - Publication Server of Helmholtz Zentrum München

Sharma, M.*; Rainforth, T.*; Teh, Y.W.*; Fortuin, V.

Incorporating unlabelled data into Bayesian neural networks.

Transactions on Machine Learning Research 2024, accepted (2024)
Postprint
Conventional Bayesian Neural Networks (BNNs) are unable to leverage unlabelled data to improve their predictions. To overcome this limitation, we introduce Self-Supervised Bayesian Neural Networks, which use unlabelled data to learn models with suitable prior predictive distributions. This is achieved by leveraging contrastive pretraining techniques and optimising a variational lower bound. We then show that the prior predictive distributions of self-supervised BNNs capture problem semantics better than conventional BNN priors. In turn, our approach offers improved predictive performance over conventional BNNs, especially in low-budget regimes.
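The abstract describes a two-stage recipe: contrastive pretraining on unlabelled data to obtain representations that shape the prior predictive distribution, followed by variational Bayesian inference on the labelled data. The sketch below is an illustrative toy example of that general pattern, not the authors' implementation: the encoder architecture, the noise-based "augmentations", the SimCLR-style NT-Xent loss, and the mean-field Gaussian posterior over a last-layer classifier are all assumptions made for this example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical encoder and projection head; any backbone producing features would do.
encoder = nn.Sequential(nn.Flatten(), nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 128))
projection = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 32))

def nt_xent_loss(z1, z2, temperature=0.5):
    """SimCLR-style contrastive loss between two augmented views of the same batch."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)        # (2N, d), unit-norm rows
    sim = z @ z.t() / temperature                              # pairwise cosine similarities
    n = z1.shape[0]
    sim = sim.masked_fill(torch.eye(2 * n, dtype=torch.bool), float("-inf"))  # drop self-pairs
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)])  # positive of i is i+n and vice versa
    return F.cross_entropy(sim, targets)

# --- Stage 1: contrastive pretraining on unlabelled data ---
opt = torch.optim.Adam(list(encoder.parameters()) + list(projection.parameters()), lr=1e-3)
x = torch.rand(32, 1, 28, 28)                                  # stand-in for an unlabelled batch
view1 = x + 0.1 * torch.randn_like(x)                          # toy "augmentations" (noise only)
view2 = x + 0.1 * torch.randn_like(x)
loss = nt_xent_loss(projection(encoder(view1)), projection(encoder(view2)))
loss.backward()
opt.step()
opt.zero_grad()

# --- Stage 2: variational Bayesian inference over a last-layer classifier on labelled data ---
num_classes, d = 10, 128
w_mu = nn.Parameter(torch.zeros(d, num_classes))               # mean-field Gaussian posterior q(w)
w_rho = nn.Parameter(torch.full((d, num_classes), -3.0))       # softplus(rho) = posterior std

def negative_elbo(features, labels, num_samples=4, prior_std=1.0):
    std = F.softplus(w_rho)
    nll = 0.0
    for _ in range(num_samples):
        w = w_mu + std * torch.randn_like(std)                 # reparameterised weight sample
        nll = nll + F.cross_entropy(features @ w, labels)
    nll = nll / num_samples
    # Closed-form KL(q(w) || N(0, prior_std^2 I)); scaled per batch as a simplification.
    kl = (torch.log(prior_std / std) + (std**2 + w_mu**2) / (2 * prior_std**2) - 0.5).sum()
    return nll + kl / features.shape[0]

xl, yl = torch.rand(16, 1, 28, 28), torch.randint(0, 10, (16,))  # small labelled batch
opt2 = torch.optim.Adam([w_mu, w_rho], lr=1e-3)
loss = negative_elbo(encoder(xl).detach(), yl)
loss.backward()
opt2.step()
opt2.zero_grad()
```

In this toy setup the pretrained encoder fixes the feature space, so the prior and posterior over the classifier weights act on semantically structured representations; the paper's actual variational lower bound and prior construction differ in the details.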
Publication type: Article: Journal article
Document type: Review
Corresponding Author:
ISSN (print) / ISBN: 2835-8856
e-ISSN: 2835-8856
Source details: Volume: 2024, Issue: –, Pages: –, Article Number: –, Supplement: –
Publisher: Journal of Machine Learning Research Inc.
Non-patent literature: Publications
Reviewing status: Peer reviewed