Bayesian Deep Learning
Using Bayesian inference in deep learning
The idea behind a Bayesian Neural Network is to model the network's weights as a distribution conditioned on the training data, $p(w \mid D)$, instead of as deterministic point estimates. By placing a prior over the weights, e.g. a zero-mean Gaussian $p(w) = \mathcal{N}(0, \sigma^2 I)$, network training can be interpreted as determining a posterior over the weights given the training data: $p(w \mid D) \propto p(D \mid w)\, p(w)$. However, evaluating this posterior exactly is intractable for deep networks without approximation techniques.
We can apply Bayes' rule to neural networks to obtain the probability distribution over the network weights, $p(w \mid D)$, given the training data, $D$.
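As a minimal sketch of the idea, the posterior $p(w \mid D) \propto p(D \mid w)\, p(w)$ can be computed by brute-force grid approximation for a toy one-parameter "network" $y = w x + \text{noise}$ (all data, noise scales, and the $\mathcal{N}(0, 1)$ prior here are illustrative assumptions; for a real deep network this enumeration is exactly what is intractable):

```python
import numpy as np

# Toy one-parameter model y = w * x + noise; grid-approximate
# the posterior p(w | D) ∝ p(D | w) p(w) over a 1-D grid of w values.
rng = np.random.default_rng(0)
true_w = 2.0
x = rng.normal(size=20)
y = true_w * x + rng.normal(scale=0.5, size=20)

w_grid = np.linspace(-5.0, 5.0, 1001)
dw = w_grid[1] - w_grid[0]

log_prior = -0.5 * w_grid**2                         # N(0, 1) prior on w
resid = y[None, :] - w_grid[:, None] * x[None, :]    # residuals for each w
log_lik = -0.5 * np.sum(resid**2, axis=1) / 0.5**2   # Gaussian likelihood, sigma = 0.5

log_post = log_prior + log_lik
post = np.exp(log_post - log_post.max())             # subtract max for stability
post /= post.sum() * dw                              # normalize to a density

w_mean = np.sum(w_grid * post) * dw                  # posterior mean of w
print(f"posterior mean of w ≈ {w_mean:.2f}")
```

With only one parameter the normalizing constant $p(D)$ is just a sum over the grid; the whole difficulty of Bayesian deep learning is that this sum becomes an integral over millions of weights.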
When you add L2 regularization (weight decay), you are implicitly assuming that the weights follow a zero-centered Gaussian distribution. Hence, weight decay is a way to incorporate a prior.
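To see the weight-decay/Gaussian-prior correspondence concretely, here is a hedged sketch for linear regression, where MAP estimation under a zero-mean Gaussian prior $\mathcal{N}(0, \sigma_p^2 I)$ has a closed form and coincides with ridge regression with penalty $\lambda = \sigma^2 / \sigma_p^2$ (the data, noise variance, and prior variance below are illustrative assumptions):

```python
import numpy as np

# MAP under a zero-mean Gaussian prior on w == L2-regularized least squares.
# Negative log posterior: ||y - Xw||^2 / (2 sigma^2) + ||w||^2 / (2 sigma_p^2),
# minimized in closed form by (X^T X + lam I)^{-1} X^T y with lam = sigma^2 / sigma_p^2.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + rng.normal(scale=1.0, size=50)

sigma2 = 1.0     # assumed observation-noise variance
sigma_p2 = 10.0  # assumed prior variance on the weights
lam = sigma2 / sigma_p2  # the induced weight-decay coefficient

w_map = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)
print(w_map)
```

Note that the MAP estimate is still a single point, not a distribution; weight decay recovers the prior's regularizing effect but none of the posterior uncertainty.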
NeurIPS 2019 talk - see at 19:00, interesting