Bayesian Neural Networks


Bayesian inference allows us to learn a probability distribution over possible neural networks. We can approximately solve inference with a simple modification to standard neural network tools.
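To make the idea concrete, here is a minimal sketch of what "a distribution over possible neural networks" buys you at prediction time: instead of one fixed weight vector, we sample many weight vectors from an (assumed, illustrative) Gaussian approximate posterior and average the resulting predictions. The posterior mean and standard deviation below are made-up values standing in for quantities that inference would actually learn.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy "posterior": an independent Gaussian over each weight of a
# single linear unit. These values are illustrative, not learned here.
posterior_mean = np.array([0.5, -1.2])
posterior_std = np.array([0.1, 0.3])

def predict(x, w):
    # A deterministic "network": one linear unit with a sigmoid output.
    return 1.0 / (1.0 + np.exp(-x @ w))

def bayesian_predict(x, n_samples=1000):
    # Monte Carlo estimate of the posterior predictive distribution:
    # sample a network (a weight vector), predict, then average.
    samples = rng.normal(posterior_mean, posterior_std,
                         size=(n_samples, posterior_mean.size))
    return np.mean([predict(x, w) for w in samples])

x = np.array([1.0, 1.0])
p = bayesian_predict(x)
```

Averaging over sampled networks is what makes the prediction Bayesian: inputs where the sampled networks disagree yield predictions pulled toward 0.5, signaling uncertainty.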

Because they are good at approximating functions (input-output relationships) when lots of data are available, neural networks are well suited to artificial intelligence tasks like speech recognition and image classification. Deep neural networks have a ton of parameters (typically millions in modern models), which essentially guarantees eventual overfitting: the learning algorithm can always do just a little bit better on the training set by tweaking some of the many knobs available to it. Because this flexibility makes neural networks particularly susceptible to overfitting, researchers and practitioners have come up with many extensions to the standard learning algorithm (early stopping, weight decay, and dropout, just to name a few) to reduce it.
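Two of the regularizers mentioned above can be sketched in a few lines. Assuming a toy one-layer model with squared-error loss (all names and hyperparameter values below are illustrative), dropout randomly zeroes inputs during training and weight decay adds an L2 pull toward zero at each gradient step:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy problem: one linear layer, squared-error loss.
X = rng.normal(size=(32, 10))        # 32 examples, 10 features
y = X @ rng.normal(size=10)          # targets from a random linear rule
w = np.zeros(10)
lr, weight_decay, drop_p = 0.1, 0.5, 1e-2
lr, drop_p, weight_decay = 0.1, 0.5, 1e-2

for _ in range(200):
    # Dropout: randomly zero inputs each step; the "inverted" scaling
    # by 1/(1 - drop_p) keeps the expected activation unchanged.
    mask = (rng.random(X.shape) > drop_p) / (1 - drop_p)
    Xd = X * mask
    grad = Xd.T @ (Xd @ w - y) / len(y)
    # Weight decay: the extra weight_decay * w term is the gradient of
    # an L2 penalty, shrinking weights toward zero.
    w -= lr * (grad + weight_decay * w)

train_loss = np.mean((X @ w - y) ** 2)
```

At evaluation time dropout is switched off (predictions use the full `X`), which is why the final loss is computed without a mask.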
