

Dropout is a regularization technique for neural network models proposed by Srivastava et al. In their 2014 paper Dropout: A Simple Way to Prevent Neural Networks from Overfitting, they describe Dropout as a technique that randomly selects neurons to be ignored during training. Neurons are "dropped out" at random, meaning that their contribution to the activation of downstream neurons is temporarily removed on the forward pass, and no weight updates are applied to them on the backward pass.
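The idea above can be sketched in a few lines of NumPy. This is a minimal illustration using the common "inverted dropout" variant, which rescales the surviving activations during training so that no rescaling is needed at test time (the original paper instead scales the weights at test time); the function name and parameters are illustrative, not from the paper.

```python
import numpy as np

def dropout_forward(activations, p_drop=0.5, training=True, rng=None):
    """Inverted dropout: zero each activation independently with
    probability p_drop during training, scaling survivors by
    1 / (1 - p_drop) so the expected activation is unchanged."""
    if not training or p_drop == 0.0:
        return activations  # at test time, dropout is a no-op
    rng = rng or np.random.default_rng()
    # Each neuron is kept independently with probability (1 - p_drop).
    mask = rng.random(activations.shape) >= p_drop
    return activations * mask / (1.0 - p_drop)
```

During training, the random mask changes on every forward pass, so a different "thinned" sub-network is trained each time; at test time the full network is used unchanged.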

Made by AI Summer Internship ☀️