dropout

A regularization technique in machine learning where, at each training step, a random subset of neurons is temporarily "dropped out" (set to zero), preventing the network from over-relying on any single unit, reducing overfitting, and improving generalization. At inference time all neurons are active, so activations are rescaled to keep their expected value consistent with training.
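A minimal NumPy sketch of the common "inverted dropout" variant, where survivors are scaled by 1/(1−p) during training so no rescaling is needed at inference (function name and parameters are illustrative, not from the source):

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p during training,
    scaling the survivors by 1/(1-p) to preserve the expected activation."""
    if not training or p == 0.0:
        return x  # at inference, pass activations through unchanged
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1-p
    return x * mask / (1.0 - p)

# During training, roughly half the units are zeroed (p=0.5) and the
# rest are doubled, so the mean activation stays close to the original.
rng = np.random.default_rng(0)
activations = np.ones(1000)
dropped = dropout(activations, p=0.5, rng=rng)
```

Frameworks such as PyTorch (`torch.nn.Dropout`) and Keras (`tf.keras.layers.Dropout`) implement this same inverted-dropout scheme internally.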