
class Num::NN::DropoutLayer(T) inherits Num::NN::Layer #

Dilution (also called Dropout) is a regularization technique for reducing overfitting in artificial neural networks by preventing complex co-adaptations on training data. It is an efficient way of performing model averaging with neural networks. The term dilution refers to the thinning of the weights. The term dropout refers to randomly "dropping out", or omitting, units (both hidden and visible) during the training process of a neural network. Both the thinning of weights and dropping out units trigger the same type of regularization, and often the term dropout is used when referring to the dilution of weights.
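As a rough illustration of the mechanism, the sketch below applies inverted dropout to a plain Crystal array: each value is zeroed with probability prob, and the survivors are rescaled so that the expected sum is preserved. This is a conceptual example only, not the library's implementation, whose exact scaling convention may differ.

```crystal
# Conceptual sketch of inverted dropout on a plain Crystal array.
prob = 0.5_f32
values = [0.2_f32, 1.5_f32, -0.7_f32, 0.9_f32]

diluted = values.map do |v|
  if Random.rand < prob
    0.0_f32              # unit is "dropped out" for this training pass
  else
    v / (1.0_f32 - prob) # survivors are rescaled so the expected sum is preserved
  end
end
```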

Constructors#

.new(context : Num::Grad::Context(T), output_shape : Array(Int32), prob = 0.5_f32) #

Initialize a dropout layer in a Num::NN::Network(T)

Arguments#
  • context : Num::Grad::Context(T) - Context associated with the network, used only to determine the layer's generic type
  • output_shape : Array(Int32) - Output shape of the layer, cached and returned by #output_shape
  • prob : Float32 - Probability that a given value is dropped during a forward pass
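A minimal construction sketch, under the assumption that the generic type T is the context's tensor type (here a CPU-backed Tensor(Float32, CPU(Float32))). In practice layers are usually created through a Num::NN::Network rather than constructed directly.

```crystal
require "num"

# Assumption: T is the context's tensor type, here a CPU-backed Float32 tensor.
ctx = Num::Grad::Context(Tensor(Float32, CPU(Float32))).new

# Dropout layer whose cached output shape is [4] and which drops values with
# probability 0.5 on each forward pass.
layer = Num::NN::DropoutLayer(Tensor(Float32, CPU(Float32))).new(ctx, [4], prob: 0.5_f32)
```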

Methods#

#forward(input : Num::Grad::Variable(T)) : Num::Grad::Variable(T) #

Computes the forward pass of this layer within a Num::NN::Network. Dropout randomly zeroes a fraction of the values in the input variable and rescales the remaining values according to the probability of removal.

Arguments#
  • input : Num::Grad::Variable(T) - Variable to apply dropout to
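Continuing the construction sketch from the constructor section, the snippet below runs a small batch through the layer. The tensor constructor used here and the exact masking behavior are assumptions for illustration, not verified details of the implementation.

```crystal
# `ctx` and `layer` come from the construction sketch above.
# Build a batch of two samples with four features each.
input = ctx.variable(Tensor(Float32, CPU(Float32)).new([2, 4]) { |i| i.to_f32 })

output = layer.forward(input)
# During training, roughly `prob` of the entries in `output` are zeroed and the
# surviving entries are rescaled; `layer.output_shape` still reports [4].
```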

#output_shape : Array(Int32) #

Returns the output shape cached when the layer was constructed.