class Num::NN::EluLayer(T)
inherits Num::NN::Layer
Exponential Linear Unit, widely known as ELU, is an activation function that tends to converge cost to zero faster and produce more accurate results. Unlike other activation functions, ELU has an extra alpha constant, which should be a positive number.
ELU is very similar to ReLU except for negative inputs: both take the form of the identity function for non-negative inputs. For negative inputs, ELU smoothly saturates toward -α, whereas ReLU sharply cuts off at zero.
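As an illustration of the definition above, the following scalar sketch (not the library's tensor implementation) applies the ELU rule with the same default alpha used by this layer's constructor:

```crystal
# Illustrative scalar ELU; the layer applies the same rule elementwise
# over tensors. The 0.01 default mirrors this constructor's alpha default.
def elu(x : Float64, alpha : Float64 = 0.01) : Float64
  x >= 0 ? x : alpha * (Math.exp(x) - 1)
end

puts elu(2.0)  # => 2.0 (identity for non-negative inputs)
puts elu(-5.0) # => about -0.0099, smoothly approaching -alpha
```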
Constructors#
.new(context : Num::Grad::Context(T), output_shape : Array(Int32), alpha : Float32 | Float64 = 0.01)
Initializes an ELU activation layer as part of a Num::NN::Network
Arguments#
- context : Num::Grad::Context(T) - Context of the Num::NN::Network, used only to determine the generic type of the Num::NN::Layer(T)
- output_shape : Array(Int32) - The shape of the output of the layer
- alpha : Float - Scale for the negative factor
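A minimal construction sketch follows, assuming T is the Tensor(Float32, CPU(Float32)) type commonly used with num.cr; in practice this layer is usually added while building a Num::NN::Network rather than constructed directly:

```crystal
require "num"

# Assumption: T = Tensor(Float32, CPU(Float32)); swap in your network's tensor type.
ctx = Num::Grad::Context(Tensor(Float32, CPU(Float32))).new

# An ELU layer producing outputs of shape [3], with the default alpha of 0.01.
layer = Num::NN::EluLayer(Tensor(Float32, CPU(Float32))).new(ctx, [3], 0.01)
```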
Methods#
#forward(input : Num::Grad::Variable(T)) : Num::Grad::Variable(T)
Computes a forward pass through an ELU layer.
Arguments#
- input : Num::Grad::Variable(T) - Variable to activate
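Continuing the sketch from the constructor section, a forward pass might look like this; ctx.variable and the block form of Tensor.new are assumed from num.cr's general tensor and autograd APIs:

```crystal
# Sketch: wrap a tensor in a Num::Grad::Variable and activate it through the layer.
t = Tensor(Float32, CPU(Float32)).new([2, 3]) { |i| i.to_f32 - 3 }
x = ctx.variable(t)
y = layer.forward(x) # Num::Grad::Variable(T) with ELU applied elementwise
```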