
class Num::NN::EluLayer(T)
inherits Num::NN::Layer #

Exponential Linear Unit, widely known as ELU, is an activation function that tends to converge the cost toward zero faster and produce more accurate results. Unlike most other activation functions, ELU has an extra alpha constant, which should be a positive number.

ELU is very similar to ReLU except for negative inputs. Both are the identity function for non-negative inputs. For negative inputs, however, ELU smoothly saturates toward -α, whereas ReLU is clamped sharply to zero.
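For reference, the standard ELU definition and its derivative (the quantity used during backpropagation) can be written as follows, where α is the layer's alpha argument:

```latex
% Standard ELU definition; alpha is the layer's alpha constant.
\mathrm{ELU}(x) =
  \begin{cases}
    x                   & x > 0 \\
    \alpha\,(e^{x} - 1) & x \le 0
  \end{cases}
\qquad
\frac{d}{dx}\,\mathrm{ELU}(x) =
  \begin{cases}
    1                        & x > 0 \\
    \mathrm{ELU}(x) + \alpha & x \le 0
  \end{cases}
```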

Constructors#

.new(context : Num::Grad::Context(T), output_shape : Array(Int32), alpha : Float32 | Float64 = 0.01) #

Initializes an ELU activation layer as part of a Num::NN::Network

Arguments#

context : Num::Grad::Context(T) - the gradient context the layer's operations are recorded against
output_shape : Array(Int32) - the expected shape of the layer's output; ELU is elementwise, so this matches the input shape
alpha : Float32 | Float64 - the positive alpha constant applied to negative inputs (defaults to 0.01)
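A minimal construction sketch, assuming a concrete tensor type of Tensor(Float64, CPU(Float64)) for T; the exact generic parameters depend on the num.cr version in use:

```crystal
require "num"

# Hedged sketch: build a gradient context and instantiate the layer directly
# with the documented signature. The concrete type used for T here is an
# assumption; any tensor type supported by Num::Grad::Context should work.
ctx = Num::Grad::Context(Tensor(Float64, CPU(Float64))).new

# output_shape [3]: ELU is applied elementwise, so the output shape matches
# the input shape. alpha defaults to 0.01 when omitted.
layer = Num::NN::EluLayer(Tensor(Float64, CPU(Float64))).new(ctx, [3], 0.01)
```

In typical use the layer is not constructed by hand but added through a Num::NN::Network builder block; the direct call above simply mirrors the documented signature.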

Methods#

#forward(input : Num::Grad::Variable(T)) : Num::Grad::Variable(T) #

Computes a forward pass through an ELU layer.

Arguments#

input : Num::Grad::Variable(T) - the variable to apply the ELU activation to
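A hedged usage sketch, continuing from the constructor example above; ctx.variable, Array#to_tensor, and Variable#value are assumed to follow the usual Num::Grad API:

```crystal
# Wrap an input tensor in a Num::Grad::Variable so the forward pass is
# recorded for backpropagation.
x = ctx.variable([[0.5, -1.2, 2.0]].to_tensor)

# Apply the ELU activation. Positive entries pass through unchanged;
# negative entries become alpha * (exp(value) - 1), e.g. -1.2 maps to
# roughly -0.0070 with the default alpha of 0.01.
y = layer.forward(x)
puts y.value
```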

#output_shape : Array(Int32) #

Returns the output shape of this layer, as provided to the constructor. Because ELU is applied elementwise, this matches the shape of the layer's input.