The ELU activation function.
Public Member Functions

ELU ()
    Create the ELU object.

ELU (const double alpha)
    Create the ELU object using the specified parameter.

double const & Alpha () const
    Get the non-zero gradient.

double & Alpha ()
    Modify the non-zero gradient.

template<typename DataType>
void Backward (const DataType &input, const DataType &gy, DataType &g)
    Ordinary feed backward pass of a neural network, calculating the function f(x) by propagating x backwards through f.

OutputDataType const & Delta () const
    Get the delta.

OutputDataType & Delta ()
    Modify the delta.

bool Deterministic () const
    Get the value of the deterministic parameter.

bool & Deterministic ()
    Modify the value of the deterministic parameter.

template<typename InputType, typename OutputType>
void Forward (const InputType &input, OutputType &output)
    Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f.

double const & Lambda () const
    Get the lambda parameter.

OutputDataType const & OutputParameter () const
    Get the output parameter.

OutputDataType & OutputParameter ()
    Modify the output parameter.

template<typename Archive>
void serialize (Archive &ar, const uint32_t)
    Serialize the layer.
The ELU activation function, defined by

    f(x)  = x                    for x > 0
            alpha * (e^x - 1)    for x <= 0

    f'(x) = 1                    for x > 0
            f(x) + alpha         for x <= 0

For more information, read the following paper:

    D.-A. Clevert, T. Unterthiner, and S. Hochreiter,
    "Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)",
    2015. https://arxiv.org/abs/1511.07289

The SELU activation function is defined by

    f(x)  = lambda * x                   for x > 0
            lambda * alpha * (e^x - 1)   for x <= 0

    f'(x) = lambda                       for x > 0
            f(x) + lambda * alpha        for x <= 0

For more information, read the following paper:

    G. Klambauer, T. Unterthiner, A. Mayr, and S. Hochreiter,
    "Self-Normalizing Neural Networks", 2017. https://arxiv.org/abs/1706.02515
In the deterministic mode, there is no computation of the derivative.
Template Parameters:
    InputDataType     Type of the input data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).
    OutputDataType    Type of the output data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).
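To make the piecewise definition above concrete, here is a small standalone sketch (plain Armadillo, not mlpack code) that applies f(x) and f'(x) element-wise for an assumed alpha of 1.0:

    // Standalone sketch: apply the ELU definition element-wise (not mlpack code).
    #include <armadillo>
    #include <cmath>

    int main()
    {
      const double alpha = 1.0;
      arma::vec x = {-2.0, -0.5, 0.0, 0.5, 2.0};

      arma::vec f(x.n_elem), df(x.n_elem);
      for (arma::uword i = 0; i < x.n_elem; ++i)
      {
        // f(x)  = x for x > 0, alpha * (e^x - 1) otherwise.
        f[i]  = (x[i] > 0.0) ? x[i] : alpha * (std::exp(x[i]) - 1.0);
        // f'(x) = 1 for x > 0, f(x) + alpha otherwise.
        df[i] = (x[i] > 0.0) ? 1.0 : f[i] + alpha;
      }

      f.print("ELU f(x):");
      df.print("ELU f'(x):");
      return 0;
    }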
ELU (const double alpha)  [inline]

Create the ELU object using the specified parameter.

Parameters:
    alpha    The non-zero gradient for negative inputs.
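A minimal construction sketch follows; the header path mlpack/methods/ann/layer/elu.hpp and the mlpack::ann namespace are assumed from mlpack's 3.x layout and are not stated on this page:

    // Construction sketch (header path and namespace assumed from mlpack 3.x).
    #include <mlpack/methods/ann/layer/elu.hpp>

    using namespace mlpack::ann;

    int main()
    {
      // ELU layer with a user-specified alpha (the non-zero gradient).
      ELU<arma::mat, arma::mat> elu(0.3);

      elu.Alpha() = 0.5;             // modify alpha through the non-const accessor
      const double a = elu.Alpha();  // read it back through the const accessor
      return (a == 0.5) ? 0 : 1;
    }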
template<typename DataType>
void Backward (const DataType & input, const DataType & gy, DataType & g)  [inline]
Ordinary feed backward pass of a neural network, calculating the function f(x) by propagating x backwards through f, using the results from the feed forward pass.

Parameters:
    input    The propagated input activation f(x).
    gy       The backpropagated error.
    g        The calculated gradient.
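The sketch below shows how Backward() is typically driven; it assumes the mlpack 3.x header layout and that Forward() is called first so the backward pass can reuse its results, as described above:

    // Forward + Backward sketch (API assumed from the signatures documented here).
    #include <mlpack/methods/ann/layer/elu.hpp>

    using namespace mlpack::ann;

    int main()
    {
      ELU<arma::mat, arma::mat> elu(1.0);
      elu.Deterministic() = false;   // training mode, so the derivative is computed

      arma::mat input(3, 1, arma::fill::randn);
      arma::mat output, g;

      // Forward pass: output = f(input).
      elu.Forward(input, output);

      // Backward pass: g receives gy scaled by f'(x), using the forward results.
      arma::mat gy(3, 1, arma::fill::ones);
      elu.Backward(output, gy, g);

      return 0;
    }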
template<typename InputType, typename OutputType>
void Forward (const InputType & input, OutputType & output)  [inline]
Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f.

Parameters:
    input     Input data used for evaluating the specified function.
    output    Resulting output activation.
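A minimal forward-pass sketch, again assuming the mlpack 3.x header layout:

    // Forward-pass sketch (API assumed from the signature documented here).
    #include <mlpack/methods/ann/layer/elu.hpp>

    using namespace mlpack::ann;

    int main()
    {
      ELU<arma::mat, arma::mat> elu(1.0);

      arma::mat input = {{-1.0}, {0.0}, {2.0}};  // a single 3-element column
      arma::mat output;

      // Negative entries map to alpha * (e^x - 1); positive entries pass through.
      elu.Forward(input, output);
      output.print("ELU output:");
      return 0;
    }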
double const & Lambda () const  [inline]

Get the lambda parameter.

Definition at line 174 of file elu.hpp.
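The following sketch reads the lambda parameter; it assumes that the default constructor selects the SELU configuration and that the ELU(alpha) constructor leaves lambda at 1, which this page does not state explicitly:

    // Sketch of reading lambda (constructor behaviour assumed, see note above).
    #include <mlpack/methods/ann/layer/elu.hpp>
    #include <iostream>

    using namespace mlpack::ann;

    int main()
    {
      ELU<arma::mat, arma::mat> selu;      // default constructor
      ELU<arma::mat, arma::mat> elu(1.0);  // user-specified alpha

      std::cout << "default-constructed lambda: " << selu.Lambda() << "\n"
                << "alpha-constructed lambda:   " << elu.Lambda() << "\n";
      return 0;
    }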
template<typename Archive>
void serialize (Archive & ar, const uint32_t)
Serialize the layer.
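A serialization sketch follows; it assumes mlpack's cereal-based archives (the uint32_t version argument matches cereal's versioned serialize signature) and writes the layer to a binary file:

    // Serialization sketch, assuming cereal archives are available via mlpack.
    #include <mlpack/methods/ann/layer/elu.hpp>
    #include <cereal/archives/binary.hpp>
    #include <fstream>

    using namespace mlpack::ann;

    int main()
    {
      ELU<arma::mat, arma::mat> elu(0.7);

      std::ofstream ofs("elu_layer.bin", std::ios::binary);
      {
        cereal::BinaryOutputArchive ar(ofs);
        // An archive would normally invoke serialize() itself (e.g. ar(elu));
        // calling it directly also compiles because the member is public.
        elu.serialize(ar, 0 /* version */);
      }
      return 0;
    }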