LayerNorm< InputDataType, OutputDataType > Class Template Reference

Declaration of the Layer Normalization class. More...

Public Member Functions

 LayerNorm ()
 Create the LayerNorm object. More...

 
 LayerNorm (const size_t size, const double eps=1e-8)
 Create the LayerNorm object for a specified number of input units. More...

 
template<typename eT>
void Backward (const arma::Mat< eT > &input, const arma::Mat< eT > &gy, arma::Mat< eT > &g)
 Backward pass through the layer. More...

 
OutputDataType const & Delta () const
 Get the delta. More...

 
OutputDataType & Delta ()
 Modify the delta. More...

 
double Epsilon () const
 Get the value of epsilon. More...

 
template<typename eT>
void Forward (const arma::Mat< eT > &input, arma::Mat< eT > &output)
 Forward pass of Layer Normalization. More...

 
template<typename eT>
void Gradient (const arma::Mat< eT > &input, const arma::Mat< eT > &error, arma::Mat< eT > &gradient)
 Calculate the gradient using the output delta and the input activations. More...

 
OutputDataType const & Gradient () const
 Get the gradient. More...

 
OutputDataType & Gradient ()
 Modify the gradient. More...

 
size_t InputShape () const
 Get the shape of the input. More...

 
size_t InSize () const
 Get the number of input units. More...

 
OutputDataType Mean ()
 Get the mean computed across a single training data point. More...

 
OutputDataType const & OutputParameter () const
 Get the output parameter. More...

 
OutputDataType & OutputParameter ()
 Modify the output parameter. More...

 
OutputDataType const & Parameters () const
 Get the parameters. More...

 
OutputDataType & Parameters ()
 Modify the parameters. More...

 
void Reset ()
 Reset the layer parameters. More...

 
template<typename Archive>
void serialize (Archive &ar, const uint32_t)
 Serialize the layer. More...

 
OutputDataType Variance ()
 Get the variance computed across a single training data point. More...

 

Detailed Description


template<typename InputDataType = arma::mat, typename OutputDataType = arma::mat>

class mlpack::ann::LayerNorm< InputDataType, OutputDataType >

Declaration of the Layer Normalization class.

The layer transforms the input data to zero mean and unit variance, then scales and shifts it by the learnable parameters gamma and beta, respectively, over a single training data point. These parameters are learned by the network. Layer Normalization differs from Batch Normalization in that normalization is performed for each individual training case, and the mean and standard deviation are computed across the layer dimensions rather than across the batch.
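
Concretely, for a single data point x, with mean \mu and variance \sigma^2 computed across the layer's dimensions, the transform described above can be written as (where \epsilon is the stability term added to the variance, and \gamma and \beta are the learned scale and shift):

y = \gamma \odot \frac{x - \mu}{\sqrt{\sigma^{2} + \epsilon}} + \beta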

For more information, refer to the following papers:

@article{Ba16,
  author  = {Jimmy Lei Ba and Jamie Ryan Kiros and Geoffrey E. Hinton},
  title   = {Layer Normalization},
  journal = {CoRR},
  volume  = {abs/1607.06450},
  year    = {2016},
  url     = {http://arxiv.org/abs/1607.06450},
  eprint  = {1607.06450},
}

@article{Ioffe15,
  author  = {Sergey Ioffe and Christian Szegedy},
  title   = {Batch Normalization: Accelerating Deep Network Training by
             Reducing Internal Covariate Shift},
  journal = {CoRR},
  volume  = {abs/1502.03167},
  year    = {2015},
  url     = {http://arxiv.org/abs/1502.03167},
  eprint  = {1502.03167},
}
Template Parameters
    InputDataType   Type of the input data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).
    OutputDataType  Type of the output data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).

Definition at line 65 of file layer_norm.hpp.
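
As a usage sketch (assuming the mlpack 3.x ANN API and hypothetical layer sizes), a LayerNorm layer is typically placed between a linear layer and its activation inside an FFN:

#include <mlpack/core.hpp>
#include <mlpack/methods/ann/ffn.hpp>
#include <mlpack/methods/ann/layer/layer.hpp>

using namespace mlpack::ann;

// Hypothetical network: normalize the 64 hidden units of every data point.
FFN<> model;
model.Add<Linear<>>(10, 64);   // 10 inputs -> 64 hidden units
model.Add<LayerNorm<>>(64);    // per-data-point normalization across the 64 units
model.Add<ReLULayer<>>();
model.Add<Linear<>>(64, 3);
model.Add<LogSoftMax<>>();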

Constructor & Destructor Documentation

◆ LayerNorm() [1/2]

LayerNorm ( )

Create the LayerNorm object.

◆ LayerNorm() [2/2]

LayerNorm (const size_t size, const double eps = 1e-8)

Create the LayerNorm object for a specified number of input units.

Parameters
    size  The number of input units.
    eps   The epsilon added to the variance to ensure numerical stability.
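
For instance (a minimal sketch with hypothetical values), a layer normalizing 200 input units with a larger epsilon could be constructed as:

// Hypothetical: 200 input units, epsilon raised to 1e-5.
mlpack::ann::LayerNorm<> layerNorm(200, 1e-5);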

Member Function Documentation

◆ Backward()

void Backward (const arma::Mat< eT > & input, const arma::Mat< eT > & gy, arma::Mat< eT > & g)

Backward pass through the layer.

Parameters
    input  The input activations.
    gy     The backpropagated error.
    g      The calculated gradient.

◆ Delta() [1/2]

OutputDataType const& Delta ( ) const
inline

Get the delta.

Definition at line 130 of file layer_norm.hpp.

◆ Delta() [2/2]

OutputDataType& Delta ( )
inline

Modify the delta.

Definition at line 132 of file layer_norm.hpp.

◆ Epsilon()

double Epsilon ( ) const
inline

Get the value of epsilon.

Definition at line 149 of file layer_norm.hpp.

◆ Forward()

void Forward (const arma::Mat< eT > & input, arma::Mat< eT > & output)

Forward pass of Layer Normalization.

Transforms the input data into zero mean and unit variance, scales the data by a factor gamma and shifts it by beta.

Parameters
    input   Input data for the layer.
    output  Resulting output activations.
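
A minimal standalone sketch (hypothetical sizes; when the layer lives inside an FFN, mlpack allocates its parameters and calls Reset() itself):

#include <mlpack/core.hpp>
#include <mlpack/methods/ann/layer/layer_norm.hpp>

using namespace mlpack::ann;

LayerNorm<> layer(5);                       // 5 input units per data point
layer.Reset();                              // set up the scale (gamma) and shift (beta) parameters
arma::mat input(5, 3, arma::fill::randn);   // 3 data points, one per column
arma::mat output;
layer.Forward(input, output);               // each column is normalized, then scaled and shifted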

◆ Gradient() [1/3]

void Gradient (const arma::Mat< eT > & input, const arma::Mat< eT > & error, arma::Mat< eT > & gradient)

Calculate the gradient using the output delta and the input activations.

Parameters
    input     The input activations.
    error     The calculated error.
    gradient  The calculated gradient.
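
Continuing the standalone sketch from Forward() above (gy is a hypothetical upstream error of the same shape as the output):

arma::mat gy(arma::size(output), arma::fill::randn);  // backpropagated error (hypothetical)
arma::mat g, gradient;
layer.Backward(input, gy, g);          // error propagated back to the layer's input
layer.Gradient(input, gy, gradient);   // gradient with respect to gamma and beta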

◆ Gradient() [2/3]

OutputDataType const& Gradient ( ) const
inline

Get the gradient.

Definition at line 135 of file layer_norm.hpp.

◆ Gradient() [3/3]

OutputDataType& Gradient ( )
inline

Modify the gradient.

Definition at line 137 of file layer_norm.hpp.

◆ InputShape()

size_t InputShape ( ) const
inline

Get the shape of the input.

Definition at line 152 of file layer_norm.hpp.

References LayerNorm< InputDataType, OutputDataType >::serialize().

◆ InSize()

size_t InSize ( ) const
inline

Get the number of input units.

Definition at line 146 of file layer_norm.hpp.

◆ Mean()

OutputDataType Mean ( )
inline

Get the mean computed across a single training data point.

Definition at line 140 of file layer_norm.hpp.

◆ OutputParameter() [1/2]

OutputDataType const& OutputParameter ( ) const
inline

Get the output parameter.

Definition at line 125 of file layer_norm.hpp.

◆ OutputParameter() [2/2]

OutputDataType& OutputParameter ( )
inline

Modify the output parameter.

Definition at line 127 of file layer_norm.hpp.

◆ Parameters() [1/2]

OutputDataType const& Parameters ( ) const
inline

Get the parameters.

Definition at line 120 of file layer_norm.hpp.

◆ Parameters() [2/2]

OutputDataType& Parameters ( )
inline

Modify the parameters.

Definition at line 122 of file layer_norm.hpp.

◆ Reset()

void Reset ( )

Reset the layer parameters.

◆ serialize()

void serialize (Archive & ar, const uint32_t)

Serialize the layer.

Referenced by LayerNorm< InputDataType, OutputDataType >::InputShape().
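
Serialization is usually driven through the enclosing model rather than the layer itself; a hedged sketch using mlpack's data::Save with a hypothetical file name:

#include <mlpack/core.hpp>
#include <mlpack/methods/ann/ffn.hpp>

// 'model' is an FFN<> containing a LayerNorm<> layer, as sketched earlier.
mlpack::ann::FFN<> model;
// ... add layers and train ...
mlpack::data::Save("layer_norm_model.bin", "model", model, false);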

◆ Variance()

OutputDataType Variance ( )
inline

Get the variance computed across a single training data point.

Definition at line 143 of file layer_norm.hpp.


The documentation for this class was generated from the following file:
  • /home/ryan/src/mlpack.org/_src/mlpack-git/src/mlpack/methods/ann/layer/layer_norm.hpp