RNN< OutputLayerType, InitializationRuleType, CustomLayers > Class Template Reference

Implementation of a standard recurrent neural network container. More...

Inheritance diagram for RNN< OutputLayerType, InitializationRuleType, CustomLayers >:

Public Types

using NetworkType = RNN< OutputLayerType, InitializationRuleType, CustomLayers... >
 Convenience typedef for the internal model construction. More...

 

Public Member Functions

 RNN (const size_t rho, const bool single=false, OutputLayerType outputLayer=OutputLayerType(), InitializationRuleType initializeRule=InitializationRuleType())
 Create the RNN object. More...

 
 RNN (const RNN &)
 Copy constructor. More...

 
 RNN (RNN &&)
 Move constructor. More...

 
 ~RNN ()
 Destructor to release allocated memory. More...

 
template<class LayerType , class... Args>
void Add (Args... args)
 
void Add (LayerTypes< CustomLayers... > layer)
 
double Evaluate (const arma::mat &parameters, const size_t begin, const size_t batchSize, const bool deterministic)
 Evaluate the recurrent neural network with the given parameters. More...

 
double Evaluate (const arma::mat &parameters, const size_t begin, const size_t batchSize)
 Evaluate the recurrent neural network with the given parameters. More...

 
template<typename GradType>
double EvaluateWithGradient (const arma::mat &parameters, const size_t begin, GradType &gradient, const size_t batchSize)
 Evaluate the recurrent neural network with the given parameters. More...

 
void Gradient (const arma::mat &parameters, const size_t begin, arma::mat &gradient, const size_t batchSize)
 Evaluate the gradient of the recurrent neural network with the given parameters, and with respect to only one point in the dataset. More...

 
size_t NumFunctions () const
 Return the number of separable functions (the number of predictor points). More...

 
RNN & operator= (const RNN &)
 Copy assignment operator. More...

 
RNN & operator= (RNN &&)
 Move assignment operator. More...

 
const arma::mat & Parameters () const
 Return the initial point for the optimization. More...

 
arma::mat & Parameters ()
 Modify the initial point for the optimization. More...

 
void Predict (arma::cube predictors, arma::cube &results, const size_t batchSize=256)
 Predict the responses to a given set of predictors. More...

 
const arma::cube & Predictors () const
 Get the matrix of data points (predictors). More...

 
arma::cube & Predictors ()
 Modify the matrix of data points (predictors). More...

 
void Reset ()
 Reset the state of the network. More...

 
void ResetParameters ()
 Reset the module information (weights/parameters). More...

 
const arma::cube & Responses () const
 Get the matrix of responses to the input data points. More...

 
arma::cube & Responses ()
 Modify the matrix of responses to the input data points. More...

 
const size_t & Rho () const
 Return the maximum length of backpropagation through time. More...

 
size_t & Rho ()
 Modify the maximum length of backpropagation through time. More...

 
template<typename Archive>
void serialize (Archive &ar, const uint32_t)
 Serialize the model. More...

 
void Shuffle ()
 Shuffle the order of function visitation. More...

 
template<typename OptimizerType , typename... CallbackTypes>
double Train (arma::cube predictors, arma::cube responses, OptimizerType &optimizer, CallbackTypes &&... callbacks)
 Train the recurrent neural network on the given input data using the given optimizer. More...

 
template<typename OptimizerType = ens::StandardSGD, typename... CallbackTypes>
double Train (arma::cube predictors, arma::cube responses, CallbackTypes &&... callbacks)
 Train the recurrent neural network on the given input data. More...

 
template<typename OptimizerType>
std::enable_if< HasMaxIterations< OptimizerType, size_t &(OptimizerType::*)()>::value, void >::type WarnMessageMaxIterations (OptimizerType &optimizer, size_t samples) const
 Check whether the optimizer has a MaxIterations() parameter; if it does, warn when its value is less than the number of data points in the dataset. More...

 
template<typename OptimizerType>
std::enable_if< !HasMaxIterations< OptimizerType, size_t &(OptimizerType::*)()>::value, void >::type WarnMessageMaxIterations (OptimizerType &optimizer, size_t samples) const
 Check whether the optimizer has a MaxIterations() parameter; if it does not, simply return from the function. More...

 

Detailed Description


template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization, typename... CustomLayers>
class mlpack::ann::RNN< OutputLayerType, InitializationRuleType, CustomLayers >

Implementation of a standard recurrent neural network container.

Template Parameters
OutputLayerType  The output layer type used to evaluate the network.
InitializationRuleType  Rule used to initialize the weight matrix.

Definition at line 45 of file rnn.hpp.
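The template defaults are NegativeLogLikelihood<> and RandomInitialization. As a minimal, illustrative sketch (header paths assume the mlpack 3.x layout, and the MeanSquaredError<> alternative is only an example of another output layer, not a recommendation):

    #include <mlpack/methods/ann/rnn.hpp>
    #include <mlpack/methods/ann/loss_functions/mean_squared_error.hpp>

    using namespace mlpack::ann;

    // Equivalent to RNN<NegativeLogLikelihood<>, RandomInitialization>.
    using ClassificationRNN = RNN<>;

    // The same container evaluated with a mean squared error output layer.
    using RegressionRNN = RNN<MeanSquaredError<>, RandomInitialization>;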

Member Typedef Documentation

◆ NetworkType

using NetworkType = RNN<OutputLayerType, InitializationRuleType, CustomLayers...>

Convenience typedef for the internal model construction.

Definition at line 51 of file rnn.hpp.

Constructor & Destructor Documentation

◆ RNN() [1/3]

RNN ( const size_t  rho,
const bool  single = false,
OutputLayerType  outputLayer = OutputLayerType(),
InitializationRuleType  initializeRule = InitializationRuleType() 
)

Create the RNN object.

Optionally, specify which initialization rule and performance function should be used.

If you want to pass in a parameter and discard the original parameter object, be sure to use std::move to avoid an unnecessary copy.

Parameters
rho  Maximum number of steps to backpropagate through time (BPTT).
single  Predict only the last element of the input sequence.
outputLayer  Output layer used to evaluate the network.
initializeRule  Optional instantiated InitializationRule object for initializing the network parameters.
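A minimal construction sketch under this signature (the values of rho and single are illustrative):

    // Backpropagate through at most 10 time steps and produce an output for
    // every time step of each input sequence.
    RNN<> model(10 /* rho */);

    // Produce a prediction only for the last element of each input sequence.
    RNN<> lastStepModel(10 /* rho */, true /* single */);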

◆ RNN() [2/3]

RNN ( const RNN< OutputLayerType, InitializationRuleType, CustomLayers > &  )

Copy constructor.

◆ RNN() [3/3]

RNN ( RNN< OutputLayerType, InitializationRuleType, CustomLayers > &&  )

Move constructor.

◆ ~RNN()

~RNN ( )

Destructor to release allocated memory.

Member Function Documentation

◆ Add() [1/2]

void Add ( Args...  args)
inline

Definition at line 284 of file rnn.hpp.

◆ Add() [2/2]

void Add ( LayerTypes< CustomLayers... >  layer)
inline

Definition at line 291 of file rnn.hpp.
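The first Add() overload constructs a LayerType in place from the given arguments and appends it to the network; the second appends an already-constructed layer. A sketch assuming the standard mlpack layer types Linear<>, LSTM<>, and LogSoftMax<> (all sizes are illustrative):

    const size_t rho = 10;
    RNN<> model(rho);

    model.Add<Linear<>>(1, 8);      // 1 input dimension -> 8 hidden units.
    model.Add<LSTM<>>(8, 16, rho);  // Recurrent layer unrolled over rho steps.
    model.Add<Linear<>>(16, 4);     // Project to 4 output units.
    model.Add<LogSoftMax<>>();      // Pairs with NegativeLogLikelihood<>.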

◆ Evaluate() [1/2]

double Evaluate ( const arma::mat &  parameters,
const size_t  begin,
const size_t  batchSize,
const bool  deterministic 
)

Evaluate the recurrent neural network with the given parameters.

This function is usually called by the optimizer to train the model.

Parameters
parameters  Matrix of model parameters.
begin  Index of the starting point to use for objective function evaluation.
batchSize  Number of points to be passed at a time for objective function evaluation.
deterministic  Whether to run the model in training or testing mode; note that some layers act differently in training and testing mode.

◆ Evaluate() [2/2]

double Evaluate ( const arma::mat &  parameters,
const size_t  begin,
const size_t  batchSize 
)

Evaluate the recurrent neural network with the given parameters.

This function is usually called by the optimizer to train the model. This just calls the other overload of Evaluate() with deterministic = true.

Parameters
parameters  Matrix of model parameters.
begin  Index of the starting point to use for objective function evaluation.
batchSize  Number of points to be passed at a time for objective function evaluation.

◆ EvaluateWithGradient()

double EvaluateWithGradient ( const arma::mat &  parameters,
const size_t  begin,
GradType &  gradient,
const size_t  batchSize 
)

Evaluate the recurrent neural network with the given parameters.

This function is usually called by the optimizer to train the model.

Parameters
parameters  Matrix of model parameters.
begin  Index of the starting point to use for objective function evaluation.
gradient  Matrix to output the gradient into.
batchSize  Number of points to be passed at a time for objective function evaluation.

◆ Gradient()

void Gradient ( const arma::mat &  parameters,
const size_t  begin,
arma::mat &  gradient,
const size_t  batchSize 
)

Evaluate the gradient of the recurrent neural network with the given parameters, and with respect to only one point in the dataset.

This is useful for optimizers such as SGD, which require a separable objective function.

Parameters
parameters  Matrix of the model parameters to be optimized.
begin  Index of the starting point to use for objective function gradient evaluation.
gradient  Matrix to output the gradient into.
batchSize  Number of points to be processed as a batch for objective function gradient evaluation.

Referenced by RNN< OutputLayerType, InitializationRuleType, CustomLayers... >::Predictors().

◆ NumFunctions()

size_t NumFunctions ( ) const
inline

Return the number of separable functions (the number of predictor points).

Definition at line 294 of file rnn.hpp.

◆ operator=() [1/2]

RNN& operator= ( const RNN< OutputLayerType, InitializationRuleType, CustomLayers > &  )

Copy assignment operator.

◆ operator=() [2/2]

RNN& operator= ( RNN< OutputLayerType, InitializationRuleType, CustomLayers > &&  )

Move assignment operator.

◆ Parameters() [1/2]

const arma::mat& Parameters ( ) const
inline

Return the initial point for the optimization.

Definition at line 297 of file rnn.hpp.

◆ Parameters() [2/2]

arma::mat& Parameters ( )
inline

Modify the initial point for the optimization.

Definition at line 299 of file rnn.hpp.

◆ Predict()

void Predict ( arma::cube  predictors,
arma::cube &  results,
const size_t  batchSize = 256 
)

Predict the responses to a given set of predictors.

The responses will reflect the output of the given output layer as returned by the output layer function.

If you want to pass in a parameter and discard the original parameter object, be sure to use std::move to avoid an unnecessary copy.

The format of the data should be as follows:

  • each slice should correspond to a time step
  • each column should correspond to a data point
  • each row should correspond to a dimension

So, e.g., predictors(i, j, k) is the i'th dimension of the j'th data point at time slice k. The responses will be in the same format. A usage sketch follows the parameter list below.

Parameters
predictors  Input predictors.
results  Cube to put the output predictions of the responses into.
batchSize  Number of points to predict at once.
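A prediction sketch for a trained model, assuming 1 input dimension, 100 sequences, and 10 time steps (all sizes illustrative):

    // predictors(i, j, k): dimension i of data point j at time step k.
    arma::cube predictors(1 /* dimensions */, 100 /* points */,
                          10 /* time steps */, arma::fill::randn);

    arma::cube results;
    model.Predict(predictors, results);

    // results holds one column per data point and one slice per time step.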

◆ Predictors() [1/2]

const arma::cube& Predictors ( ) const
inline

Get the matrix of data points (predictors).

Definition at line 312 of file rnn.hpp.

◆ Predictors() [2/2]

arma::cube& Predictors ( )
inline

Modify the matrix of data points (predictors).

Definition at line 314 of file rnn.hpp.

◆ Reset()

void Reset ( )

Reset the state of the network.

This ensures that all internally-held gradients are set to 0, all memory cells are reset, and the parameters matrix is the right size.

Referenced by RNN< OutputLayerType, InitializationRuleType, CustomLayers... >::Predictors().

◆ ResetParameters()

void ResetParameters ( )

Reset the module information (weights/parameters).

Referenced by RNN< OutputLayerType, InitializationRuleType, CustomLayers... >::Predictors().

◆ Responses() [1/2]

const arma::cube& Responses ( ) const
inline

Get the matrix of responses to the input data points.

Definition at line 307 of file rnn.hpp.

◆ Responses() [2/2]

arma::cube& Responses ( )
inline

Modify the matrix of responses to the input data points.

Definition at line 309 of file rnn.hpp.

◆ Rho() [1/2]

const size_t& Rho ( ) const
inline

Return the maximum length of backpropagation through time.

Definition at line 302 of file rnn.hpp.

◆ Rho() [2/2]

size_t& Rho ( )
inline

Modify the maximum length of backpropagation through time.

Definition at line 304 of file rnn.hpp.

◆ serialize()

void serialize ( Archive &  ar,
const uint32_t   
)

◆ Shuffle()

void Shuffle ( )

Shuffle the order of function visitation.

This may be called by the optimizer.

◆ Train() [1/2]

double Train ( arma::cube  predictors,
arma::cube  responses,
OptimizerType &  optimizer,
CallbackTypes &&...  callbacks 
)

Train the recurrent neural network on the given input data using the given optimizer.

This will use the existing model parameters as a starting point for the optimization. If this is not what you want, then you should access the parameters vector directly with Parameters() and modify it as desired.

If you want to pass in a parameter and discard the original parameter object, be sure to use std::move to avoid an unnecessary copy.

The format of the data should be as follows:

  • each slice should correspond to a time step
  • each column should correspond to a data point
  • each row should correspond to a dimension

So, e.g., predictors(i, j, k) is the i'th dimension of the j'th data point at time slice k.
Template Parameters
OptimizerType  Type of optimizer to use to train the model.
CallbackTypes  Types of callback functions.
Parameters
predictors  Input training variables.
responses  Target outputs corresponding to the input training variables.
optimizer  Instantiated optimizer used to train the model.
callbacks  Callback functions for the ensmallen optimizer OptimizerType. See https://www.ensmallen.org/docs.html#callback-documentation.
Returns
The final objective of the trained model (NaN or Inf on error).
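A training sketch assuming the ensmallen ens::Adam optimizer and the ens::ProgressBar callback (hyperparameters are illustrative; predictors and responses are cubes laid out as described above, with the number of response rows determined by the output layer in use):

    #include <ensmallen.hpp>

    ens::Adam optimizer(0.01 /* step size */, 32 /* batch size */);
    const double objective =
        model.Train(std::move(predictors), std::move(responses), optimizer,
                    ens::ProgressBar());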

◆ Train() [2/2]

double Train ( arma::cube  predictors,
arma::cube  responses,
CallbackTypes &&...  callbacks 
)

Train the recurrent neural network on the given input data.

By default, the SGD optimization algorithm is used, but others can be specified (such as ens::RMSprop).

This will use the existing model parameters as a starting point for the optimization. If this is not what you want, then you should access the parameters vector directly with Parameters() and modify it as desired.

If you want to pass in a parameter and discard the original parameter object, be sure to use std::move to avoid an unnecessary copy.

The format of the data should be as follows:

  • each slice should correspond to a time step
  • each column should correspond to a data point
  • each row should correspond to a dimension

So, e.g., predictors(i, j, k) is the i'th dimension of the j'th data point at time slice k.
Template Parameters
OptimizerType  Type of optimizer to use to train the model.
CallbackTypes  Types of callback functions.
Parameters
predictors  Input training variables.
responses  Target outputs corresponding to the input training variables.
callbacks  Callback functions for the ensmallen optimizer OptimizerType. See https://www.ensmallen.org/docs.html#callback-documentation.
Returns
The final objective of the trained model (NaN or Inf on error).
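Without an explicit optimizer, the call falls back to the default ens::StandardSGD; a minimal sketch:

    // Trains with the default ens::StandardSGD optimizer and no callbacks.
    const double objective = model.Train(predictors, responses);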

◆ WarnMessageMaxIterations() [1/2]

std::enable_if< HasMaxIterations<OptimizerType, size_t&(OptimizerType::*)()>::value, void>::type WarnMessageMaxIterations ( OptimizerType &  optimizer,
size_t  samples 
) const

Check whether the optimizer has a MaxIterations() parameter; if it does, warn when its value is less than the number of data points in the dataset.

Template Parameters
OptimizerType  Type of optimizer to use to train the model.
Parameters
optimizer  Optimizer used in the training process.
samples  Number of data points in the dataset.

◆ WarnMessageMaxIterations() [2/2]

std::enable_if< !HasMaxIterations<OptimizerType, size_t&(OptimizerType::*)()>::value, void>::type WarnMessageMaxIterations ( OptimizerType &  optimizer,
size_t  samples 
) const

Check whether the optimizer has a MaxIterations() parameter; if it does not, simply return from the function.

Template Parameters
OptimizerType  Type of optimizer to use to train the model.
Parameters
optimizer  Optimizer used in the training process.
samples  Number of data points in the dataset.

The documentation for this class was generated from the following file:
  • /home/ryan/src/mlpack.org/_src/mlpack-git/src/mlpack/methods/ann/rnn.hpp