Modifier and Type | Class and Description
---|---
class | EarlyStoppingConfiguration<T extends Model>: Early stopping configuration. Specifies the configuration options for running training with early stopping. Users need to specify: (a) an EarlyStoppingModelSaver, i.e. how models are saved (to disk, to memory, etc.; default: in memory), and (b) at least one termination condition, either an iteration termination condition (checked once per minibatch) or an epoch termination condition (checked once per epoch).
static class | EarlyStoppingConfiguration.Builder<T extends Model>
interface | EarlyStoppingModelSaver<T extends Model>: Interface for saving MultiLayerNetworks learned during early stopping, and retrieving them again later.
class | EarlyStoppingResult<T extends Model>: Contains the results of early stopping training, such as why training was terminated and the score per epoch.
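
As a rough illustration of how the pieces in the table above fit together, the following sketch builds an EarlyStoppingConfiguration for a MultiLayerNetwork via the Builder. The termination-condition and score-calculator classes used here (MaxEpochsTerminationCondition, MaxTimeIterationTerminationCondition, DataSetLossCalculator) and the package names are assumptions based on the early stopping API described above rather than entries from this page.

```java
import java.util.concurrent.TimeUnit;

import org.deeplearning4j.earlystopping.EarlyStoppingConfiguration;
import org.deeplearning4j.earlystopping.saver.InMemoryModelSaver;
import org.deeplearning4j.earlystopping.scorecalc.DataSetLossCalculator;
import org.deeplearning4j.earlystopping.termination.MaxEpochsTerminationCondition;
import org.deeplearning4j.earlystopping.termination.MaxTimeIterationTerminationCondition;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;

public class EarlyStoppingConfigSketch {

    public static EarlyStoppingConfiguration<MultiLayerNetwork> build(DataSetIterator validationData) {
        return new EarlyStoppingConfiguration.Builder<MultiLayerNetwork>()
                // Epoch termination condition: stop after at most 30 epochs
                .epochTerminationConditions(new MaxEpochsTerminationCondition(30))
                // Iteration termination condition: checked once per minibatch;
                // here, a wall-clock limit on total training time
                .iterationTerminationConditions(new MaxTimeIterationTerminationCondition(20, TimeUnit.MINUTES))
                // Score candidate models on held-out data (average loss over the iterator)
                .scoreCalculator(new DataSetLossCalculator(validationData, true))
                .evaluateEveryNEpochs(1)
                // Model saver: keep the best (and latest) model in memory
                .modelSaver(new InMemoryModelSaver<MultiLayerNetwork>())
                .build();
    }
}
```

InMemoryModelSaver (listed further down this page) keeps the best and latest models in memory; a file-based saver could be substituted to persist them to disk instead.
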

Modifier and Type | Interface and Description
---|---
interface | EarlyStoppingListener<T extends Model>: Listener interface for conducting early stopping training.

Modifier and Type | Class and Description
---|---
class | InMemoryModelSaver<T extends Model>: Saves the best (and latest) models for early stopping training to memory, for later retrieval. Note: assumes that the network is cloneable via its .clone() method.

Modifier and Type | Interface and Description
---|---
interface | ScoreCalculator<T extends Model>: Interface used to calculate a score for a neural network.

Modifier and Type | Class and Description
---|---
class | BaseEarlyStoppingTrainer<T extends Model>: Base/abstract class for conducting early stopping training locally (on a single machine). Can be used to train a MultiLayerNetwork or a ComputationGraph with early stopping.
interface | IEarlyStoppingTrainer<T extends Model>: Interface for early stopping trainers.
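
Continuing the configuration sketch above, an IEarlyStoppingTrainer can then be used to run training and return an EarlyStoppingResult. The concrete EarlyStoppingTrainer class, its constructor signature, and the getter names on EarlyStoppingResult are assumptions; only the interfaces and classes named in the tables above are taken from this page.

```java
import org.deeplearning4j.earlystopping.EarlyStoppingConfiguration;
import org.deeplearning4j.earlystopping.EarlyStoppingResult;
import org.deeplearning4j.earlystopping.trainer.EarlyStoppingTrainer;
import org.deeplearning4j.earlystopping.trainer.IEarlyStoppingTrainer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;

public class EarlyStoppingRunSketch {

    public static MultiLayerNetwork run(EarlyStoppingConfiguration<MultiLayerNetwork> esConf,
                                        MultiLayerNetwork net,
                                        DataSetIterator trainData) {
        // EarlyStoppingTrainer (assumed name) is a local, single-machine IEarlyStoppingTrainer
        IEarlyStoppingTrainer<MultiLayerNetwork> trainer = new EarlyStoppingTrainer(esConf, net, trainData);

        EarlyStoppingResult<MultiLayerNetwork> result = trainer.fit();

        // Inspect why training stopped and which epoch produced the best score
        System.out.println("Termination reason: " + result.getTerminationReason());
        System.out.println("Termination details: " + result.getTerminationDetails());
        System.out.println("Best epoch: " + result.getBestModelEpoch()
                + " (score " + result.getBestModelScore() + ")");

        // The best model is retrieved via the configured EarlyStoppingModelSaver
        return result.getBestModel();
    }
}
```
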

Modifier and Type | Field and Description
---|---
protected T | BaseEarlyStoppingTrainer.model

Modifier and Type | Interface and Description
---|---
interface | Classifier: A classifier (for supervised learning).
interface | Layer: Interface for a layer of a neural network.

Modifier and Type | Interface and Description
---|---
interface | IOutputLayer: Interface for output layers (those that calculate gradients with respect to a labels array).
interface | RecurrentLayer: Interface for recurrent layers of a neural network.

Modifier and Type | Class and Description
---|---
class | ComputationGraph: A ComputationGraph network is a neural network with an arbitrary (directed acyclic graph) connection structure.
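
A minimal sketch of constructing a ComputationGraph. Note that the DenseLayer and OutputLayer builders used here are the configuration classes (org.deeplearning4j.nn.conf.layers), not the identically named implementation classes listed in the next table; the graphBuilder() method chain and the layer sizes are assumptions, not drawn from this page.

```java
import org.deeplearning4j.nn.conf.ComputationGraphConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.graph.ComputationGraph;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class ComputationGraphSketch {

    public static ComputationGraph build() {
        ComputationGraphConfiguration conf = new NeuralNetConfiguration.Builder()
                .graphBuilder()
                .addInputs("input")
                // Hidden layer fed by the "input" vertex
                .addLayer("dense", new DenseLayer.Builder().nIn(10).nOut(5).build(), "input")
                // Output layer fed by the "dense" vertex
                .addLayer("out", new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(5).nOut(3).build(), "dense")
                .setOutputs("out")
                .build();

        ComputationGraph graph = new ComputationGraph(conf);
        graph.init();
        return graph;
    }
}
```
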

Modifier and Type | Class and Description
---|---
class | ActivationLayer: Activation layer. Applies an activation function to its input on the forward pass, and the corresponding derivative to epsilon (the backpropagated error) on the backward pass.
class | BaseLayer<LayerConfT extends Layer>: A layer with a bias and an activation function.
class | BaseOutputLayer<LayerConfT extends BaseOutputLayer>: Base class for output layers supporting different objective (loss) functions.
class | BasePretrainNetwork<LayerConfT extends BasePretrainNetwork>: Baseline class for any neural network used as a layer in a deep network.
class | LossLayer: A flexible output "layer" that applies a loss function to its input, without MLP logic.
class | OutputLayer: Output layer supporting different objective (loss) functions.

Modifier and Type | Class and Description
---|---
class | ConvolutionLayer: Convolution layer.

Modifier and Type | Class and Description
---|---
class | SubsamplingLayer: Subsampling (pooling) layer.
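
For reference, here is a hedged sketch of how the convolution and subsampling layers listed in the two tables above are typically configured. The builder classes used are the configuration-side counterparts in org.deeplearning4j.nn.conf.layers (which mirror the implementation classes listed here), and the kernel, stride, and channel counts are arbitrary example values.

```java
import org.deeplearning4j.nn.conf.layers.ConvolutionLayer;
import org.deeplearning4j.nn.conf.layers.SubsamplingLayer;

public class ConvolutionConfigSketch {

    // A 5x5 convolution producing 20 feature maps from a single-channel (grayscale) input
    public static ConvolutionLayer convolution() {
        return new ConvolutionLayer.Builder(5, 5)
                .nIn(1)
                .nOut(20)
                .stride(1, 1)
                .build();
    }

    // A 2x2 max-pooling (subsampling) layer with stride 2
    public static SubsamplingLayer pooling() {
        return new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX)
                .kernelSize(2, 2)
                .stride(2, 2)
                .build();
    }
}
```
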

Modifier and Type | Class and Description
---|---
class | AutoEncoder: Autoencoder.

Modifier and Type | Class and Description
---|---
class | DenseLayer

Modifier and Type | Class and Description
---|---
class | EmbeddingLayer: Embedding layer: a feed-forward layer that expects a single integer per example as input (a class number, in the range 0 to numClasses-1).

Modifier and Type | Class and Description
---|---
class | RBM: Restricted Boltzmann Machine.

Modifier and Type | Class and Description
---|---
class | BatchNormalization: Batch normalization layer.
class | LocalResponseNormalization: Local response normalization ("brightness normalization"): a deep neural network normalization approach that normalizes activations between layers. Used in networks such as AlexNet.

Modifier and Type | Class and Description
---|---
class | BaseRecurrentLayer<LayerConfT extends Layer>
class | GravesBidirectionalLSTM: Bidirectional LSTM layer implementation. See the RNN tutorial first: http://deeplearning4j.org/usingrnns.html
class | GravesLSTM: LSTM layer implementation.
class | RnnOutputLayer: Recurrent neural network output layer. Handles calculation of gradients etc. for various objective functions. Functionally the same as OutputLayer, but handles output and label reshaping automatically. Input and output activations are the same as for other RNN layers: 3 dimensions, with shapes [miniBatchSize, nIn, timeSeriesLength] and [miniBatchSize, nOut, timeSeriesLength] respectively.
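
A short sketch of a recurrent network configuration using GravesLSTM and RnnOutputLayer, mainly to make the 3D shape convention above concrete. The configuration builder classes (org.deeplearning4j.nn.conf.layers) and their nIn/nOut settings are assumptions modeled on the rest of the DL4J configuration API, not taken from this page.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.GravesLSTM;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class RecurrentConfigSketch {

    // Input arrays for these layers are 3D: [miniBatchSize, nIn, timeSeriesLength];
    // labels/output arrays are [miniBatchSize, nOut, timeSeriesLength].
    public static MultiLayerConfiguration build(int nIn, int nHidden, int nOut) {
        return new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, new GravesLSTM.Builder().nIn(nIn).nOut(nHidden).build())
                .layer(1, new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(nHidden).nOut(nOut).build())
                .build();
    }
}
```
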

Modifier and Type | Class and Description
---|---
class | MultiLayerNetwork: A neural network with multiple layers in a stack, usually ending in an output layer.
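
A minimal end-to-end sketch of building and fitting a MultiLayerNetwork. The layer sizes and listener frequency are arbitrary, and the builder classes used here (NeuralNetConfiguration, org.deeplearning4j.nn.conf.layers.DenseLayer/OutputLayer) are the configuration-side counterparts of the layer classes listed earlier on this page; they are assumptions, not entries from it. fit(DataSetIterator) is the training method referenced by the listener callbacks further below.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.optimize.listeners.ScoreIterationListener;
import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class MultiLayerNetworkSketch {

    public static MultiLayerNetwork trainSimpleNet(DataSetIterator trainData) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(12345)
                .list()
                .layer(0, new DenseLayer.Builder().nIn(784).nOut(100).build())
                .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(100).nOut(10).build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
        net.setListeners(new ScoreIterationListener(10));  // log the score every 10 iterations
        net.fit(trainData);                                // fit(DataSetIterator), as referenced below
        return net;
    }
}
```
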

Modifier and Type | Method and Description
---|---
static Updater | UpdaterCreator.getUpdater(Model layer)

Modifier and Type | Method and Description
---|---
Solver.Builder | Solver.Builder.model(Model model)

Modifier and Type | Method and Description
---|---
void | IterationListener.iterationDone(Model model, int iteration): Event listener, called for each iteration.
void | TrainingListener.onBackwardPass(Model model): Called once per iteration (backward pass), after the gradients have been calculated and updated; the gradients are available via gradient().
void | TrainingListener.onEpochEnd(Model model): Called once at the end of each epoch, when using methods such as MultiLayerNetwork.fit(DataSetIterator), ComputationGraph.fit(DataSetIterator) or ComputationGraph.fit(MultiDataSetIterator).
void | TrainingListener.onEpochStart(Model model): Called once at the start of each epoch, when using methods such as MultiLayerNetwork.fit(DataSetIterator), ComputationGraph.fit(DataSetIterator) or ComputationGraph.fit(MultiDataSetIterator).
void | TrainingListener.onForwardPass(Model model, List<org.nd4j.linalg.api.ndarray.INDArray> activations): Called once per iteration (forward pass) with the activations (usually for a MultiLayerNetwork); only called at training time.
void | TrainingListener.onForwardPass(Model model, Map<String,org.nd4j.linalg.api.ndarray.INDArray> activations): Called once per iteration (forward pass) with the activations (usually for a ComputationGraph); only called at training time.
void | TrainingListener.onGradientCalculation(Model model): Called once per iteration (backward pass), before the gradients are updated; the gradients are available via gradient().
void | ConvexOptimizer.updateGradientAccordingToParams(Gradient gradient, Model model, int batchSize): Updates the gradient according to the configuration, e.g. AdaGrad, momentum, and sparsity.
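
To make the listener contract above concrete, here is a hedged sketch of a custom IterationListener that logs the score every few iterations. The iterationDone(Model, int) method is listed above; the invoked()/invoke() bookkeeping methods and Model.score() are assumptions about the rest of the interface in this release.

```java
import org.deeplearning4j.nn.api.Model;
import org.deeplearning4j.optimize.api.IterationListener;

public class LoggingIterationListener implements IterationListener {

    private final int frequency;
    private boolean invoked = false;

    public LoggingIterationListener(int frequency) {
        this.frequency = frequency;
    }

    @Override
    public boolean invoked() {
        return invoked;
    }

    @Override
    public void invoke() {
        this.invoked = true;
    }

    @Override
    public void iterationDone(Model model, int iteration) {
        invoke();
        if (frequency <= 0 || iteration % frequency == 0) {
            // Model.score() is assumed to return the current loss value
            System.out.println("Iteration " + iteration + ", score = " + model.score());
        }
    }
}
```

Such a listener could then be attached alongside the built-in listeners listed in the next table, e.g. net.setListeners(new LoggingIterationListener(10), new ScoreIterationListener(10)), assuming setListeners accepts a varargs list of IterationListener instances.
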

Modifier and Type | Method and Description
---|---
void | ComposableIterationListener.iterationDone(Model model, int iteration)
void | PerformanceListener.iterationDone(Model model, int iteration)
void | ParamAndGradientIterationListener.iterationDone(Model model, int iteration)
void | CollectScoresIterationListener.iterationDone(Model model, int iteration)
void | ScoreIterationListener.iterationDone(Model model, int iteration)

Modifier and Type | Field and Description
---|---
protected Model | BaseOptimizer.model

Modifier and Type | Method and Description
---|---
void | BaseOptimizer.updateGradientAccordingToParams(Gradient gradient, Model model, int batchSize)

Modifier and Type | Method and Description
---|---
static org.nd4j.linalg.heartbeat.reports.Task | ModelSerializer.taskByModel(Model model)
static void | ModelSerializer.writeModel(Model model, File file, boolean saveUpdater): Write a model to a file.
static void | ModelSerializer.writeModel(Model model, OutputStream stream, boolean saveUpdater): Write a model to an output stream.
static void | ModelSerializer.writeModel(Model model, String path, boolean saveUpdater): Write a model to a file path.
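
A small sketch of saving and restoring a model with ModelSerializer.writeModel, as listed above. The saveUpdater flag and the File overload come from the table; treating restoreMultiLayerNetwork as the matching read method for MultiLayerNetwork models is an assumption, as is the file name.

```java
import java.io.File;
import java.io.IOException;

import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.util.ModelSerializer;

public class ModelSerializerSketch {

    public static MultiLayerNetwork saveAndReload(MultiLayerNetwork net) throws IOException {
        File target = new File("trained-network.zip");

        // true: also persist the updater state (e.g. momentum / RMSProp history),
        // so that training can later be resumed from the same point
        ModelSerializer.writeModel(net, target, true);

        // Read the network back from disk (restoreMultiLayerNetwork is assumed to be
        // the matching read method for MultiLayerNetwork models)
        return ModelSerializer.restoreMultiLayerNetwork(target);
    }
}
```
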