`public class ElementWiseMultiplicationLayer extends FeedForwardLayer`

Author: jingshu
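The page itself carries no class description, but the layer's name and its element-wise semantics (weights the same length as the input, so `nOut` must equal `nIn`) can be sketched in plain Java. The class, method names, and the choice of ReLU below are illustrative assumptions, not part of this API:

```java
import java.util.Arrays;

// Minimal sketch of the forward pass an element-wise multiplication layer
// implies: out = activation(x (element-wise *) w + b), where w and b are
// learnable vectors with the same length as the input.
public class ElementWiseDemo {
    static double[] forward(double[] x, double[] w, double[] b) {
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            double z = x[i] * w[i] + b[i]; // element-wise scale plus bias
            out[i] = Math.max(0.0, z);     // ReLU activation (illustrative choice)
        }
        return out;
    }

    public static void main(String[] args) {
        double[] x = {1.0, -2.0, 3.0};
        double[] w = {0.5, 1.0, 2.0};
        double[] b = {0.0, 0.0, -1.0};
        System.out.println(Arrays.toString(forward(x, w, b))); // [0.5, 0.0, 5.0]
    }
}
```

Because each weight touches exactly one input element, the layer cannot change the feature dimension; this is why the inherited `nIn`/`nOut` fields must agree.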
**Nested Class Summary**

| Modifier and Type | Class and Description |
| --- | --- |
| `static class` | `ElementWiseMultiplicationLayer.Builder` |

**Field Summary**

Fields inherited from class `FeedForwardLayer`:
`nIn`, `nOut`

Fields inherited from class `BaseLayer`:
`activationFn`, `biasInit`, `biasUpdater`, `dist`, `gradientNormalization`, `gradientNormalizationThreshold`, `iUpdater`, `l1`, `l1Bias`, `l2`, `l2Bias`, `weightInit`, `weightNoise`

Fields inherited from class `Layer`:
`constraints`, `iDropout`, `layerName`
**Constructor Summary**

| Modifier | Constructor and Description |
| --- | --- |
| `protected` | `ElementWiseMultiplicationLayer()` |
| `protected` | `ElementWiseMultiplicationLayer(ElementWiseMultiplicationLayer.Builder builder)` |
**Method Summary**

| Modifier and Type | Method and Description |
| --- | --- |
| `ElementWiseMultiplicationLayer` | `clone()` |
| `LayerMemoryReport` | `getMemoryReport(InputType inputType)` Returns a report of the estimated memory consumption for the given layer. |
| `ParamInitializer` | `initializer()` |
| `Layer` | `instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, org.nd4j.linalg.api.ndarray.INDArray layerParamsView, boolean initializeParams)` |
Methods inherited from class `FeedForwardLayer`:
`getL1ByParam`, `getL2ByParam`, `getOutputType`, `getPreProcessorForInputType`, `isPretrainParam`, `setNIn`

Methods inherited from class `BaseLayer`:
`getUpdaterByParam`, `resetLayerDefaultConfig`

Methods inherited from class `Layer`:
`initializeConstraints`
**Constructor Detail**

`protected ElementWiseMultiplicationLayer()`

`protected ElementWiseMultiplicationLayer(ElementWiseMultiplicationLayer.Builder builder)`
**Method Detail**

`public ElementWiseMultiplicationLayer clone()`

`public Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, org.nd4j.linalg.api.ndarray.INDArray layerParamsView, boolean initializeParams)`

Specified by: `instantiate` in class `Layer`

`public ParamInitializer initializer()`

Specified by: `initializer` in class `Layer`
`public LayerMemoryReport getMemoryReport(InputType inputType)`

Specified by: `getMemoryReport` in class `Layer`

Parameters: `inputType` - Input type to the layer. Memory consumption is often a function of the input type.
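As a usage sketch, the layer would typically be configured through the nested `Builder`, which inherits `nIn`/`nOut` from `FeedForwardLayer.Builder` and (presumably) the usual activation setter from `BaseLayer.Builder`. This is a hedged configuration fragment, not a verified, runnable program; the specific values and the `activation(...)` call are assumptions:

```java
// Hypothetical configuration fragment. nIn and nOut are set equal because
// the layer's weights multiply the input element-wise, so the output size
// must match the input size.
ElementWiseMultiplicationLayer layer = new ElementWiseMultiplicationLayer.Builder()
        .nIn(8)
        .nOut(8)                          // must equal nIn for element-wise weights
        .activation(Activation.IDENTITY)  // setter assumed from BaseLayer.Builder
        .build();
```

The resulting `layer` object is a configuration only; the framework later calls `initializer()` and `instantiate(...)` on it to create the trainable layer and view its parameters in `layerParamsView`.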