Class TransferLearning.Builder

    • Constructor Detail

      • Builder

        public Builder​(MultiLayerNetwork origModel)
        The MultiLayerNetwork to tweak for transfer learning
        Parameters:
        origModel - the existing, trained MultiLayerNetwork to use as the starting point
    • Method Detail

      • fineTuneConfiguration

        public TransferLearning.Builder fineTuneConfiguration​(FineTuneConfiguration finetuneConfiguration)
        The fine-tune configuration specified here will override the existing configuration, if any. For example, specifying a learning rate will set that learning rate on all layers. Refer to the FineTuneConfiguration class for more details.
        Parameters:
        finetuneConfiguration - the fine-tune configuration to apply
        Returns:
        Builder
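        A minimal usage sketch; origModel is assumed to be an existing, trained MultiLayerNetwork, and the updater and seed values below are illustrative assumptions, not values prescribed by this API:

            FineTuneConfiguration fineTuneConf = new FineTuneConfiguration.Builder()
                    .updater(new Adam(1e-4))   // assumed updater/learning rate for fine-tuning
                    .seed(123)
                    .build();

            TransferLearning.Builder builder = new TransferLearning.Builder(origModel)
                    .fineTuneConfiguration(fineTuneConf);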
      • setFeatureExtractor

        public TransferLearning.Builder setFeatureExtractor​(int layerNum)
        Specify a layer to set as a "feature extractor". The specified layer and the layers preceding it will be "frozen", with their parameters staying constant during fitting.
        Parameters:
        layerNum - index of the last layer to freeze; this layer and all layers before it become the feature extractor
        Returns:
        Builder
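        For example, to keep the first six layers of the pretrained network fixed, continuing the sketch above (the index 5 is an illustrative assumption):

            // Layers 0..5 are frozen; only layers 6 and above are updated during fitting
            builder.setFeatureExtractor(5);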
      • nOutReplace

        public TransferLearning.Builder nOutReplace​(int layerNum,
                                                    int nOut,
                                                    WeightInit scheme)
        Modify the architecture of a layer by changing nOut. Note this will also affect the layer that follows the specified layer, unless it is the output layer.
        Parameters:
        layerNum - The index of the layer to change nOut of
        nOut - Value of nOut to change to
        scheme - Weight init scheme to use for params in layerNum and layerNum+1
        Returns:
        Builder
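        Continuing the sketch above, a hedged example of this overload (the layer index 4 and the new width 10 are illustrative assumptions):

            // Change nOut of layer 4 to 10; nIn of layer 5 is adjusted to match,
            // and the affected params in layers 4 and 5 are re-initialized with the given scheme
            builder.nOutReplace(4, 10, WeightInit.XAVIER);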
      • nOutReplace

        public TransferLearning.Builder nOutReplace​(int layerNum,
                                                    int nOut,
                                                    Distribution dist)
        Modify the architecture of a layer by changing nOut. Note this will also affect the layer that follows the specified layer, unless it is the output layer.
        Parameters:
        layerNum - The index of the layer to change nOut of
        nOut - Value of nOut to change to
        dist - Distribution to use in conjunction with the DISTRIBUTION weight init for params in layerNum and layerNum+1
        Returns:
        Builder
        See Also:
        DISTRIBUTION
      • nOutReplace

        public TransferLearning.Builder nOutReplace​(int layerNum,
                                                    int nOut,
                                                    WeightInit scheme,
                                                    WeightInit schemeNext)
        Modify the architecture of a layer by changing nOut. Note this will also affect the layer that follows the specified layer, unless it is the output layer. Different weight init schemes can be specified for the specified layer and the layer that follows it.
        Parameters:
        layerNum - The index of the layer to change nOut of
        nOut - Value of nOut to change to
        scheme - Weight init scheme to use for params in layerNum
        schemeNext - Weight init scheme to use for params in layerNum+1
        Returns:
        Builder
      • nOutReplace

        public TransferLearning.Builder nOutReplace​(int layerNum,
                                                    int nOut,
                                                    Distribution dist,
                                                    Distribution distNext)
        Modify the architecture of a layer by changing nOut. Note this will also affect the layer that follows the specified layer, unless it is the output layer. Different distributions can be specified for the specified layer and the layer that follows it.
        Parameters:
        layerNum - The index of the layer to change nOut of
        nOut - Value of nOut to change to
        dist - Distribution to use for params in layerNum
        distNext - Distribution to use for params in layerNum+1
        Returns:
        Builder
        See Also:
        WeightInitDistribution
      • nOutReplace

        public TransferLearning.Builder nOutReplace​(int layerNum,
                                                    int nOut,
                                                    WeightInit scheme,
                                                    Distribution distNext)
        Modify the architecture of a layer by changing nOut. Note this will also affect the layer that follows the specified layer, unless it is the output layer. A weight init scheme can be specified for the specified layer and a distribution for the layer that follows it.
        Parameters:
        layerNum - The index of the layer to change nOut of
        nOut - Value of nOut to change to
        scheme - Weight init scheme to use for params in layerNum
        distNext - Distribution to use for params in layerNum+1
        Returns:
        Builder
        See Also:
        WeightInitDistribution
      • nOutReplace

        public TransferLearning.Builder nOutReplace​(int layerNum,
                                                    int nOut,
                                                    Distribution dist,
                                                    WeightInit schemeNext)
        Modify the architecture of a layer by changing nOut. Note this will also affect the layer that follows the specified layer, unless it is the output layer. A distribution can be specified for the specified layer and a weight init scheme for the layer that follows it.
        Parameters:
        layerNum - The index of the layer to change nOut of
        nOut - Value of nOut to change to
        dist - Distribution to use for params in layerNum
        schemeNext - Weight init scheme to use for params in layerNum+1
        Returns:
        Builder
        See Also:
        WeightInitDistribution
      • nOutReplace

        public TransferLearning.Builder nOutReplace​(int layerNum,
                                                    int nOut,
                                                    IWeightInit scheme,
                                                    IWeightInit schemeNext)
        Modify the architecture of a layer by changing nOut. Note this will also affect the layer that follows the specified layer, unless it is the output layer. Different weight init schemes can be specified for the specified layer and the layer that follows it.
        Parameters:
        layerNum - The index of the layer to change nOut of
        nOut - Value of nOut to change to
        scheme - Weight init scheme to use for params in layerNum
        schemeNext - Weight init scheme to use for params in layerNum+1
        Returns:
        Builder
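        A sketch of the mixed overloads above, again assuming layer 4 is resized to 10 outputs (illustrative values); a Distribution for the resized layer can be combined with a WeightInit scheme for the layer that follows it:

            // Re-initialize the resized layer 4 from a zero-mean normal distribution,
            // and layer 5 (whose nIn changes) with the RELU weight init scheme
            builder.nOutReplace(4, 10, new NormalDistribution(0, 0.01), WeightInit.RELU);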
      • nInReplace

        public TransferLearning.Builder nInReplace​(int layerNum,
                                                   int nIn,
                                                   WeightInit scheme)
        Modify the architecture of a layer by changing nIn of the specified layer.
        Note that only the specified layer will be modified - all other layers will not be changed by this call.
        Parameters:
        layerNum - The index of the layer to change nIn of
        nIn - Value of nIn to change to
        scheme - Weight init scheme to use for params in layerNum
        Returns:
        Builder
      • nInReplace

        public TransferLearning.Builder nInReplace​(int layerNum,
                                                   int nIn,
                                                   WeightInit scheme,
                                                   Distribution dist)
        Modify the architecture of a layer by changing nIn of the specified layer.
        Note that only the specified layer will be modified - all other layers will not be changed by this call.
        Parameters:
        layerNum - The index of the layer to change nIn of
        nIn - Value of nIn to change to
        scheme - Weight init scheme to use for params in layerNum
        dist - Distribution to use in conjunction with the DISTRIBUTION weight init scheme
        Returns:
        Builder
      • nInReplace

        public TransferLearning.Builder nInReplace​(int layerNum,
                                                   int nIn,
                                                   IWeightInit scheme)
        Modify the architecture of a layer by changing nIn of the specified layer.
        Note that only the specified layer will be modified - all other layers will not be changed by this call.
        Parameters:
        layerNum - The index of the layer to change nIn of
        nIn - Value of nIn to change to
        scheme - Weight init scheme to use for params in layerNum
        Returns:
        Builder
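        A hedged sketch of the simplest overload, assuming the first layer's input width is being changed to 784 (both the index and the size are illustrative assumptions):

            // Change nIn of layer 0 to 784 and re-initialize only that layer's params;
            // no other layer is modified by this call
            builder.nInReplace(0, 784, WeightInit.XAVIER);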
      • removeOutputLayer

        public TransferLearning.Builder removeOutputLayer()
        Helper method to remove the output layer of the net. Only one of the two - removeOutputLayer() or removeLayersFromOutput(layerNum) - can be specified. When removing layers, at the very least an output layer should be added back with .addLayer(...).
        Returns:
        Builder
      • removeLayersFromOutput

        public TransferLearning.Builder removeLayersFromOutput​(int layerNum)
        Remove the last "n" layers of the net. At the very least an output layer must be added back in.
        Parameters:
        layerNum - number of layers to remove, counting back from the output layer
        Returns:
        Builder
      • addLayer

        public TransferLearning.Builder addLayer​(Layer layer)
        Add layers to the net. Required if layers are removed. Can be called multiple times; layers will be added in the order in which the calls are made. At the very least an output layer must be added (the output layer should be added last, as per the note on ordering). Learning configs (like updaters, learning rate etc.) specified with the layer here will be honored.
        Parameters:
        layer - layer conf to add (similar to NeuralNetConfiguration .list().layer(...))
        Returns:
        Builder
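        A sketch combining layer removal with addLayer(...), assuming the removed output layer received 512 inputs and the new task has 10 classes (both assumed values), continuing the earlier sketch:

            builder.removeOutputLayer()   // or .removeLayersFromOutput(2) to strip the last two layers
                   .addLayer(new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                           .nIn(512).nOut(10)             // assumed sizes for the new task
                           .activation(Activation.SOFTMAX)
                           .weightInit(WeightInit.XAVIER)
                           .build());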
      • setInputPreProcessor

        public TransferLearning.Builder setInputPreProcessor​(int layer,
                                                             InputPreProcessor processor)
        Specify the preprocessor for the added layers, for cases where it cannot be inferred automatically.
        Parameters:
        layer - index of the layer whose input the preprocessor applies to
        processor - the InputPreProcessor to be used on the data
        Returns:
        Builder
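        For example, if a newly added dense or output layer sits on top of a convolutional layer and the flattening step cannot be inferred, a preprocessor might be set explicitly; the layer index 6 and the 7x7x512 activation shape below are illustrative assumptions:

            // Flatten the 512 x 7 x 7 convolutional activations before they reach layer 6
            builder.setInputPreProcessor(6, new CnnToFeedForwardPreProcessor(7, 7, 512));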
      • build

        public MultiLayerNetwork build()
        Returns a model with the fine-tune configuration and the specified architecture changes applied. .init() need not be called; the returned network can be fit directly.
        Returns:
        MultiLayerNetwork
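        Putting the pieces together, an end-to-end sketch might look like the following; the layer indices, sizes, and hyperparameters are illustrative assumptions, not values prescribed by this API:

            import org.deeplearning4j.nn.conf.layers.OutputLayer;
            import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
            import org.deeplearning4j.nn.transferlearning.FineTuneConfiguration;
            import org.deeplearning4j.nn.transferlearning.TransferLearning;
            import org.deeplearning4j.nn.weights.WeightInit;
            import org.nd4j.linalg.activations.Activation;
            import org.nd4j.linalg.learning.config.Adam;
            import org.nd4j.linalg.lossfunctions.LossFunctions;

            // A hedged end-to-end sketch; "pretrained" is an existing, trained MultiLayerNetwork
            public static MultiLayerNetwork buildTransferModel(MultiLayerNetwork pretrained) {
                FineTuneConfiguration fineTuneConf = new FineTuneConfiguration.Builder()
                        .updater(new Adam(1e-4))   // assumed fine-tuning hyperparameters
                        .seed(123)
                        .build();

                // No .init() call is needed on the returned network; it can be fit directly
                return new TransferLearning.Builder(pretrained)
                        .fineTuneConfiguration(fineTuneConf)
                        .setFeatureExtractor(5)    // freeze layers 0..5
                        .removeOutputLayer()
                        .addLayer(new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                                .nIn(512).nOut(10) // assumed sizes for the new task
                                .activation(Activation.SOFTMAX)
                                .weightInit(WeightInit.XAVIER)
                                .build())
                        .build();
            }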