Packages

  • package deeplearning

    This is the documentation for DeepLearning.scala.

    Overview

    CumulativeLayer, DifferentiableAny, DifferentiableNothing, Layer, Poly and Symbolic are the base modules, which contain the necessary operations; all other modules depend on these base modules.

    If you want to implement a layer, you need to know how to use the base modules.

    Import guidelines

    If you want to use operations on a type T, you should import:

    import com.thoughtworks.deeplearning.DifferentiableT._

    For example, if you want to use operations on INDArray, you should import:

    import com.thoughtworks.deeplearning.DifferentiableINDArray._

    If you write something like this:

    def softmax(implicit scores: INDArray @Symbolic): INDArray @Symbolic = {
      val expScores = exp(scores)
      expScores / expScores.sum(1)
    }

    and the compiler reports the error:

    Could not infer implicit value for com.thoughtworks.deeplearning.Symbolic[org.nd4j.linalg.api.ndarray.INDArray]

    then you need to add this import:

    import com.thoughtworks.deeplearning.DifferentiableINDArray._

    If you write something like this:

    def crossEntropyLossFunction(
        implicit pair: (INDArray :: INDArray :: HNil) @Symbolic)
      : Double @Symbolic = {
      val score = pair.head
      val label = pair.tail.head
      -(label * log(score * 0.9 + 0.1) + (1.0 - label) * log(1.0 - score * 0.9)).mean
    }

    and the compiler reports the error:

    value * is not a member of com.thoughtworks.deeplearning.Layer.Aux[com.thoughtworks.deeplearning.Layer.Tape.Aux[org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray],com.thoughtworks.deeplearning.DifferentiableINDArray.INDArrayPlaceholder.Tape]
    val bias = Nd4j.ones(numberOfOutputKernels).toWeight * 0.1
    ...

    then you need to add these imports:

    import com.thoughtworks.deeplearning.Poly.MathMethods.*
    import com.thoughtworks.deeplearning.DifferentiableINDArray._

    If the compiler instead reports the error:

    not found: value log
    -(label * log(score * 0.9 + 0.1) + (1.0 - label) * log(1.0 - score * 0.9)).mean
    ...

    then you need to add these imports:

    import com.thoughtworks.deeplearning.Poly.MathFunctions._
    import com.thoughtworks.deeplearning.DifferentiableINDArray._

    The operators + - * / and the functions log exp abs max min are defined in Poly.MathMethods and Poly.MathFunctions respectively. These polymorphic methods are implemented by each DifferentiableType, so you also need to import the implicits of the corresponding DifferentiableType.
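
    Putting these guidelines together, the softmax example above compiles once all of the imports below are in scope. This is a sketch; _ is the Scala 2 wildcard-import syntax, pulling in every member of MathMethods and MathFunctions.

    import org.nd4j.linalg.api.ndarray.INDArray
    import com.thoughtworks.deeplearning.Symbolic
    import com.thoughtworks.deeplearning.Poly.MathMethods._
    import com.thoughtworks.deeplearning.Poly.MathFunctions._
    import com.thoughtworks.deeplearning.DifferentiableINDArray._

    def softmax(implicit scores: INDArray @Symbolic): INDArray @Symbolic = {
      val expScores = exp(scores)
      expScores / expScores.sum(1)
    }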

    Composability

    Neural networks created by DeepLearning.scala are composable. You can create large networks by combining smaller ones. If two larger networks share a sub-network, the sub-network's weights are shared as well: training one network updates those weights and therefore affects the other network.
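
    A minimal sketch of composition, using only the Float operations documented below (the network names are illustrative, and a realistic shared sub-network would contain trainable weights):

    import com.thoughtworks.deeplearning.DifferentiableFloat._
    import com.thoughtworks.deeplearning.Symbolic

    // An illustrative sub-network to be shared.
    def sharedSubNetwork(implicit input: Float @Symbolic): Float @Symbolic = {
      input * input
    }

    // Two larger networks embedding the same sub-network. If sharedSubNetwork
    // contained weights, training either network would update them for both.
    def largerNetwork1(implicit input: Float @Symbolic): Float @Symbolic = {
      sharedSubNetwork + input
    }

    def largerNetwork2(implicit input: Float @Symbolic): Float @Symbolic = {
      sharedSubNetwork * input
    }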

    Definition Classes
    thoughtworks
    See also

    Compose

  • CumulativeLayer
  • DifferentiableAny
  • DifferentiableBoolean
  • DifferentiableCoproduct
  • DifferentiableDouble
  • DifferentiableFloat
  • DifferentiableHList
  • DifferentiableInt
  • DifferentiableNothing
  • DifferentiableSeq
  • Layer
  • Poly
  • Symbolic

com.thoughtworks.deeplearning

DifferentiableFloat

object DifferentiableFloat

A namespace of common operators for Float layers.

Author:

杨博 (Yang Bo) <[email protected]>

Linear Supertypes
AnyRef, Any

Type Members

  1. final class FloatLayerOps [Input <: Tape] extends AnyRef
  2. implicit final class NativeFloatOps extends AnyRef
  3. trait OptimizerFactory extends AnyRef
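
  NativeFloatOps is the implicit class that lets a native Scala Float take part in a layer. Below is a sketch of the typical use; it assumes that toWeight (shown for INDArray in the package overview) is the conversion NativeFloatOps provides, and that an implicit OptimizerFactory is in scope (see the Optimizers example at the end of this page).

    import com.thoughtworks.deeplearning.DifferentiableFloat._
    import com.thoughtworks.deeplearning.Symbolic

    def myNetwork(implicit input: Float @Symbolic): Float @Symbolic = {
      // NativeFloatOps converts the literal 1.0f into a trainable weight.
      val weight = 1.0f.toWeight
      input * weight
    }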

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. implicit def Float*Float[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts two Float Layers.

    The returned Case is used by the polymorphic function *, which is called in MathOps.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        inputFloatLayer * anotherFloatLayer
      }
  5. implicit def Float+Float[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts two Float Layers.

    The returned Case is used by the polymorphic function +, which is called in MathOps.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathMethods.+(inputFloatLayer, anotherFloatLayer)
      }
  6. implicit def Float-Float[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts two Float Layers.

    The returned Case is used by the polymorphic function -, which is called in MathOps.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathMethods.-(inputFloatLayer, anotherFloatLayer)
      }
  7. implicit def Float/Float[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts two Float Layers.

    The returned Case is used by the polymorphic function /, which is called in MathOps.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathMethods./(inputFloatLayer, anotherFloatLayer)
      }
  8. implicit def abs(Float)[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts a Float Layer, for the polymorphic function abs.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.abs(inputFloatLayer)
      }
  9. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  10. def clone(): AnyRef
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  11. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  12. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  13. implicit def exp(Float)[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts a Float Layer, for the polymorphic function exp.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.exp(inputFloatLayer)
      }
  14. def finalize(): Unit
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  15. implicit def floatToLiteral: Aux[Float, Float, Float]
  16. implicit def floatTrainable: Trainable[Float, Float]

    See also

    Trainable
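
    A sketch of typical training usage. It assumes the train method provided by DifferentiableAny, and that a Float output is interpreted as the loss via this floatTrainable instance:

      import com.thoughtworks.deeplearning.DifferentiableAny._
      import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic

      def myNetwork(implicit input: Float @Symbolic): Float @Symbolic = {
        input * input
      }

      // One training step: forward pass, backward pass, weight update (if any).
      myNetwork.train(2.0f)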

  17. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
  18. def hashCode(): Int
    Definition Classes
    AnyRef → Any
  19. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  20. implicit def log(Float)[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts a Float Layer, for the polymorphic function log.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.log(inputFloatLayer)
      }
  21. implicit def max(Float,Float)[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts two Float Layers, for the polymorphic function max.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.max(inputFloatLayer, anotherFloatLayer)
      }
  22. implicit def min(Float,Float)[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts two Float Layers, for the polymorphic function min.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.min(inputFloatLayer, anotherFloatLayer)
      }
  23. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  24. final def notify(): Unit
    Definition Classes
    AnyRef
  25. final def notifyAll(): Unit
    Definition Classes
    AnyRef
  26. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  27. implicit def toFloatLayerOps[From, Input <: Tape](from: From)(implicit toLayer: OfPlaceholder[From, Input, FloatPlaceholder]): FloatLayerOps[Input]

    Implicitly converts any layer to FloatLayerOps, which enables common methods for Float layers.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
  28. def toString(): String
    Definition Classes
    AnyRef → Any
  29. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  30. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  31. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  32. object Layers
  33. object OptimizerFactory
  34. object Optimizers

    Optimizers of Float.

    Example:
    1. implicit val optimizerFactory = new DifferentiableFloat.OptimizerFactory {
         override def floatOptimizer(weight: Weight): Optimizer = {
           new LearningRate with L2Regularization {

             var learningRate = 0.00003f

             override protected def l2Regularization: Float = 0.003f

             override protected def currentLearningRate(): Float = {
               // Decay the learning rate on every query.
               learningRate *= 0.75f
               learningRate
             }
           }
         }
       }
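
    This factory gives every Float weight an optimizer that combines L2 regularization (factor 0.003) with a learning rate that decays by a factor of 0.75 each time it is queried.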
