Packages

  • package root
    Definition Classes
    root
  • package com
    Definition Classes
    root
  • package thoughtworks
    Definition Classes
    com
  • package deeplearning

    This is the documentation for DeepLearning.scala.

    Overview

    CumulativeLayer, DifferentiableAny, DifferentiableNothing, Layer, Poly and Symbolic are the base packages that contain the necessary operations; all other packages depend on these base packages.

    If you want to implement a layer, you need to know how to use the base packages.

    Import guidelines

    If you want to use operations on a type T, you should import:

    import com.thoughtworks.deeplearning.DifferentiableT._

    For example, if you want to use operations on INDArray, you should import:

    import com.thoughtworks.deeplearning.DifferentiableINDArray._

    If you write something like this:

    def softmax(implicit scores: INDArray @Symbolic): INDArray @Symbolic = {
      val expScores = exp(scores)
      expScores / expScores.sum(1)
    }

    and the compiler shows the error:

    Could not infer implicit value for com.thoughtworks.deeplearning.Symbolic[org.nd4j.linalg.api.ndarray.INDArray]

    then you need to add this import:

    import com.thoughtworks.deeplearning.DifferentiableINDArray._
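    For intuition, the row-wise softmax that the symbolic function above expresses can be sketched in plain Scala, with no library dependency (the helper name softmaxRow is hypothetical, not part of DeepLearning.scala):

```scala
// Plain-Scala sketch of the math in the symbolic softmax above:
// exponentiate each score, then normalize by the row sum.
def softmaxRow(scores: Seq[Double]): Seq[Double] = {
  val expScores = scores.map(math.exp)
  val rowSum = expScores.sum
  expScores.map(_ / rowSum)
}

// The result is a probability distribution: all entries positive, summing to 1.
val probabilities = softmaxRow(Seq(1.0, 2.0, 3.0))
```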

    If you write something like this:

    def crossEntropyLossFunction(
        implicit pair: (INDArray :: INDArray :: HNil) @Symbolic): Double @Symbolic = {
      val score = pair.head
      val label = pair.tail.head
      -(label * log(score * 0.9 + 0.1) + (1.0 - label) * log(1.0 - score * 0.9)).mean
    }

    If the compiler shows error:

    value * is not a member of com.thoughtworks.deeplearning.Layer.Aux[com.thoughtworks.deeplearning.Layer.Tape.Aux[org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray],com.thoughtworks.deeplearning.DifferentiableINDArray.INDArrayPlaceholder.Tape]

    then you need to add these imports:

    import com.thoughtworks.deeplearning.Poly.MathMethods.*
    import com.thoughtworks.deeplearning.DifferentiableINDArray._

    If the compiler shows error:

    not found: value log

    then you need to add these imports:

    import com.thoughtworks.deeplearning.Poly.MathFunctions.*
    import com.thoughtworks.deeplearning.DifferentiableINDArray._

    The methods + - * / and the functions log exp abs max min are defined in MathMethods and MathFunctions respectively. These polymorphic methods are implemented by each DifferentiableType, so you need to import the implicits of the corresponding DifferentiableType.
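    With those imports in place, the crossEntropyLossFunction above computes an ordinary scaled cross entropy. The same arithmetic can be sketched in plain Scala (the helper name crossEntropy is hypothetical; the 0.9/0.1 scaling keeps both logarithms away from log(0)):

```scala
// Plain-Scala sketch of the loss in crossEntropyLossFunction above:
// -(label * log(score * 0.9 + 0.1) + (1 - label) * log(1 - score * 0.9)),
// averaged over all score/label pairs.
def crossEntropy(scores: Seq[Double], labels: Seq[Double]): Double = {
  val losses = scores.zip(labels).map { case (score, label) =>
    -(label * math.log(score * 0.9 + 0.1) +
      (1.0 - label) * math.log(1.0 - score * 0.9))
  }
  losses.sum / losses.size
}
```

    A perfect prediction (score 1.0 for label 1.0) gives zero loss, while a confident wrong prediction gives a large positive loss.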

    Composability

    Neural networks created by DeepLearning.scala are composable. You can create large networks by combining smaller networks. If two larger networks share some sub-networks, the weights in shared sub-networks trained with one network affect the other network.
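    The weight-sharing behavior can be pictured with a plain-Scala sketch (all names hypothetical, not the library's API): two networks built from the same sub-network close over the same trainable parameter, so updating it through one network changes the other's output as well.

```scala
// Hypothetical stand-in for a trainable parameter shared between networks.
final class SharedWeight(var value: Double) {
  def learn(gradient: Double, learningRate: Double): Unit =
    value -= learningRate * gradient
}

val weight = new SharedWeight(1.0)

// Two "networks" composed from the same sub-network: both close over `weight`.
def networkA(x: Double): Double = weight.value * x
def networkB(x: Double): Double = weight.value * x + 1.0

// Training networkA updates the shared weight...
weight.learn(gradient = 0.5, learningRate = 1.0)
// ...so networkB's predictions change too.
```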

    Definition Classes
    thoughtworks
    See also

    Compose

  • CumulativeLayer
  • DifferentiableAny
  • DifferentiableBoolean
  • DifferentiableCoproduct
  • DifferentiableDouble
  • DifferentiableFloat
  • DifferentiableHList
  • DifferentiableInt
  • DifferentiableNothing
  • DifferentiableSeq
  • Layer
  • Poly
  • Symbolic

com.thoughtworks.deeplearning

DifferentiableInt

object DifferentiableInt

A namespace of common operators for Int layers.

Author:

杨博 (Yang Bo) <[email protected]>

Linear Supertypes
AnyRef, Any

Type Members

  1. implicit final class ScalaIntOps extends AnyRef

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. implicit def Int*Int[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Poly.MathMethods.*.Case that accepts two Int Layers.

    The returned Case is used by the polymorphic function Poly.MathMethods.*, which is called in MathOps.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableInt._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherIntLayer: Int @Symbolic)(implicit inputIntLayer: Int @Symbolic) = {
        Poly.MathMethods.*(inputIntLayer, anotherIntLayer)
      }
  5. implicit def Int+Int[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Poly.MathMethods.+.Case that accepts two Int Layers.

    The returned Case is used by the polymorphic function Poly.MathMethods.+, which is called in MathOps.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableInt._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherIntLayer: Int @Symbolic)(implicit inputIntLayer: Int @Symbolic) = {
        Poly.MathMethods.+(inputIntLayer, anotherIntLayer)
      }
  6. implicit def Int-Int[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Poly.MathMethods.-.Case that accepts two Int Layers.

    The returned Case is used by the polymorphic function Poly.MathMethods.-, which is called in MathOps.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableInt._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherIntLayer: Int @Symbolic)(implicit inputIntLayer: Int @Symbolic) = {
        Poly.MathMethods.-(inputIntLayer, anotherIntLayer)
      }
  7. implicit def Int/Int[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Poly.MathMethods./.Case that accepts two Int Layers.

    The returned Case is used by the polymorphic function Poly.MathMethods./, which is called in MathOps.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableInt._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherIntLayer: Int @Symbolic)(implicit inputIntLayer: Int @Symbolic) = {
        Poly.MathMethods./(inputIntLayer, anotherIntLayer)
      }
  8. val Optimizers: DifferentiableDouble.Optimizers.type
  9. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  10. def clone(): AnyRef
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  11. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  12. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  13. def finalize(): Unit
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  14. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
  15. def hashCode(): Int
    Definition Classes
    AnyRef → Any
  16. implicit def intToLiteral: Aux[Int, Int, Float]
  17. implicit def intTrainable: Trainable[Int, Float]

    See also

    Trainable

  18. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  19. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  20. final def notify(): Unit
    Definition Classes
    AnyRef
  21. final def notifyAll(): Unit
    Definition Classes
    AnyRef
  22. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  23. def toString(): String
    Definition Classes
    AnyRef → Any
  24. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  25. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  26. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  27. object Layers

Inherited from AnyRef

Inherited from Any

Ungrouped