Packages

  • package root
    Definition Classes
    root
  • package com
    Definition Classes
    root
  • package thoughtworks
    Definition Classes
    com
  • package deeplearning

This is the documentation for DeepLearning.scala.

    Overview

CumulativeLayer, DifferentiableAny, DifferentiableNothing, Layer, Poly, and Symbolic are the base packages that contain the necessary operations; all other packages depend on these base packages.

If you want to implement a layer, you need to know how to use these base packages.

    Imports guidelines

If you want to use operations for a type T, you should import:

    import com.thoughtworks.deeplearning.DifferentiableT._

That is, if you want to use operations on INDArray, you should import:

    import com.thoughtworks.deeplearning.DifferentiableINDArray._

    If you write something like this:

    def softmax(implicit scores: INDArray @Symbolic): INDArray @Symbolic = {
      val expScores = exp(scores)
      expScores / expScores.sum(1)
    }

and the compiler shows the error:

    Could not infer implicit value for com.thoughtworks.deeplearning.Symbolic[org.nd4j.linalg.api.ndarray.INDArray]

you need to add this import:

    import com.thoughtworks.deeplearning.DifferentiableINDArray._

    If you write something like this:

def crossEntropyLossFunction(
  implicit pair: (INDArray :: INDArray :: HNil) @Symbolic)
: Double @Symbolic = {
  val score = pair.head
  val label = pair.tail.head
  -(label * log(score * 0.9 + 0.1) + (1.0 - label) * log(1.0 - score * 0.9)).mean
}

and the compiler shows the error:

value * is not a member of com.thoughtworks.deeplearning.Layer.Aux[com.thoughtworks.deeplearning.Layer.Tape.Aux[org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray],com.thoughtworks.deeplearning.DifferentiableINDArray.INDArrayPlaceholder.Tape]
val bias = Nd4j.ones(numberOfOutputKernels).toWeight * 0.1...

you need to add these imports:

    import com.thoughtworks.deeplearning.Poly.MathMethods.*
    import com.thoughtworks.deeplearning.DifferentiableINDArray._

If the compiler shows the error:

not found: value log
-(label * log(score * 0.9 + 0.1) + (1.0 - label) * log(1.0 - score * 0.9)).mean...

you need to add these imports:

import com.thoughtworks.deeplearning.Poly.MathFunctions._
    import com.thoughtworks.deeplearning.DifferentiableINDArray._

The operators +, -, *, and / are defined in MathMethods, and the functions log, exp, abs, max, and min are defined in MathFunctions. These methods are implemented in each DifferentiableType, so you also need to import the implicits of the corresponding DifferentiableType.
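
Putting these guidelines together, a sketch of the full import set for the crossEntropyLossFunction example above might look as follows. The DifferentiableHList and DifferentiableDouble imports are assumptions for the HList input and the Double output respectively; further MathMethods members such as + and - can be imported the same way as * if the compiler asks for them:

import com.thoughtworks.deeplearning.DifferentiableINDArray._
import com.thoughtworks.deeplearning.DifferentiableHList._   // assumed: pair.head / pair.tail.head on the HList input
import com.thoughtworks.deeplearning.DifferentiableDouble._  // assumed: the Double @Symbolic result type
import com.thoughtworks.deeplearning.Poly.MathMethods.*      // the polymorphic * operator
import com.thoughtworks.deeplearning.Poly.MathFunctions._    // log, exp, abs, max, min
import com.thoughtworks.deeplearning.Symbolic
import org.nd4j.linalg.api.ndarray.INDArray
import shapeless._                                           // :: and HNil in the input type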

    Composability

Neural networks created by DeepLearning.scala are composable. You can create large networks by combining smaller ones. If two larger networks share some sub-networks, the weights in the shared sub-networks trained with one network also affect the other network. A sketch of one composition style follows.
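
For example, a small network can be embedded in a larger one by ordinary application. This is a hypothetical sketch, not library API: square and addOne are illustrative names, and the exact import set may vary by version (the library also provides a compose operation; see the Compose link below):

import com.thoughtworks.deeplearning.DifferentiableDouble._
import com.thoughtworks.deeplearning.Poly.MathMethods._
import com.thoughtworks.deeplearning.Symbolic

// Two small networks (illustrative names):
def square(implicit x: Double @Symbolic): Double @Symbolic = x * x
def addOne(implicit x: Double @Symbolic): Double @Symbolic = x + 1.0

// A larger network that embeds both: the output of square feeds addOne.
// If square contained weights, training this network would update them,
// affecting any other network that also embeds square.
def squareThenAddOne(implicit x: Double @Symbolic): Double @Symbolic =
  addOne(square)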

    Definition Classes
    thoughtworks
    See also

    Compose

  • CumulativeLayer
  • DifferentiableAny
  • DifferentiableBoolean
  • DifferentiableCoproduct
  • DifferentiableDouble
  • DifferentiableFloat
  • DifferentiableHList
  • DifferentiableInt
  • DifferentiableNothing
  • DifferentiableSeq
  • Layer
  • Poly
  • Symbolic

trait Layer extends AnyRef

A Layer represents a neural network. Each Layer can be included as a sub-network of another Layer, forming a more complex neural network. The nesting structure of Layer can be used to represent a mathematical expression or a coarse-grained neural network structure. When a neural network is written, most of its elements are placeholders. When network training begins, the actual data enters the network.

Tree structure of Layer
val myLayer: Layer.Aux[Tape.Aux[Double, Double], Tape.Aux[Double, Double]] = {
  Times(
    Plus(
      Literal(1.0),
      Identity[Double, Double]()
    ),
    Weight(2.0)
  )
}

Using Symbolic, the equivalent code for the above mathematical expression can be written as (1.0 + x) * 2.0.toWeight. Here 2.0.toWeight represents a variable whose initial value is 2.0; the value is updated during each iteration of neural network training.
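
As a sketch, the Symbolic form could be written as follows, assuming the imports from the guidelines above; the compiler expands it to the Times/Plus/Literal/Identity/Weight tree shown earlier:

import com.thoughtworks.deeplearning.DifferentiableDouble._
import com.thoughtworks.deeplearning.Poly.MathMethods._
import com.thoughtworks.deeplearning.Symbolic

// Note: toWeight requires an implicit optimizer (e.g. a learning rate)
// to be in scope; its configuration is omitted in this sketch.
def myLayerViaSymbolic(implicit x: Double @Symbolic): Double @Symbolic = {
  (1.0 + x) * 2.0.toWeight
}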

Both Times and Plus are case classes, so myLayer is a nested tree structure composed of these case classes. Times and Plus are placeholders.

Weight is a Layer containing a weight, whose initial value is 2.0.

Identity is a Layer whose output equals its input: it returns the input unchanged. The Identity here is the placeholder for the Input.

Literal is a Layer containing a constant.

Iteration

Each training pass of the network is called an iteration. An iteration consists of two stages, forward and backward, which together form one complete pass of backpropagation (https://en.wikipedia.org/wiki/Backpropagation).

Forward

When forward is invoked on a Layer.Aux[A, B], A is the input type and B is the output type; both A and B are Tapes. The following code is interpreted segment by segment.

For example:

val inputTape: Tape.Aux[Double, Double] = Literal(a) // a is the input Double value
val outputTape = myLayer.forward(inputTape)

When myLayer.forward(inputTape) is invoked, the forward of Times is invoked first. Its pseudo code is as follows:

final case class Times(operand1: Layer, operand2: Layer) extends Layer {
  def forward(input: Tape): Output = {
    val upstream1 = operand1.forward(input)
    val upstream2 = operand2.forward(input)
    new Output(upstream1, upstream2) // the concrete implementation is omitted here; the focus is on the recursion
  }
  final class Output(upstream1: Tape, upstream2: Tape) extends Tape { ... }
}

myLayer.operand1 is the Plus, and myLayer.operand2 is the Weight. Therefore, upstream1 and upstream2 are the results of the forward of operand1 and operand2 respectively.

Similarly, the forward of Plus resembles the forward of Times. When the forward of Plus is invoked, its operand1 is the Literal and its operand2 is the Identity, so the forward of Literal and the forward of Identity are each invoked in turn.

When the forward of Identity is invoked, the same input is returned. The pseudo code for the forward of Identity is as follows:

def forward(inputTape: Tape.Aux[Double, Double]) = inputTape

Therefore, the input is the x in the mathematical expression (1.0 + x) * 2.0.toWeight; in this way, the input is propagated through the neural network.

The return value outputTape of myLayer.forward is a Tape. Thus, a tree of Tapes is generated, with a structure similar to that of myLayer.

Via layer-by-layer propagation, the inputTape passed to myLayer.forward is eventually returned by Identity and combined into the newly generated Tape tree.
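
Conceptually, the resulting Tape tree mirrors the Layer tree. This is a sketch; the names are illustrative:

// Times.Output(
//   Plus.Output(
//     literalTape,  // produced by Literal(1.0).forward
//     inputTape     // returned unchanged by Identity.forward
//   ),
//   weightTape      // produced by Weight(2.0).forward
// )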

The outputTape produced by forward holds the computation result and can then be used, for example:

try {
  val loss = outputTape.value
  outputTape.backward(loss)
  loss
} finally {
  outputTape.close()
}

outputTape.value is the computation result of the mathematical expression (1.0 + x) * 2.0.toWeight.
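
Putting forward and backward together, a hypothetical training loop over myLayer could look like this; train is an illustrative helper, not library API:

def train(inputData: Double): Double = {
  val outputTape = myLayer.forward(Literal(inputData))
  try {
    val loss = outputTape.value // result of the forward pass
    outputTape.backward(loss)   // propagate the delta; weights update here
    loss
  } finally {
    outputTape.close()          // release the Tape's resources
  }
}

// Repeated iterations drive the loss toward a minimum.
for (iteration <- 0 until 100) {
  println(s"loss at iteration $iteration: ${train(1.5)}")
}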

Backward

outputTape.backward here is the backward of Times.Output, whose pseudo code is as follows:

case class Times(operand1: Layer, operand2: Layer) extends Layer {
  def forward = ...
  class Output(upstream1: Tape, upstream2: Tape) extends Tape {
    // For multiplication, the delta for each operand is the output delta
    // scaled by the other operand's value: d(x * y)/dx = y and d(x * y)/dy = x.
    private def upstreamDelta1(outputDelta: Double) = ???
    private def upstreamDelta2(outputDelta: Double) = ???
    override protected def backward(outputDelta: Double): Unit = {
      upstream1.backward(upstreamDelta1(outputDelta))
      upstream2.backward(upstreamDelta2(outputDelta))
    }
  }
}

outputTape.upstream1 and outputTape.upstream2 are the results of the forward of operand1 and operand2 respectively; backward is then invoked on outputTape.upstream1 and outputTape.upstream2.

Similarly, the backward of Plus resembles the backward of Times. When the backward of Plus is invoked, its upstream1 and upstream2 are the Tapes produced by the forward of Literal and Identity respectively, and backward is invoked on each of them in turn.

Weight is updated during backward; refer to updateDouble.
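
As a minimal stand-alone sketch (not the library's actual implementation, which delegates the update to an Optimizer via updateDouble), a weight applying plain gradient descent on backward could look like this:

// Hypothetical sketch of a gradient-descent weight update.
final class SketchWeight(var value: Double, learningRate: Double) {
  def backward(outputDelta: Double): Unit = {
    // Move the weight against the gradient, scaled by the learning rate.
    value -= learningRate * outputDelta
  }
}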

Aux & Symbolic API

Layer.Aux[A, B] denotes a Layer whose Input type is A and whose Output type is B. Tape.Aux[C, D] denotes a Tape whose Data type is C and whose Delta type is D.

Layer.Aux and Tape.Aux can be combined. For example, Layer.Aux[Tape.Aux[A, B], Tape.Aux[C, D]] represents a layer whose input is a Tape with data type A and delta type B, and whose output is a Tape with data type C and delta type D.

Aux is a design pattern that implements type refinement and can be used to constrain the range of type parameters.

Generally, we do not handwrite Aux types, because Symbolic achieves the same effect. For example, for a symbolic method's internal variables and return value, Layer.Aux[Tape.Aux[INDArray, INDArray], Tape.Aux[INDArray, INDArray]] and INDArray @Symbolic are equivalent, so we usually write Symbolic instead of the Aux form.
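
For instance, inside a symbolic method the following two return types are interchangeable. This is a sketch, assuming the standard imports; viaAux and viaSymbolic are illustrative names:

// Handwritten Aux form:
def viaAux(implicit x: INDArray @Symbolic)
  : Layer.Aux[Tape.Aux[INDArray, INDArray], Tape.Aux[INDArray, INDArray]] = x

// Equivalent Symbolic shorthand:
def viaSymbolic(implicit x: INDArray @Symbolic): INDArray @Symbolic = x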

See also

Symbolic

Backpropagation

type refinement

aux pattern evolution

aux pattern

Linear Supertypes
AnyRef, Any

Type Members

  1. abstract type Input <: Tape
  2. abstract type Output <: Tape

Abstract Value Members

  1. abstract def forward(input: Input): Output

Concrete Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. def +(other: String): String
    Implicit
    This member is added by an implicit conversion from Layer to any2stringadd[Layer] performed by method any2stringadd in scala.Predef.
    Definition Classes
    any2stringadd
  4. def ->[B](y: B): (Layer, B)
    Implicit
    This member is added by an implicit conversion from Layer to ArrowAssoc[Layer] performed by method ArrowAssoc in scala.Predef.
    Definition Classes
    ArrowAssoc
    Annotations
    @inline()
  5. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  6. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  7. def clone(): AnyRef
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. def ensuring(cond: (Layer) ⇒ Boolean, msg: ⇒ Any): Layer
    Implicit
    This member is added by an implicit conversion from Layer to Ensuring[Layer] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  9. def ensuring(cond: (Layer) ⇒ Boolean): Layer
    Implicit
    This member is added by an implicit conversion from Layer to Ensuring[Layer] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  10. def ensuring(cond: Boolean, msg: ⇒ Any): Layer
    Implicit
    This member is added by an implicit conversion from Layer to Ensuring[Layer] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  11. def ensuring(cond: Boolean): Layer
    Implicit
    This member is added by an implicit conversion from Layer to Ensuring[Layer] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  12. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  13. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  14. def finalize(): Unit
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  15. def formatted(fmtstr: String): String
    Implicit
    This member is added by an implicit conversion from Layer to StringFormat[Layer] performed by method StringFormat in scala.Predef.
    Definition Classes
    StringFormat
    Annotations
    @inline()
  16. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
  17. def hashCode(): Int
    Definition Classes
    AnyRef → Any
  18. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  19. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  20. final def notify(): Unit
    Definition Classes
    AnyRef
  21. final def notifyAll(): Unit
    Definition Classes
    AnyRef
  22. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  23. def toString(): String
    Definition Classes
    AnyRef → Any
  24. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  25. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  26. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  27. def →[B](y: B): (Layer, B)
    Implicit
    This member is added by an implicit conversion from Layer to ArrowAssoc[Layer] performed by method ArrowAssoc in scala.Predef.
    Definition Classes
    ArrowAssoc

Inherited from AnyRef

Inherited from Any

Inherited by implicit conversion any2stringadd from Layer to any2stringadd[Layer]

Inherited by implicit conversion StringFormat from Layer to StringFormat[Layer]

Inherited by implicit conversion Ensuring from Layer to Ensuring[Layer]

Inherited by implicit conversion ArrowAssoc from Layer to ArrowAssoc[Layer]
