Packages

  • package root
    Definition Classes
    root
  • package com
    Definition Classes
    root
  • package thoughtworks
    Definition Classes
    com
  • package deeplearning

This is the documentation for DeepLearning.scala.

    Overview

BufferedLayer, DifferentiableAny, DifferentiableNothing, Layer, Poly and Symbolic are base packages that contain the necessary operations; all other packages depend on these base packages.

If you want to implement a layer, you need to know how to use the base packages. A minimal sketch of the contract they define is shown below.
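The following sketch is based on the Layer and Tape members documented on this page: a constant layer whose forward pass returns a tape with a fixed value, and whose backward pass is a no-op because a literal has no weights to update. Real layers also manage reference counting and resources, so treat this as an outline of the contract rather than production code.

import com.thoughtworks.deeplearning.Layer
import com.thoughtworks.deeplearning.Layer.Tape

// A constant layer: it is its own Tape.
final case class DoubleLiteral(value: Double) extends Layer with Tape {
  override type Data = Double
  override type Delta = Double
  override type Input = Tape
  override type Output = Tape.Aux[Double, Double]

  // The forward pass ignores its input and returns this tape.
  override def forward(input: Input): Output = this

  // A literal is not trainable, so backward never reaches forceBackward.
  override def isTrainable: Boolean = false
  override protected def forceBackward(delta: Double): Unit = ()

  // No resources or reference counting in this simplified sketch.
  override def duplicate(): Tape.Aux[Double, Double] = this
  override def close(): Unit = ()
}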

Import guidelines

If you want to use operations on a type T, you should import:

    import com.thoughtworks.deeplearning.DifferentiableT._

For example, if you want to use operations on INDArray, you should import:

    import com.thoughtworks.deeplearning.DifferentiableINDArray._

    If you write something like this:

    def softmax(implicit scores: INDArray @Symbolic): INDArray @Symbolic = {
      val expScores = exp(scores)
      expScores / expScores.sum(1)
    }

and the compiler shows the error:

    Could not infer implicit value for com.thoughtworks.deeplearning.Symbolic[org.nd4j.linalg.api.ndarray.INDArray]

you need to add this import:

    import com.thoughtworks.deeplearning.DifferentiableINDArray._

    If you write something like this:

def crossEntropyLossFunction(
    implicit pair: (INDArray :: INDArray :: HNil) @Symbolic): Double @Symbolic = {
  val score = pair.head
  val label = pair.tail.head
  -(label * log(score * 0.9 + 0.1) + (1.0 - label) * log(1.0 - score * 0.9)).mean
}

and the compiler shows the error:

value * is not a member of com.thoughtworks.deeplearning.Layer.Aux[com.thoughtworks.deeplearning.Layer.Tape.Aux[org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray],com.thoughtworks.deeplearning.DifferentiableINDArray.INDArrayPlaceholder.Tape]
val bias = Nd4j.ones(numberOfOutputKernels).toWeight * 0.1
...

you need to add these imports:

import com.thoughtworks.deeplearning.Poly.MathMethods._
    import com.thoughtworks.deeplearning.DifferentiableINDArray._

If the compiler shows the error:

not found: value log
-(label * log(score * 0.9 + 0.1) + (1.0 - label) * log(1.0 - score * 0.9)).mean
...

you need to add these imports:

import com.thoughtworks.deeplearning.Poly.MathFunctions._
    import com.thoughtworks.deeplearning.DifferentiableINDArray._

The operators +, -, * and / are defined in MathMethods, and log, exp, abs, max and min are defined in MathFunctions. These methods are implemented for each differentiable type, so you also need to import the implicits of the corresponding DifferentiableType. Putting these guidelines together, the loss function above compiles with the imports shown below.
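In this sketch, the shapeless, DifferentiableHList and DifferentiableDouble imports are assumptions based on the HList syntax used above and the DifferentiableT naming convention described earlier (the input pair is an HList and the result is a Double):

import com.thoughtworks.deeplearning.Symbolic
import com.thoughtworks.deeplearning.Poly.MathMethods._
import com.thoughtworks.deeplearning.Poly.MathFunctions._
import com.thoughtworks.deeplearning.DifferentiableINDArray._
import com.thoughtworks.deeplearning.DifferentiableHList._
import com.thoughtworks.deeplearning.DifferentiableDouble._
import org.nd4j.linalg.api.ndarray.INDArray
import shapeless._

def crossEntropyLossFunction(
    implicit pair: (INDArray :: INDArray :: HNil) @Symbolic): Double @Symbolic = {
  val score = pair.head
  val label = pair.tail.head
  -(label * log(score * 0.9 + 0.1) + (1.0 - label) * log(1.0 - score * 0.9)).mean
}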

    Composability

Neural networks created by DeepLearning.scala are composable: you can create large networks by combining smaller ones. If two larger networks share a sub-network, the weights of the shared sub-network trained with one network also affect the other network. A sketch of such composition is shown below.
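The following is only a sketch, assuming the imports from the previous section; halfway is a hypothetical sub-network, and the precise composition API may differ (see also Compose below). Both networks call the same symbolic method, so any weights created inside it with toWeight would be shared between them:

import com.thoughtworks.deeplearning.Symbolic
import com.thoughtworks.deeplearning.Poly.MathMethods._
import com.thoughtworks.deeplearning.Poly.MathFunctions._
import com.thoughtworks.deeplearning.DifferentiableINDArray._
import org.nd4j.linalg.api.ndarray.INDArray

// A hypothetical shared sub-network.
def halfway(implicit input: INDArray @Symbolic): INDArray @Symbolic =
  exp(input) / (exp(input) + 1.0)

// Two larger networks that reuse `halfway`; the implicit parameter is
// passed explicitly to compose the symbolic methods.
def network1(implicit input: INDArray @Symbolic): INDArray @Symbolic =
  halfway(input) * 2.0

def network2(implicit input: INDArray @Symbolic): INDArray @Symbolic =
  1.0 - halfway(input)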

    Definition Classes
    thoughtworks
    See also

    Compose

  • trait CumulativeLayer extends Layer

A Layer that minimizes the computation during both forward pass and backward pass.

During the forward pass, the result is cached.

During the backward pass, deltas are accumulated in the Tape until flush.

    Author:

    杨博 (Yang Bo) <[email protected]>

    Definition Classes
    deeplearning
    See also

    Layer.Output

  • CumulativeTape
  • Input
  • MonoidTape
  • ReferenceCount
  • SemigroupTape

type Output = CumulativeTape.Self

A cumulative Tape returned by forward.

When backward is called on this Output, the delta parameter is not back-propagated to its upstreams immediately; instead, it is accumulated internally. When this Output is flushed, the accumulated delta is processed and back-propagated to its upstreams.

This Output is reference counted. When the last of this Output's duplicates is closed, flush is called and all the upstreams are closed as well. A simplified illustration of this behavior is shown below.
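The following is a simplified illustration of that contract, not the library source: deltas accumulate instead of propagating immediately, and the accumulator is flushed to the upstream when the last duplicate is closed. The Delta type is fixed to Double to keep the sketch short.

// `upstream` stands in for back-propagation to the upstream tapes.
final class AccumulatingTape(upstream: Double => Unit) extends AutoCloseable {
  private var referenceCount = 1
  private var accumulator = 0.0

  // Accumulate the delta instead of propagating it immediately.
  def backward(delta: Double): Unit = synchronized {
    accumulator += delta
  }

  // Each duplicate shares the accumulator and bumps the reference count.
  def duplicate(): this.type = synchronized {
    referenceCount += 1
    this
  }

  // Process the accumulated delta and back-propagate it upstream.
  private def flush(): Unit = upstream(accumulator)

  // Only the close of the last duplicate triggers the flush.
  override def close(): Unit = synchronized {
    referenceCount -= 1
    if (referenceCount == 0) flush()
  }
}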

Definition Classes
CumulativeLayer → Layer
Linear Supertypes
Tape, AutoCloseable, AnyRef, Any

Type Members

  1. abstract type Data <: ReferenceCount._1.type.Data

Type of the result of forward pass.

    Definition Classes
    <refinement> → Tape
    See also

    value

  2. abstract type Delta >: ReferenceCount._1.type.Delta

Type of the information passing in backward pass, usually the partial derivative of Data.

    Definition Classes
    <refinement> → Tape
    See also

    backward

Abstract Value Members

  1. abstract def close(): Unit
    Definition Classes
    AutoCloseable
    Annotations
    @throws( classOf[java.lang.Exception] )
  2. abstract def duplicate(): Aux[Layer.Tape.Data, Layer.Tape.Delta]

Returns a new Tape that shares the same value and backward behavior with this Tape.

    Definition Classes
    Tape
    Note

    The newly created Tape and this Tape must be closed independently.

  3. abstract def forceBackward(delta: Layer.Tape.Delta): Unit
    Attributes
    protected
    Definition Classes
    Tape
  4. abstract def isTrainable: Boolean
    Definition Classes
    Tape
  5. abstract def value: Layer.Tape.Data

Value of the result of forward pass.

    Definition Classes
    Tape
    See also

    Data

Concrete Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. final def backward(delta: ⇒ Layer.Tape.Delta): Unit

Invokes forceBackward if isTrainable is true. See the sketch after this member list.

    Definition Classes
    Tape
    Annotations
    @inline()
    See also

    Delta

  6. def clone(): AnyRef
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  7. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  8. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  9. def finalize(): Unit
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  10. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
  11. def hashCode(): Int
    Definition Classes
    AnyRef → Any
  12. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  13. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  14. final def notify(): Unit
    Definition Classes
    AnyRef
  15. final def notifyAll(): Unit
    Definition Classes
    AnyRef
  16. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  17. def toString(): String
    Definition Classes
    AnyRef → Any
  18. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  19. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  20. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
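As referenced in backward (item 5) above, the following is a plausible sketch of that contract, not the library source; the member names follow those documented on this page:

trait SimplifiedTape {
  type Delta
  def isTrainable: Boolean
  protected def forceBackward(delta: Delta): Unit

  // The by-name `delta` is never evaluated for untrainable tapes,
  // so constructing the delta costs nothing when it is not needed.
  final def backward(delta: => Delta): Unit = {
    if (isTrainable) forceBackward(delta)
  }
}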
