Packages

  • package root
    Definition Classes
    root
  • package com
    Definition Classes
    root
  • package thoughtworks
    Definition Classes
    com
  • package deeplearning

    This is the documentation for DeepLearning.scala.

    Overview

    CumulativeLayer, DifferentiableAny, DifferentiableNothing, Layer, Poly and Symbolic are the base packages that contain the necessary operations; all the other packages depend on these base packages.

    If you want to implement a layer, you need to know how to use the base packages.

    Import guidelines

    If you want to use the operations of a type T, you should import:

    import com.thoughtworks.deeplearning.DifferentiableT._

    For example, if you want to use the operations of INDArray, you should import:

    import com.thoughtworks.deeplearning.DifferentiableINDArray._

    If you write something like this:

    def softmax(implicit scores: INDArray @Symbolic): INDArray @Symbolic = {
      val expScores = exp(scores)
      expScores / expScores.sum(1)
    }

    and the compiler reports the error:

    Could not infer implicit value for com.thoughtworks.deeplearning.Symbolic[org.nd4j.linalg.api.ndarray.INDArray]

    then you need to add this import:

    import com.thoughtworks.deeplearning.DifferentiableINDArray._
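
    For reference, here is a self-contained sketch of the snippet above with its imports in place (assuming DeepLearning.scala 1.x and nd4j on the classpath; the exact import set may vary between versions):

    import org.nd4j.linalg.api.ndarray.INDArray
    import com.thoughtworks.deeplearning.Symbolic
    import com.thoughtworks.deeplearning.DifferentiableINDArray._
    import com.thoughtworks.deeplearning.Poly.MathMethods._
    import com.thoughtworks.deeplearning.Poly.MathFunctions._

    // exp comes from MathFunctions, / from MathMethods, and sum from
    // the DifferentiableINDArray implicits.
    def softmax(implicit scores: INDArray @Symbolic): INDArray @Symbolic = {
      val expScores = exp(scores)
      expScores / expScores.sum(1)
    }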

    If you write something like this:

    def crossEntropyLossFunction(
      implicit pair: (INDArray :: INDArray :: HNil) @Symbolic)
    : Double @Symbolic = {
      val score = pair.head
      val label = pair.tail.head
      -(label * log(score * 0.9 + 0.1) + (1.0 - label) * log(1.0 - score * 0.9)).mean
    }

    and the compiler reports the error:

    value * is not a member of com.thoughtworks.deeplearning.Layer.Aux[com.thoughtworks.deeplearning.Layer.Tape.Aux[org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray],com.thoughtworks.deeplearning.DifferentiableINDArray.INDArrayPlaceholder.Tape]
    val bias = Nd4j.ones(numberOfOutputKernels).toWeight * 0.1 ...

    then you need to add these imports:

    import com.thoughtworks.deeplearning.Poly.MathMethods.*
    import com.thoughtworks.deeplearning.DifferentiableINDArray._

    If the compiler reports the error:

    not found: value log
    -(label * log(score * 0.9 + 0.1) + (1.0 - label) * log(1.0 - score * 0.9)).mean ...

    then you need to add these imports:

    import com.thoughtworks.deeplearning.Poly.MathFunctions._
    import com.thoughtworks.deeplearning.DifferentiableINDArray._

    The methods +, -, * and / are defined in MathMethods, and the functions log, exp, abs, max and min are defined in MathFunctions. These methods and functions are implemented in each DifferentiableType, so you also need to import the implicits of the corresponding DifferentiableType.
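
    Putting these guidelines together, a self-contained sketch of the loss function above with a plausible complete import set (the exact set may vary between versions) might look like this:

    import org.nd4j.linalg.api.ndarray.INDArray
    import shapeless._
    import com.thoughtworks.deeplearning.Symbolic
    import com.thoughtworks.deeplearning.DifferentiableINDArray._
    import com.thoughtworks.deeplearning.DifferentiableHList._
    import com.thoughtworks.deeplearning.DifferentiableDouble._
    import com.thoughtworks.deeplearning.Poly.MathMethods._
    import com.thoughtworks.deeplearning.Poly.MathFunctions._

    // head and tail come from the DifferentiableHList implicits,
    // * + - / from MathMethods, log from MathFunctions, and mean
    // from the DifferentiableINDArray implicits.
    def crossEntropyLossFunction(
      implicit pair: (INDArray :: INDArray :: HNil) @Symbolic)
    : Double @Symbolic = {
      val score = pair.head
      val label = pair.tail.head
      -(label * log(score * 0.9 + 0.1) + (1.0 - label) * log(1.0 - score * 0.9)).mean
    }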

    Composability

    Neural networks created by DeepLearning.scala are composable. You can build larger networks by combining smaller ones. If two larger networks share a sub-network, the weights of the shared sub-network trained through one network also affect the other network, as sketched below.
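
    A conceptual sketch of composition (the network names and bodies here are hypothetical placeholders, not part of the library): because a symbolic method's implicit parameter list can also be filled in explicitly, composing networks is ordinary function application.

    import org.nd4j.linalg.api.ndarray.INDArray
    import com.thoughtworks.deeplearning.Symbolic
    import com.thoughtworks.deeplearning.DifferentiableINDArray._
    import com.thoughtworks.deeplearning.Poly.MathMethods._
    import com.thoughtworks.deeplearning.Poly.MathFunctions._

    // A smaller network, reused below.
    def subNetwork(implicit input: INDArray @Symbolic): INDArray @Symbolic = {
      val expInput = exp(input)
      expInput / expInput.sum(1)
    }

    // Two larger networks built from the same sub-network:
    // the implicit parameter is passed explicitly to compose.
    def largerNetwork1(implicit input: INDArray @Symbolic): INDArray @Symbolic =
      subNetwork(subNetwork(input))

    def largerNetwork2(implicit input: INDArray @Symbolic): INDArray @Symbolic =
      subNetwork(input) * 2.0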

    Definition Classes
    thoughtworks
    See also

    Compose

  • CumulativeLayer
  • DifferentiableAny
  • DifferentiableBoolean
  • DifferentiableCoproduct
  • DifferentiableDouble
  • DifferentiableFloat
  • DifferentiableHList
  • DifferentiableInt
  • DifferentiableNothing
  • DifferentiableSeq
  • Layer
  • Poly
  • Symbolic

object Layer

Linear Supertypes
  Layer, AnyRef, Any

Type Members

  1. type Aux[-Input0 <: Tape, +Output0 <: Tape] = Layer { ... /* 2 definitions in type refinement */ }

    A Layer whose Input and Output types are specified by the type parameters Input0 and Output0.

    See also

    aux pattern evolution

    aux pattern
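
    A sketch of a typical use of Aux (the helper method here is hypothetical, not part of the library): it lets a signature constrain the abstract Input and Output type members of a Layer without writing a structural refinement by hand. The Tape.Aux type, whose two parameters are the data and delta types, appears in compiler messages such as the one quoted in the package overview.

    import org.nd4j.linalg.api.ndarray.INDArray
    import com.thoughtworks.deeplearning.Layer
    import com.thoughtworks.deeplearning.Layer.Tape

    // Accepts only layers whose input is an INDArray tape and whose
    // output is a Double tape (data and delta types via Tape.Aux).
    def describe(
      network: Layer.Aux[Tape.Aux[INDArray, INDArray], Tape.Aux[Double, Double]]
    ): String = "a network from INDArray to Double"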

  2. trait Tape extends AutoCloseable

    A Tape is an intermediate data structure generated by the forward pass of a neural network. It contains the result of the forward pass and can perform the backward pass for back-propagation.

    Note

    The close method must be called once this Tape is no longer used.

    See also

    tape auto differentiation
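
    A sketch of the intended life cycle (the helper method is hypothetical, and the member names other than close are assumptions based on the description above; exact signatures may vary between versions):

    import org.nd4j.linalg.api.ndarray.INDArray
    import com.thoughtworks.deeplearning.Layer
    import com.thoughtworks.deeplearning.Layer.Tape

    def runOnce(
      network: Layer.Aux[Tape.Aux[INDArray, INDArray], Tape.Aux[Double, Double]],
      input: Tape.Aux[INDArray, INDArray]
    ): Unit = {
      val tape = network.forward(input)  // forward pass produces a Tape
      try {
        val loss = tape.value            // result of the forward pass
        tape.backward(1.0)               // back-propagate from the output delta
      } finally {
        tape.close()                     // must be called once the Tape is no longer used
      }
    }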

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  7. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  8. def finalize(): Unit
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  9. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
  10. def hashCode(): Int
    Definition Classes
    AnyRef → Any
  11. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  12. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  13. final def notify(): Unit
    Definition Classes
    AnyRef
  14. final def notifyAll(): Unit
    Definition Classes
    AnyRef
  15. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  16. def toString(): String
    Definition Classes
    AnyRef → Any
  17. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  18. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  19. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  20. object Tape
