Packages

  • package root
    Definition Classes
    root
  • package com
    Definition Classes
    root
  • package thoughtworks
    Definition Classes
    com
  • package deeplearning

    This is the documentation for DeepLearning.scala.

    Overview

    BufferedLayer, DifferentiableAny, DifferentiableNothing, Layer, Poly and Symbolic are base packages that contain the necessary operations; all other packages depend on these base packages.

    If you want to implement a layer, you need to know how to use the base packages.

    Import guidelines

    If you want to use operations for a type T, you should import:

    import com.thoughtworks.deeplearning.DifferentiableT._

    This means that if you want to use operations for INDArray, you should import:

    import com.thoughtworks.deeplearning.DifferentiableINDArray._

    If you write something like this:

    def softmax(implicit scores: INDArray @Symbolic): INDArray @Symbolic = {
      val expScores = exp(scores)
      expScores / expScores.sum(1)
    }

    If the compiler shows the error:

    Could not infer implicit value for com.thoughtworks.deeplearning.Symbolic[org.nd4j.linalg.api.ndarray.INDArray]

    you need to add this import:

    import com.thoughtworks.deeplearning.DifferentiableINDArray._
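
    Putting the pieces together, a self-contained version of the softmax example might look like the following sketch (the Poly imports cover exp and /, as explained further below; the Symbolic and nd4j imports are assumed):

    import org.nd4j.linalg.api.ndarray.INDArray
    import com.thoughtworks.deeplearning.Symbolic
    import com.thoughtworks.deeplearning.DifferentiableINDArray._
    import com.thoughtworks.deeplearning.Poly.MathMethods._
    import com.thoughtworks.deeplearning.Poly.MathFunctions._

    // Softmax over the rows of a symbolic INDArray: exponentiate the scores,
    // then divide each row by its sum.
    def softmax(implicit scores: INDArray @Symbolic): INDArray @Symbolic = {
      val expScores = exp(scores)
      expScores / expScores.sum(1)
    }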

    If you write something like this:

    def crossEntropyLossFunction(
        implicit pair: (INDArray :: INDArray :: HNil) @Symbolic): Double @Symbolic = {
      val score = pair.head
      val label = pair.tail.head
      -(label * log(score * 0.9 + 0.1) + (1.0 - label) * log(1.0 - score * 0.9)).mean
    }

    If the compiler shows the error:

    value * is not a member of com.thoughtworks.deeplearning.Layer.Aux[com.thoughtworks.deeplearning.Layer.Tape.Aux[org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray],com.thoughtworks.deeplearning.DifferentiableINDArray.INDArrayPlaceholder.Tape]
      val bias = Nd4j.ones(numberOfOutputKernels).toWeight * 0.1...

    you need to add these imports:

    import com.thoughtworks.deeplearning.Poly.MathMethods.*
    import com.thoughtworks.deeplearning.DifferentiableINDArray._

    If the compiler shows the error:

    not found: value log
      -(label * log(score * 0.9 + 0.1) + (1.0 - label) * log(1.0 - score * 0.9)).mean...

    you need to add these imports:

    import com.thoughtworks.deeplearning.Poly.MathFunctions.log
    import com.thoughtworks.deeplearning.DifferentiableINDArray._

    The operators +, -, *, / and the functions log, exp, abs, max and min are defined in Poly.MathMethods and Poly.MathFunctions respectively. These methods are implemented for each differentiable type, so you also need to import the implicits of the corresponding DifferentiableType.
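
    For reference, a fully imported version of the cross-entropy example might look like the sketch below. It assumes the :: and HNil types come from shapeless and that the DifferentiableHList and DifferentiableDouble imports provide the head/tail accessors and the Double result type:

    import org.nd4j.linalg.api.ndarray.INDArray
    import shapeless.{::, HNil}
    import com.thoughtworks.deeplearning.Symbolic
    import com.thoughtworks.deeplearning.DifferentiableINDArray._
    import com.thoughtworks.deeplearning.DifferentiableHList._
    import com.thoughtworks.deeplearning.DifferentiableDouble._
    import com.thoughtworks.deeplearning.Poly.MathMethods._
    import com.thoughtworks.deeplearning.Poly.MathFunctions._

    // Cross-entropy between predicted scores and one-hot labels; the scores
    // are rescaled into [0.1, 1.0] so that log never sees zero.
    def crossEntropyLossFunction(
        implicit pair: (INDArray :: INDArray :: HNil) @Symbolic): Double @Symbolic = {
      val score = pair.head
      val label = pair.tail.head
      -(label * log(score * 0.9 + 0.1) + (1.0 - label) * log(1.0 - score * 0.9)).mean
    }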

    Composability

    Neural networks created by DeepLearning.scala are composable. You can create large networks by combining smaller networks. If two larger networks share some sub-networks, the weights of the shared sub-networks trained with one network also affect the other network.

    Definition Classes
    thoughtworks
    See also

    Compose

  • object DifferentiableFloat

    A namespace of common operators for Float layers.

    Author:

    杨博 (Yang Bo) <[email protected]>

    Definition Classes
    deeplearning
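
    A hypothetical usage sketch (not taken from the library's own documentation): the arithmetic operators and abs come from the Poly objects, and the DifferentiableFloat import supplies their Float instances.

    import com.thoughtworks.deeplearning.Symbolic
    import com.thoughtworks.deeplearning.DifferentiableFloat._
    import com.thoughtworks.deeplearning.Poly.MathMethods._
    import com.thoughtworks.deeplearning.Poly.MathFunctions._

    // A toy symbolic function over Float values.
    def squareMinusAbs(implicit x: Float @Symbolic): Float @Symbolic = {
      x * x - abs(x)
    }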
  • object Optimizers

    Optimizers of Float.

    Definition Classes
    DifferentiableFloat
    Example:
    1. implicit val optimizerFactory = new DifferentiableFloat.OptimizerFactory {
         override def floatOptimizer(weight: Weight): Optimizer = {
           new LearningRate with L2Regularization {

             var learningRate: Float = 0.00003f

             override protected def l2Regularization: Float = 0.003f

             override protected def currentLearningRate(): Float = {
               // decay the learning rate each time it is queried, then return it
               learningRate *= 0.75f
               learningRate
             }
           }
         }
       }
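    2. // A hypothetical variant that combines LearningRate with L1Regularization.
       // It assumes L1Regularization exposes an abstract `l1Regularization: Float`,
       // analogous to `l2Regularization` above.
       implicit val optimizerFactory = new DifferentiableFloat.OptimizerFactory {
         override def floatOptimizer(weight: Weight): Optimizer = {
           new LearningRate with L1Regularization {

             // a constant learning rate, no decay
             override protected def currentLearningRate(): Float = 0.0001f

             // strength of the (assumed) L1 penalty
             override protected def l1Regularization: Float = 0.01f
           }
         }
       }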
  • L1Regularization
  • L2Regularization
  • LearningRate
  • Optimizer

trait L2Regularization extends Optimizer

Linear Supertypes
  Optimizer, AnyRef, Any

Abstract Value Members

  1. abstract def l2Regularization: Float
    Attributes
    protected

Concrete Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. def +(other: String): String
    Implicit
    This member is added by an implicit conversion from L2Regularization to any2stringadd[L2Regularization] performed by method any2stringadd in scala.Predef.
    Definition Classes
    any2stringadd
  4. def ->[B](y: B): (L2Regularization, B)
    Implicit
    This member is added by an implicit conversion from L2Regularization to ArrowAssoc[L2Regularization] performed by method ArrowAssoc in scala.Predef.
    Definition Classes
    ArrowAssoc
    Annotations
    @inline()
  5. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  6. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  7. def clone(): AnyRef
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. def currentDelta(oldValue: Float, delta: Float): Float
    Definition Classes
    L2Regularization → Optimizer
    (For intuition about how the L2 term enters the delta, see the sketch after this member list.)
  9. def ensuring(cond: (L2Regularization) ⇒ Boolean, msg: ⇒ Any): L2Regularization
    Implicit
    This member is added by an implicit conversion from L2Regularization to Ensuring[L2Regularization] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  10. def ensuring(cond: (L2Regularization) ⇒ Boolean): L2Regularization
    Implicit
    This member is added by an implicit conversion from L2Regularization to Ensuring[L2Regularization] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  11. def ensuring(cond: Boolean, msg: ⇒ Any): L2Regularization
    Implicit
    This member is added by an implicit conversion from L2Regularization to Ensuring[L2Regularization] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  12. def ensuring(cond: Boolean): L2Regularization
    Implicit
    This member is added by an implicit conversion from L2Regularization to Ensuring[L2Regularization] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  13. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  14. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  15. def finalize(): Unit
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  16. def formatted(fmtstr: String): String
    Implicit
    This member is added by an implicit conversion from L2Regularization to StringFormat[L2Regularization] performed by method StringFormat in scala.Predef.
    Definition Classes
    StringFormat
    Annotations
    @inline()
  17. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
  18. def hashCode(): Int
    Definition Classes
    AnyRef → Any
  19. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  20. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  21. final def notify(): Unit
    Definition Classes
    AnyRef
  22. final def notifyAll(): Unit
    Definition Classes
    AnyRef
  23. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  24. def toString(): String
    Definition Classes
    AnyRef → Any
  25. def unary_-: Aux[Input, Tape]

    Opposite number

    Implicit
    This member is added by an implicit conversion from L2Regularization to FloatLayerOps[Input] performed by method toFloatLayerOps in com.thoughtworks.deeplearning.DifferentiableFloat. This conversion will take place only if an implicit value of type OfPlaceholder[L2Regularization, Input, FloatPlaceholder] is in scope.
    Definition Classes
    FloatLayerOps
  26. final def updateFloat(oldValue: Float, delta: Float): Float
    Definition Classes
    Optimizer
  27. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  28. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  29. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  30. def →[B](y: B): (L2Regularization, B)
    Implicit
    This member is added by an implicit conversion from L2Regularization to ArrowAssoc[L2Regularization] performed by method ArrowAssoc in scala.Predef.
    Definition Classes
    ArrowAssoc
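
To build intuition for currentDelta above, here is a standalone sketch (not this library's source code) of how an L2 penalty is typically folded into an optimizer's update: the incoming gradient is augmented with a weight-decay term proportional to the current value before the learning-rate step is applied. All names and the update rule below are assumptions made for the sketch.

  trait SketchOptimizer {
    protected def currentLearningRate(): Float
    def currentDelta(oldValue: Float, delta: Float): Float =
      delta * currentLearningRate()
    final def updateFloat(oldValue: Float, delta: Float): Float =
      oldValue - currentDelta(oldValue, delta)
  }

  trait SketchL2Regularization extends SketchOptimizer {
    protected def l2Regularization: Float
    // L2 regularization adds l2Regularization * oldValue (the derivative of
    // 0.5 * l2Regularization * oldValue^2) to the incoming gradient.
    override def currentDelta(oldValue: Float, delta: Float): Float =
      super.currentDelta(oldValue, delta + l2Regularization * oldValue)
  }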

Inherited from Optimizer

Inherited from AnyRef

Inherited from Any

Inherited by implicit conversion toFloatLayerOps from L2Regularization to FloatLayerOps[Input]

Inherited by implicit conversion any2stringadd from L2Regularization to any2stringadd[L2Regularization]

Inherited by implicit conversion StringFormat from L2Regularization to StringFormat[L2Regularization]

Inherited by implicit conversion Ensuring from L2Regularization to Ensuring[L2Regularization]

Inherited by implicit conversion ArrowAssoc from L2Regularization to ArrowAssoc[L2Regularization]
