com.thoughtworks.deeplearning

object DifferentiableFloat

Author:
杨博 (Yang Bo) <[email protected]>

Linear Supertypes
AnyRef, Any

Type Members

  1. final class FloatLayerOps[Input <: Batch] extends AnyRef
  2. implicit final class NativeFloatOps extends AnyRef
  3. trait OptimizerFactory extends AnyRef

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. implicit def Float*Float[Input <: Batch]: Aux[Aux[Input, Batch], Aux[Input, Batch], Aux[Input, Batch]]

    Returns a Poly.MathMethods.*.Case that accepts two Float Layers for the polymorphic function Poly.MathMethods.*

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathMethods.*(inputFloatLayer, anotherFloatLayer)
      }
  5. implicit def Float+Float[Input <: Batch]: Aux[Aux[Input, Batch], Aux[Input, Batch], Aux[Input, Batch]]

    Returns a Poly.MathMethods.+.Case that accepts two Float Layers for the polymorphic function Poly.MathMethods.+

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathMethods.+(inputFloatLayer, anotherFloatLayer)
      }
  6. implicit def Float-Float[Input <: Batch]: Aux[Aux[Input, Batch], Aux[Input, Batch], Aux[Input, Batch]]

    Returns a Poly.MathMethods.-.Case that accepts two Float Layers for the polymorphic function Poly.MathMethods.-

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathMethods.-(inputFloatLayer, anotherFloatLayer)
      }
  7. implicit def Float/Float[Input <: Batch]: Aux[Aux[Input, Batch], Aux[Input, Batch], Aux[Input, Batch]]

    Returns a Poly.MathMethods./.Case that accepts two Float Layers for the polymorphic function Poly.MathMethods./

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathMethods./(inputFloatLayer, anotherFloatLayer)
      }
  8. object Layers

  9. object OptimizerFactory

  10. object Optimizers

    Optimizers of Float
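
    A minimal, hypothetical sketch of supplying an optimizer for Float weights implicitly. The Optimizer and LearningRate members and the currentLearningRate() method are assumptions not listed on this page, modelled on the sibling Differentiable* modules; check the members of Optimizers for the actual API.

      import com.thoughtworks.deeplearning.DifferentiableFloat.Optimizers

      // Assumed API: an implicit optimizer picked up when Float weights are updated
      implicit def myOptimizer: Optimizers.Optimizer = new Optimizers.LearningRate {
        def currentLearningRate(): Float = 0.001f
      }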

  11. implicit def abs(Float)[Input <: Batch]: Aux[Aux[Input, Batch], Aux[Input, Batch]]

    Returns a Poly.MathFunctions.abs.Case that accepts a Float Layer for the polymorphic function Poly.MathFunctions.abs

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.abs(inputFloatLayer)
      }
  12. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  13. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  14. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  15. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  16. implicit def exp(Float)[Input <: Batch]: Aux[Aux[Input, Batch], Aux[Input, Batch]]

    Returns a Poly.MathFunctions.exp.Case that accepts a Float Layer for the polymorphic function Poly.MathFunctions.exp

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.exp(inputFloatLayer)
      }
  17. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  18. implicit def floatToLiteral: Aux[Float, Float, Float]

  19. implicit def floatTrainable: Trainable[Float, Float]

  20. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  21. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  22. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  23. implicit def log(Float)[Input <: Batch]: Aux[Aux[Input, Batch], Aux[Input, Batch]]

    Returns a Poly.MathFunctions.log.Case that accepts a Float Layer for the polymorphic function Poly.MathFunctions.log

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.log(inputFloatLayer)
      }
  24. implicit def max(Float,Float)[Input <: Batch]: Aux[Aux[Input, Batch], Aux[Input, Batch], Aux[Input, Batch]]

    Returns a Poly.MathFunctions.max.Case that accepts two Float Layers for the polymorphic function Poly.MathFunctions.max

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.max(inputFloatLayer, anotherFloatLayer)
      }
  25. implicit def min(Float,Float)[Input <: Batch]: Aux[Aux[Input, Batch], Aux[Input, Batch], Aux[Input, Batch]]

    Returns a Poly.MathFunctions.min.Case that accepts two Float Layers for the polymorphic function Poly.MathFunctions.min

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.min(inputFloatLayer, anotherFloatLayer)
      }
  26. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  27. final def notify(): Unit

    Definition Classes
    AnyRef
  28. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  29. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  30. implicit def toFloatLayerOps[From, Input <: Batch](from: From)(implicit toLayer: OfPlaceholder[From, Input, FloatPlaceholder]): FloatLayerOps[Input]

    A helper that contains common boilerplate code for all Float layers.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
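
    A hypothetical follow-up sketch (not from the original documentation): the Cases listed on this page compose, so the Float layer produced by one math function can be fed to another. The name myNetwork is illustrative, mirroring the other examples on this page.

      import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic

      // Combine documented Cases: exp and abs each accept a Float layer,
      // and * accepts the two resulting Float layers.
      def myNetwork(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathMethods.*(
          Poly.MathFunctions.exp(inputFloatLayer),
          Poly.MathFunctions.abs(inputFloatLayer)
        )
      }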
  31. def toString(): String

    Definition Classes
    AnyRef → Any
  32. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  33. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  34. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
