case class LayerNorm(scale: Constant, bias: Constant, eps: Double, normalizedDim: List[Int]) extends Module with Product with Serializable

Linear Supertypes
Serializable, Serializable, Product, Equals, GenericModule[Variable, Variable], AnyRef, Any

Instance Constructors

  1. new LayerNorm(scale: Constant, bias: Constant, eps: Double, normalizedDim: List[Int])
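
The constructor pairs per-element `scale` and `bias` parameters with a numerical-stability term `eps` and the dimensions to normalize over. As a hedged illustration of the arithmetic the module applies — using plain `Array[Double]` over a single dimension instead of lamp's `STen`/`Variable` tensors, with `LayerNormSketch` being a hypothetical name, not part of the library:

```scala
// Minimal sketch of the layer-normalization computation, assuming the
// standard formulation: subtract the mean, divide by sqrt(variance + eps),
// then apply an elementwise affine transform with `scale` and `bias`.
object LayerNormSketch {
  def layerNorm(
      x: Array[Double],
      scale: Array[Double],
      bias: Array[Double],
      eps: Double
  ): Array[Double] = {
    val n = x.length
    val mean = x.sum / n
    // biased (1/n) variance, as in the usual layer-norm definition
    val variance = x.map(v => (v - mean) * (v - mean)).sum / n
    val invStd = 1.0 / math.sqrt(variance + eps)
    x.zipWithIndex.map { case (v, i) =>
      (v - mean) * invStd * scale(i) + bias(i)
    }
  }
}
```

With `scale` all ones and `bias` all zeros, the output has mean ≈ 0 and variance ≈ 1 along the normalized dimension; the real module performs the same computation per slice of `normalizedDim`, with gradient tracking.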

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. def apply[S](a: Variable)(implicit arg0: Sc[S]): Variable

Alias of forward

    Definition Classes
    GenericModule
  5. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  6. val bias: Constant
  7. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  8. val eps: Double
  9. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  10. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. def forward[S](x: Variable)(implicit arg0: Sc[S]): Variable

    The implementation of the function.

    In addition to x, it can also use all the state to compute its value.

    Definition Classes
    LayerNorm → GenericModule
  12. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  13. final def gradients(loss: Variable, zeroGrad: Boolean = true): Seq[Option[STen]]

    Computes the gradient of loss with respect to the parameters.

    Definition Classes
    GenericModule
  14. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  15. final def learnableParameters: Long

    Returns the total number of optimizable parameters.

    Definition Classes
    GenericModule
  16. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  17. val normalizedDim: List[Int]
  18. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  19. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  20. final def parameters: Seq[(Constant, PTag)]

    Returns the state variables which need gradient computation.

    Definition Classes
    GenericModule
  21. val scale: Constant
  22. val state: List[(Constant, LeafTag with Product with Serializable)]

    List of optimizable, or non-optimizable but stateful, parameters.

    Stateful means that the state is carried over repeated forward calls.

    Definition Classes
    LayerNorm → GenericModule
  23. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  24. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  25. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  26. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  27. final def zeroGrad(): Unit
    Definition Classes
    GenericModule
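
The members `forward`, `gradients`, and `zeroGrad` above form the usual training step: run the module, differentiate a scalar loss with respect to the parameters, and reset accumulated gradients. As a hedged, self-contained sketch of the derivatives `gradients` would produce for this module's two parameters — plain Scala with no lamp dependency, `GradSketch` hypothetical, assuming a squared-error loss over one normalized dimension:

```scala
// For y_i = xhat_i * scale_i + bias_i, where xhat is the normalized input,
// the chain rule gives dL/dscale_i = dL/dy_i * xhat_i and dL/dbias_i = dL/dy_i.
// lamp obtains the same quantities via reverse-mode autodiff on Variables.
object GradSketch {
  def normalized(x: Array[Double], eps: Double): Array[Double] = {
    val n = x.length
    val mean = x.sum / n
    val variance = x.map(v => (v - mean) * (v - mean)).sum / n
    val invStd = 1.0 / math.sqrt(variance + eps)
    x.map(v => (v - mean) * invStd)
  }

  // Loss L = sum((y - target)^2); returns (dL/dscale, dL/dbias).
  def gradients(
      x: Array[Double],
      scale: Array[Double],
      bias: Array[Double],
      target: Array[Double],
      eps: Double
  ): (Array[Double], Array[Double]) = {
    val xhat = normalized(x, eps)
    val y = xhat.indices.map(i => xhat(i) * scale(i) + bias(i))
    val dLdy = y.indices.map(i => 2.0 * (y(i) - target(i)))
    val dScale = xhat.indices.map(i => dLdy(i) * xhat(i)).toArray
    val dBias = dLdy.toArray
    (dScale, dBias)
  }
}
```

An optimizer would subtract a multiple of these gradients from `scale` and `bias`, then call the equivalent of `zeroGrad()` before the next forward pass.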

Inherited from Serializable

Inherited from Serializable

Inherited from Product

Inherited from Equals

Inherited from GenericModule[Variable, Variable]

Inherited from AnyRef

Inherited from Any
