kr.ac.kaist.ir.deep.layer

DropoutOperation

class DropoutOperation extends Layer

Layer that applies dropout to its input.

This layer acts as a "pipeline" with a drop-out probability. Because dropping out neurons occurs in the hidden layers, we need an intermediate pipe that handles this behaviour. This layer conveys an input element to its output synapse only if that output is alive.
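As a rough illustration of the behaviour described above, the hedged sketch below samples an on/off mask at the `presence` probability and zeroes dropped elements. It uses plain `Array[Double]` instead of the library's `ScalarMatrix`, and `DropoutForwardSketch` is a hypothetical name, not part of this API.

```scala
import scala.util.Random

// Hypothetical illustration only: this is NOT the library's DropoutOperation,
// just a sketch of the masking it describes, on plain arrays.
object DropoutForwardSketch {
  // Each element survives with probability `presence`; dropped elements
  // become 0. Returns the output together with the sampled mask
  // (1.0 = alive, 0.0 = dropped), analogous to the `onoff` field.
  def forward(x: Array[Double], presence: Double, rng: Random): (Array[Double], Array[Double]) = {
    val mask = x.map(_ => if (rng.nextDouble() < presence) 1.0 else 0.0)
    val y = x.zip(mask).map { case (v, m) => v * m }
    (y, mask)
  }
}
```

With `presence = 1.0` (the default) the layer is an identity. Note that some dropout variants additionally rescale surviving activations by `1/presence`; this sketch omits any rescaling, since the class description only mentions conveying alive elements.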

Linear Supertypes
Layer, Serializable, Serializable, (ScalarMatrix) ⇒ ScalarMatrix, AnyRef, Any

Instance Constructors

  1. new DropoutOperation(presence: Probability = 1.0f)

    presence

    The probability that the neuron is alive. (Default: 1.0, 100%)

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. val W: IndexedSeq[ScalarMatrix]

    weights for update

    returns

    weights

    Definition Classes
    DropoutOperation → Layer
  5. val act: Null

    Null activation

    Attributes
    protected
    Definition Classes
    DropoutOperation → Layer
  6. def andThen[A](g: (ScalarMatrix) ⇒ A): (ScalarMatrix) ⇒ A

    Definition Classes
    Function1
    Annotations
    @unspecialized()
  7. def apply(x: ScalarMatrix): ScalarMatrix

    Forward computation

    x

    input matrix

    returns

    output matrix

    Definition Classes
    DropoutOperation → Layer → Function1
  8. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  9. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  10. def compose[A](g: (A) ⇒ ScalarMatrix): (A) ⇒ ScalarMatrix

    Definition Classes
    Function1
    Annotations
    @unspecialized()
  11. val dW: IndexedSeq[ScalarMatrix]

    accumulated delta values

    returns

    delta-weight

    Definition Classes
    DropoutOperation → Layer
  12. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  13. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  14. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  15. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  16. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  17. def into_:(x: ScalarMatrix): ScalarMatrix

    Sugar: Forward computation. Calls apply(x)

    x

    input matrix

    returns

    output matrix

    Attributes
    protected[kr.ac.kaist.ir.deep]
    Definition Classes
    DropoutOperation → Layer
  18. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  19. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  20. final def notify(): Unit

    Definition Classes
    AnyRef
  21. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  22. var onoff: ScalarMatrix

    Attributes
    protected
  23. val presence: Probability

    The probability that the neuron is alive. (Default: 1.0, 100%)

    Attributes
    protected
  24. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  25. def toJSON: JsObject

    Translates this layer into a JSON object (using the Play! framework)

    returns

    JSON object describes this layer

    Definition Classes
    DropoutOperation → Layer
  26. def toString(): String

    Definition Classes
    Function1 → AnyRef → Any
  27. def updateBy(error: ScalarMatrix, input: ScalarMatrix, output: ScalarMatrix): ScalarMatrix

    Backward computation.

    error

    to be propagated (dG/dF, propagated from the higher layer)

    input

    of this layer (in this case, x = entry of dX/dw)

    output

    of this layer (in this case, y)

    returns

    propagated error (in this case, dG/dx)

    Attributes
    protected[kr.ac.kaist.ir.deep]
    Definition Classes
    DropoutOperation → Layer
    Note

    Because this layer only mediates between two layers, it simply removes the propagated error for dropped-out elements.

  28. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  29. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  30. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
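To make the Note on `updateBy` concrete, here is a hedged sketch (hypothetical name, plain arrays rather than `ScalarMatrix`) of the backward behaviour: the layer has no weights of its own, so backpropagation only zeroes the incoming error at positions that were dropped in the forward pass.

```scala
// Hypothetical illustration of updateBy's Note, not the library's code:
// the layer mediates two layers, so backpropagation just removes the
// propagated error at dropped-out positions.
object DropoutBackwardSketch {
  // `mask` is the on/off pattern sampled in the forward pass
  // (1.0 = alive, 0.0 = dropped), analogous to the `onoff` field.
  def propagate(error: Array[Double], mask: Array[Double]): Array[Double] =
    error.zip(mask).map { case (e, m) => e * m }
}
```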
