Variable

sealed trait Variable

A value of a tensor-valued function; a vertex in the computational graph.

A Variable may be constant, i.e. it depends on no other Variables. Constant Variables may or may not need their partial derivatives computed.
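A minimal, hedged sketch of how a small graph might be built. The `Scope.root`, `STen`, `STenOptions` and the `const`/`param` constructors are assumptions about the surrounding lamp API and are not documented on this page; only the methods `mm` and `sum` are taken from the listing below.

```scala
import lamp._
import lamp.autograd.{const, param}

Scope.root { implicit scope =>
  // A parameter: its partial derivative is allocated and will be accumulated into
  val w = param(STen.ones(List(2, 2), STenOptions.d))
  // A constant: no parent Op, no gradient computation needed
  val x = const(STen.ones(List(2, 2), STenOptions.d))
  // Each operation creates a new Variable, i.e. a new vertex in the graph
  val y = w.mm(x).sum
}
```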

Companion:
object

Supertypes:
class Object
trait Matchable
class Any

Value members

Abstract methods

def op: Option[Op]

The parent operation of this value in the computational graph. Empty for constants.

def partialDerivative: Option[STen]

The partial derivative, or a placeholder tensor for the partial derivative.

Returns empty iff this Variable needs no gradient computation. Otherwise a placeholder tensor is allocated upfront when the Variable is allocated.

def value: STen

The actual tensor value of this Variable.

Concrete methods

def *[S : Sc](other: Variable): Variable
def *[S : Sc](other: Double): Variable
def +[S : Sc](other: Variable): Variable
def +[S : Sc](other: Double): Variable
def -[S : Sc](other: Variable): Variable
def /[S : Sc](other: Variable): Variable
def argmax[S : Sc](dim: Long, keepDim: Boolean): Variable
def assign[S : Sc](other: Variable): Variable
def atan[S : Sc]: Variable
def backprop(): Unit

Runs the backpropagation algorithm starting from this value.

Only meaningful if this is a scalar, i.e. the number of elements in the value tensor is 1.
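A hedged sketch of a typical use: reduce to a scalar, call `backprop()`, then read the accumulated gradient from `partialDerivative`. As above, `Scope.root`, `STen`, `STenOptions`, `const` and `param` are assumed names from the surrounding lamp API; `mm`, `sum`, `backprop` and `partialDerivative` come from this page.

```scala
import lamp._
import lamp.autograd.{const, param}

Scope.root { implicit scope =>
  val w = param(STen.ones(List(3, 1), STenOptions.d))
  val x = const(STen.ones(List(1, 3), STenOptions.d))
  val loss = x.mm(w).sum        // scalar: exactly one element
  loss.backprop()               // walks the graph back to w
  val grad: Option[STen] = w.partialDerivative // now holds d(loss)/d(w)
}
```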

def binaryCrossEntropyWithLogitsLoss[S : Sc](target: STen, posWeights: Option[STen], reduction: Reduction): Variable
def bmm[S : Sc](other: Variable): Variable
def cast[S : Sc](precision: FloatingPointPrecision): Variable
def cat[S : Sc](other: Variable, dim: Long): Variable
def choleskyLower[S : Sc]: Variable
def choleskySolve[S : Sc](factor: Variable, upper: Boolean): Variable
def clamp[S : Sc](min: Variable, max: Variable): Variable
def colSum[S : Sc]: Variable
def cos[S : Sc]: Variable
def cross[S : Sc](other: Variable, dim: Int): Variable
def crossEntropy[S : Sc](other: Variable): Variable
def debug[S : Sc](fun: (STen, Boolean, Boolean) => Unit): Variable

Returns another Variable wrapping the same value tensor, without any parent and with needsGrad=false.

def diag[S : Sc](diagonal: Long): Variable
def dropout[S : Sc](prob: Double, train: Boolean): Variable
def euclideanDistance[S : Sc](b: Variable, dim: Int): Variable
def exp[S : Sc]: Variable
def expand[S : Sc](shape: List[Long]): Variable
def expandAs[S : Sc](other: STen): Variable
def flatten[S : Sc]: Variable
def flatten[S : Sc](startDim: Int): Variable
def flatten[S : Sc](startDim: Int, endDim: Int): Variable
def flattenLastDimensions[S : Sc](dims: Int): Variable
def gelu[S : Sc]: Variable
def hardSwish[S : Sc]: Variable
def indexAdd[S : Sc](index: Variable, dim: Int, maxIndex: Long): Variable
def indexAddFromSource[S : Sc](index: Variable, dim: Int, source: Variable): Variable
def indexFill[S : Sc](index: Variable, dim: Int, fillValue: Double): Variable
def indexSelect[S : Sc](dim: Long, index: Variable): Variable
def inv[S : Sc]: Variable
def l1Loss[S : Sc](target: STen, reduction: Reduction): Variable
def leakyRelu[S : Sc](negativeSlope: Double): Variable
def log[S : Sc]: Variable
def log1p[S : Sc]: Variable
def logSoftMax[S : Sc](dim: Int): Variable
def logdet[S : Sc]: Variable
def makeBooleanMask[S : Sc](q: Long): Variable
def maskFill[S : Sc](mask: Variable, fill: Double): Variable
def maskSelect[S : Sc](mask: Variable): Variable
def maximum[S : Sc](other: Variable): Variable
def mean[S : Sc](dim: List[Int]): Variable
def mean[S : Sc](dim: List[Int], keepDim: Boolean): Variable
def minimum[S : Sc](other: Variable): Variable
def mm[S : Sc](other: Variable): Variable
def mseLoss[S : Sc](target: STen, reduction: Reduction): Variable
def needsGrad: Boolean

Returns true if lamp.autograd.Variable.partialDerivative is defined.

def nllLoss[S : Sc](target: STen, weights: STen, reduction: Reduction, ignore: Long): Variable
def norm2[S : Sc](dim: List[Int]): Variable
def norm2[S : Sc](dim: List[Int], keepDim: Boolean): Variable
def normalize[S : Sc](dim: List[Int], eps: Double): Variable
def oneHot[S : Sc](numClasses: Int): Variable
def options[S : Sc]: STenOptions

Returns the tensor options of its value.

def pinv[S : Sc](rcond: Double): Variable
def pow[S : Sc](const: Double): Variable
def pow[S : Sc](exponent: Variable): Variable
def relu[S : Sc]: Variable
def repeatInterleave[S : Sc](repeats: Variable, dim: Int): Variable
def reshape[S : Sc](shape: List[Long]): Variable
def rowSum[S : Sc]: Variable
def scatterAdd[S : Sc](index: Variable, dim: Int, maxIndex: Long): Variable
def select[S : Sc](dim: Long, index: Long): Variable
def shape: List[Long]

Returns the shape of its value.

def sigmoid[S : Sc]: Variable
def sin[S : Sc]: Variable
def slice[S : Sc](dim: Long, start: Long, end: Long, step: Long): Variable
def softplus[S : Sc](beta: Double, threshold: Double): Variable
def sum[S : Sc]: Variable
def sum[S : Sc](dim: List[Int], keepDim: Boolean): Variable
def swish1[S : Sc]: Variable
def t[S : Sc]: Variable

Returns a new variable with the first two dimensions transposed.

def tan[S : Sc]: Variable
def tanh[S : Sc]: Variable
def toDense[S : Sc]: Variable
def toDoubleArray: Array[Double]
def toLongArray: Array[Long]
override def toString: String

Definition Classes: Any
def transpose[S : Sc](dim1: Int, dim2: Int): Variable

Returns a new variable with the respective dimensions transposed.

def variance[S : Sc](dim: List[Int]): Variable
def view[S : Sc](shape: List[Long]): Variable

Returns another Variable wrapping the same value tensor, without any parent and with needsGrad=true.

def zeroGrad(): Unit

Zeros out the partial derivative in place.
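Since `backprop()` accumulates into the preallocated partial derivative tensors, a training step typically zeros gradients before (or after) each backward pass. A hypothetical sketch; the update step is elided and `Variable`, `backprop`, `zeroGrad` and `partialDerivative` are the members documented on this page:

```scala
import lamp.autograd.Variable

// Hypothetical training step: gradients accumulate across backprop calls,
// so they must be reset between steps.
def step(w: Variable, loss: Variable): Unit = {
  w.zeroGrad()    // reset the accumulated gradient in place
  loss.backprop() // accumulate fresh gradients into w.partialDerivative
  // ... update w.value from w.partialDerivative ...
}
```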

def zipBackward(fn: (STen, STen) => Unit): (Variable, (STen, STen) => Unit)

Returns a pair of this instance and the supplied function.

Concrete fields

val id: UUID

Returns a unique, stable and random UUID.

val sizes: List[Long]

Returns the shape of its value.

lazy val wengert: Seq[Variable]

Returns the Wengert list, i.e. the Variables of the computational graph ending at this value, in topological order.