BertEncoder

lamp.nn.bert.BertEncoder$
See the BertEncoder companion class
object BertEncoder

Attributes

Companion
class
Supertypes
trait Product
trait Mirror
class Object
trait Matchable
class Any
Self type: BertEncoder.type

Members list

Type members

Classlikes

Attributes

Supertypes
trait LeafTag
trait PTag
class Object
trait Matchable
class Any
Self type

Inherited types

type MirroredElemLabels <: Tuple

The names of the product elements

Attributes

Inherited from:
Mirror
type MirroredLabel <: String

The name of the type

Attributes

Inherited from:
Mirror

Value members

Concrete methods

def apply[S : Sc](maxLength: Int, vocabularySize: Int, segmentVocabularySize: Int, numBlocks: Int, embeddingDim: Int, attentionHiddenPerHeadDim: Int, attentionNumHeads: Int, mlpHiddenDim: Int, dropout: Double, tOpt: STenOptions, linearized: Boolean, positionEmbedding: Option[STen]): BertEncoder

Factory for the encoder module of Bert

Factory for the encoder module of Bert

Input is (tokens, segments), where tokens and segments are both (batch, num tokens) long tensors.

Value parameters

attentionHiddenPerHeadDim

size of hidden attention dimension of each attention head

attentionNumHeads

number of attention heads

dropout

dropout rate

embeddingDim

input embedding dimension

maxLength

maximum num token length

mlpHiddenDim

size of hidden dimension of the two layer perceptron

numBlocks

number of transformer blocks to create

positionEmbedding

optional float tensor of size (sequence length, embedding dimension); if missing, the absolute positional encoding from Vaswani et al. 2017 is used. Following the Bert paper, the position embeddings are summed to the input embeddings.

tOpt

tensor options

vocabularySize

vocabulary size

Attributes

Returns

a module
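A minimal construction sketch may help illustrate the factory. The hyperparameter values, the `Scope`/tensor-options setup, and the input shapes below are illustrative assumptions, not taken from this page; only the `apply` parameter list and the (tokens, segments) input convention come from the documentation above.

```scala
// Hypothetical usage sketch for BertEncoder.apply; values are illustrative.
import lamp._
import lamp.autograd.const
import lamp.nn.bert.BertEncoder

object BertEncoderExample {
  def main(args: Array[String]): Unit = Scope.root { implicit scope =>
    // Assumed: float32 CPU tensor options for the module parameters.
    val tOpt = STenOptions.f

    val encoder = BertEncoder(
      maxLength = 128,                 // maximum num token length
      vocabularySize = 30000,
      segmentVocabularySize = 2,
      numBlocks = 2,                   // number of transformer blocks
      embeddingDim = 64,
      attentionHiddenPerHeadDim = 16,
      attentionNumHeads = 4,
      mlpHiddenDim = 128,
      dropout = 0.1,
      tOpt = tOpt,
      linearized = false,
      positionEmbedding = None         // use the Vaswani et al. 2017 encoding
    )

    // Input is (tokens, segments), both (batch, num tokens) long tensors.
    val tokens = const(STen.zeros(List(2L, 16L), STenOptions.l))
    val segments = const(STen.zeros(List(2L, 16L), STenOptions.l))
    val output = encoder.forward((tokens, segments))
  }
}
```

Note that `apply` takes an implicit `Sc` (scope) parameter, which is why the construction happens inside `Scope.root`; tensors allocated in the scope are released when it closes.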

Implicits

implicit val load: Load[BertEncoder]