BERT model wrapper backed by TensorFlow
Id of the sentence start token.
Id of the sentence end token.
Configuration for the TensorFlow session.
Paper: https://arxiv.org/abs/1810.04805
Source: https://github.com/google-research/bert
BERT (Bidirectional Encoder Representations from Transformers) provides dense vector representations for natural language, using a deep neural network pre-trained with the Transformer architecture.
See https://github.com/JohnSnowLabs/spark-nlp/blob/master/src/test/scala/com/johnsnowlabs/nlp/embeddings/BertEmbeddingsTestSpec.scala for further reference on how to use this API.
0 : corresponds to the first layer (embeddings)
-1 : corresponds to the last layer
-2 : corresponds to the second-to-last layer
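The layer index convention above can be sketched as a simple lookup over a sequence of per-layer outputs: non-negative indices count from the embedding layer, negative indices count back from the last layer. The object and method names here (`PoolingLayerDemo`, `selectLayer`) are illustrative only and not part of the Spark NLP API.

```scala
// Minimal sketch of the pooling-layer index convention; names are hypothetical.
object PoolingLayerDemo {
  // layers(0) is the embedding layer; negative indices count from the end,
  // matching the convention above (-1 = last, -2 = second-to-last).
  def selectLayer[T](layers: Seq[T], index: Int): T =
    if (index >= 0) layers(index)
    else layers(layers.length + index)

  def main(args: Array[String]): Unit = {
    val layers = Seq("embeddings", "layer1", "layer2", "layer3")
    assert(selectLayer(layers, 0) == "embeddings") // first layer (embeddings)
    assert(selectLayer(layers, -1) == "layer3")    // last layer
    assert(selectLayer(layers, -2) == "layer2")    // second-to-last layer
    println("ok")
  }
}
```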