
Vector embedding techniques

Use a vector embedding technique to create a vector embedding for input text. You can choose a technique based on the pre-trained model you want to use to convert the text to a vector.
A vector embedding represents the text as an array of numbers, where each element in the array captures a different dimension, or feature, of the text. To create vector embeddings, select an input column for embedding and then select one of the following vector embedding techniques:
Word embedding
Convert each word to a vector using the Word2Vec Gigaword 5th Edition model with 300 dimensions (word2vec_giga_300). Useful for text classification and sentiment analysis. For an illustrative example, see the first sketch after this list.
BERT embedding
Convert each sentence to a vector using the Smaller BERT Embedding (L-2_H-768_A-12) model with 768 dimensions (small_bert_L2_768). Useful for text classification and semantic search. For an illustrative example, see the second sketch after this list.
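The following Python sketch illustrates what word-level embedding produces. Because the word2vec_giga_300 model is specific to this transformation, the sketch substitutes gensim's publicly available glove-wiki-gigaword-300 vectors, which are also 300-dimensional. The library, model name, and text handling here are assumptions for illustration only, not the transformation's implementation.

# Illustrative only: gensim's glove-wiki-gigaword-300 vectors stand in for
# the word2vec_giga_300 model described above (assumption).
import gensim.downloader as api

# Download and load 300-dimensional pre-trained word vectors.
word_vectors = api.load("glove-wiki-gigaword-300")

text = "great battery life but poor screen"

# Convert each word in the input text to a 300-dimensional vector.
embeddings = {
    word: word_vectors[word]
    for word in text.split()
    if word in word_vectors
}

for word, vector in embeddings.items():
    print(word, vector.shape)   # each word maps to a (300,) array

Each word that the model recognizes becomes one 300-element array, which is why word-level techniques suit tasks such as text classification and sentiment analysis that operate on individual terms.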
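The following Python sketch illustrates sentence-level BERT embedding with the Hugging Face transformers library. It assumes the google/bert_uncased_L-2_H-768_A-12 checkpoint, which matches the L-2_H-768_A-12 architecture named above, and uses mean pooling to reduce the token vectors to one 768-dimensional vector per sentence; the transformation may pool differently.

# Illustrative only: the checkpoint name and mean pooling are assumptions,
# not necessarily how the Vector Embedding transformation computes vectors.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "google/bert_uncased_L-2_H-768_A-12"  # 2 layers, 768 hidden units
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

sentences = [
    "The product arrived quickly and works well.",
    "Customer support never answered my question.",
]

inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the token vectors of each sentence into one 768-dimensional vector,
# ignoring padding tokens.
mask = inputs["attention_mask"].unsqueeze(-1)           # (batch, tokens, 1)
summed = (outputs.last_hidden_state * mask).sum(dim=1)  # (batch, 768)
sentence_vectors = summed / mask.sum(dim=1)             # (batch, 768)

print(sentence_vectors.shape)  # torch.Size([2, 768])

Each sentence becomes one 768-element array, which is why sentence-level techniques suit semantic search: sentences with similar meaning produce vectors that are close together.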
Consider the following rules and guidelines for vector embeddings: