PositionalEmbedding
class PositionalEmbedding(num_embeddings, embedding_dim, padding_idx, **kwargs)[source]
Class for learned positional embeddings. Behaviour is similar to torch.nn.Embedding: each position index is assigned its own embedding vector. Note that this restricts the maximum length of the input sequence.
This is not the sinusoidal positional embedding introduced in the Transformer architecture (Vaswani et al., 2017). These embeddings have no fixed functional form; they are learned entirely from scratch.
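As an illustration, the sketch below builds a learned positional embedding on top of torch.nn.Embedding. The class name LearnedPositionalEmbeddingSketch and its position-handling details are assumptions made for this example and are not the documented implementation; they only show the general idea of assigning one learned vector per position index.

# Minimal sketch of a learned positional embedding (assumed PyTorch setting;
# not the documented class itself).
import torch
import torch.nn as nn

class LearnedPositionalEmbeddingSketch(nn.Module):
    """Assigns a learned vector to each position index, like nn.Embedding."""

    def __init__(self, num_embeddings, embedding_dim, padding_idx=None):
        super().__init__()
        # One embedding row per position; num_embeddings therefore caps the
        # usable sequence length.
        self.embed = nn.Embedding(num_embeddings, embedding_dim, padding_idx=padding_idx)
        self.max_positions = num_embeddings

    def forward(self, tokens):
        # tokens: (batch, seq_len); positions are simply 0, 1, ..., seq_len - 1.
        seq_len = tokens.size(1)
        if seq_len > self.max_positions:
            raise ValueError("sequence longer than the number of learned positions")
        positions = torch.arange(seq_len, device=tokens.device)
        positions = positions.unsqueeze(0).expand(tokens.size(0), -1)
        return self.embed(positions)

# Usage: positional embeddings for a batch of two 5-token sequences.
emb = LearnedPositionalEmbeddingSketch(num_embeddings=512, embedding_dim=64)
out = emb(torch.zeros(2, 5, dtype=torch.long))
print(out.shape)  # torch.Size([2, 5, 64])

Because the embedding table has a fixed number of rows (num_embeddings), any sequence longer than that has no position vector available, which is the length restriction noted above.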