Embedding
Bases: Module
Source code in src/transformer/modules/embedding.py, lines 6-28.
__init__(d_model, vocab_size)
Transformer Input Embedding
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `d_model` | `int` | the Transformer model dimension | *required* |
| `vocab_size` | `int` | number of terms in our vocabulary | *required* |
Source code in src/transformer/modules/embedding.py, lines 8-18.
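The rendered source listing is not reproduced here. A minimal sketch of what the module plausibly looks like, assuming a learned `torch.nn.Embedding` lookup table and the square-root scaling described under `forward` below (attribute names such as `embedding` are illustrative, not confirmed by the source):

```python
import math

import torch
import torch.nn as nn


class Embedding(nn.Module):
    """Transformer input embedding (sketch)."""

    def __init__(self, d_model: int, vocab_size: int) -> None:
        super().__init__()
        # One learned d_model-dimensional vector per vocabulary term.
        self.embedding = nn.Embedding(vocab_size, d_model)
        # Stored so forward() can apply the sqrt(d_model) scaling.
        self.d_model = d_model

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x holds integer token ids of shape (batch, seq_len); the lookup
        # returns (batch, seq_len, d_model). Per section 3.4 of "Attention
        # Is All You Need", the result is multiplied by sqrt(d_model).
        return self.embedding(x) * math.sqrt(self.d_model)
```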
forward(x)
Embed our tokenized inputs
Note
Following section 3.4 of "Attention Is All You Need", we multiply the embeddings by the square root of the model's dimension, i.e. the output of the lookup is scaled by sqrt(d_model).
Source code in src/transformer/modules/embedding.py, lines 20-28.
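A quick illustrative check using the sketch above (the vocabulary size and `d_model` values here are arbitrary):

```python
import torch

# Embed a batch of 2 sequences of 5 token ids each.
emb = Embedding(d_model=512, vocab_size=10_000)
tokens = torch.randint(0, 10_000, (2, 5))
print(emb(tokens).shape)  # torch.Size([2, 5, 512])
```

One common reading of the sqrt(d_model) scaling is that it keeps the learned embeddings at a magnitude comparable to the sinusoidal positional encodings that are typically added immediately afterward.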