Padding tokens

Extra tokens (typically zeros or a dedicated special token) appended to a model's input, either so that the input matches the model's fixed input length or so that all sequences in a batch have the same length.
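For example, batch padding can be sketched in plain Python as follows. The pad ID of 0 and the function name `pad_batch` are illustrative assumptions, not a specific library's API:

```python
def pad_batch(sequences, pad_id=0):
    """Right-pad each sequence with pad_id to the length of the longest one.

    pad_id=0 is an assumption; real tokenizers define their own pad token ID.
    """
    max_len = max(len(seq) for seq in sequences)
    return [seq + [pad_id] * (max_len - len(seq)) for seq in sequences]

batch = [[7, 3, 9], [4, 2], [8]]
padded = pad_batch(batch)
# Every row now has length 3:
# [[7, 3, 9], [4, 2, 0], [8, 0, 0]]
```

In practice, a padding mask usually accompanies the padded batch so the model can ignore the pad positions during attention and loss computation.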

In transformer models, padding tokens have largely been replaced by ragged tensors, which represent variable-length sequences directly.