Encoder-decoder architectures

The encoder-decoder architecture is the original transformer architecture proposed by Vaswani et al. (2017). The aim here is to do what we did for encoders (MiniLM) and decoders (GPT-2): provide a decoupled reconstruction of the operations of an encoder-decoder model. The notebook is not yet ready; we are working on T5-small.
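Until the notebook is ready, the operation that distinguishes encoder-decoder models from the encoder-only and decoder-only cases can be sketched in a few lines. A minimal NumPy illustration follows: the decoder attends over the encoder's output (cross-attention) rather than over its own sequence. The weights and dimensions here are random placeholders for illustration, not T5-small's actual parameters (T5-small uses d_model=512).

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # toy model dimension (illustrative; not T5-small's d_model=512)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q_in, kv_in, Wq, Wk, Wv):
    # Generic scaled dot-product attention: queries come from one sequence,
    # keys/values from another. It is self-attention when q_in is kv_in,
    # and cross-attention when kv_in is the encoder output.
    Q, K, V = q_in @ Wq, kv_in @ Wk, kv_in @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    return softmax(scores) @ V

# Random matrices stand in for trained projection weights.
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

src = rng.normal(size=(5, d))  # encoder input: 5 source-token embeddings
tgt = rng.normal(size=(3, d))  # decoder input: 3 target-token embeddings

enc_out = attention(src, src, Wq, Wk, Wv)      # encoder self-attention
cross = attention(tgt, enc_out, Wq, Wk, Wv)    # decoder cross-attention

print(cross.shape)  # one d-dimensional vector per target token
```

The point of the sketch is the asymmetry in `cross`: each of the 3 target positions produces a query, but the keys and values are built from all 5 encoded source positions, which is how the decoder conditions on the input sequence.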
