Encoder-decoder architectures

The encoder-decoder architecture is the original transformer architecture proposed by Vaswani et al. (2017). The aim here is to do for encoder-decoder models what we did for encoders (MiniLM) and decoders (GPT-2): provide a decoupled reconstruction of the model's operations. The notebook is not yet ready; we are working on T5-small.
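In the meantime, a minimal NumPy sketch can illustrate the operation that distinguishes encoder-decoder models from the encoder-only and decoder-only variants: cross-attention, where queries come from the decoder's states while keys and values come from the encoder's output. This is an illustrative sketch, not code from the forthcoming notebook; the dimensions and weight matrices are made up for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(decoder_states, encoder_states, W_q, W_k, W_v):
    # In cross-attention, queries come from the decoder,
    # keys and values from the encoder output.
    Q = decoder_states @ W_q
    K = encoder_states @ W_k
    V = encoder_states @ W_v
    scores = Q @ K.T / np.sqrt(Q.shape[-1])  # scaled dot-product
    return softmax(scores) @ V

rng = np.random.default_rng(0)
d = 8                                  # toy model dimension
enc = rng.normal(size=(5, d))          # 5 source-token states
dec = rng.normal(size=(3, d))          # 3 target-token states
W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))

out = cross_attention(dec, enc, W_q, W_k, W_v)
print(out.shape)  # (3, 8): one attended vector per decoder position
```

Each decoder position ends up with a weighted mixture of the encoder's value vectors, which is how the target sequence conditions on the source sequence in models like T5.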

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0).
