The encoder-decoder architecture is the original transformer architecture proposed by Vaswani et al. (2017). The aim here is to do for an encoder-decoder model what we did for encoders (MiniLM) and decoders (GPT-2): provide a decoupled reconstruction of its operations. The notebook is not yet ready, but we are working on T5-small; a rough sketch of the idea follows.
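Until that notebook is ready, here is a minimal sketch of the decoupling, using the Hugging Face transformers API rather than a from-scratch reconstruction (an assumption about tooling; the eventual notebook may work directly from the raw weights instead). The encoder is run once to produce contextual hidden states, and the decoder is then stepped token by token against those states with a greedy pick at each step:

```python
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")
model.eval()

# 1) Encoder pass: map the input tokens to contextual hidden states (run once).
inputs = tokenizer("translate English to German: The house is small.",
                   return_tensors="pt")
with torch.no_grad():
    enc = model.encoder(input_ids=inputs.input_ids,
                        attention_mask=inputs.attention_mask)

# 2) Decoder pass: generate one token at a time, cross-attending to the
#    encoder states. T5 starts decoding from the pad token.
decoder_ids = torch.tensor([[model.config.decoder_start_token_id]])
with torch.no_grad():
    for _ in range(20):
        dec = model.decoder(input_ids=decoder_ids,
                            encoder_hidden_states=enc.last_hidden_state,
                            encoder_attention_mask=inputs.attention_mask)
        hidden = dec.last_hidden_state
        if model.config.tie_word_embeddings:
            # T5 rescales hidden states before the tied LM head.
            hidden = hidden * (model.config.d_model ** -0.5)
        logits = model.lm_head(hidden)                        # project to vocabulary
        next_id = logits[:, -1, :].argmax(-1, keepdim=True)   # greedy choice
        decoder_ids = torch.cat([decoder_ids, next_id], dim=-1)
        if next_id.item() == model.config.eos_token_id:
            break

print(tokenizer.decode(decoder_ids[0], skip_special_tokens=True))
```

This is equivalent to greedy decoding with model.generate, but it makes the two halves of the architecture, and the cross-attention that connects them, explicit as separate calls.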