Autoregressive Models
Generate output one token/element at a time, with each step conditioning on all previously generated outputs, so the joint distribution factorizes as p(x_1, …, x_T) = ∏_t p(x_t | x_<t). The foundation of modern LLMs (GPT, Claude, LLaMA). Naturally capture sequential dependencies, but generation cannot be parallelized across time steps: each token must wait for the one before it.
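A minimal sketch of the generation loop, assuming a toy bigram table stands in for a real model's next-token distribution (the names `next_token_probs` and `generate` are illustrative, not any library's API); the key point is that each sampled token is appended to the prefix and fed back in before the next step.

```python
import numpy as np

# Tiny vocabulary for the toy example.
vocab = ["<s>", "the", "cat", "sat", "down", "</s>"]
tok2id = {t: i for i, t in enumerate(vocab)}

# Row i = P(next token | previous token i). A real autoregressive model
# would condition on the entire prefix, not just the last token.
bigram = np.array([
    [0.0, 0.9, 0.1, 0.0, 0.0, 0.0],   # after <s>
    [0.0, 0.0, 0.8, 0.1, 0.1, 0.0],   # after "the"
    [0.0, 0.0, 0.0, 0.7, 0.2, 0.1],   # after "cat"
    [0.0, 0.0, 0.0, 0.0, 0.6, 0.4],   # after "sat"
    [0.0, 0.0, 0.0, 0.0, 0.0, 1.0],   # after "down"
    [0.0, 0.0, 0.0, 0.0, 0.0, 1.0],   # after </s> (absorbing)
])

def next_token_probs(prefix_ids):
    """Distribution over the next token given the prefix (toy: uses only the last token)."""
    return bigram[prefix_ids[-1]]

def generate(max_len=10, seed=0):
    """Autoregressive loop: sample one token at a time, feeding each output back in."""
    rng = np.random.default_rng(seed)
    ids = [tok2id["<s>"]]
    for _ in range(max_len):
        probs = next_token_probs(ids)          # condition on everything generated so far
        nxt = rng.choice(len(vocab), p=probs)  # sample the next token
        ids.append(int(nxt))
        if vocab[nxt] == "</s>":               # stop at the end-of-sequence token
            break
    return " ".join(vocab[i] for i in ids)

print(generate())
```

The sequential dependency is visible in the loop: step t cannot start until step t-1 has produced its token, which is why autoregressive generation is inherently serial even when training can be parallelized with teacher forcing.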
Related
- Decoder-Only Models (autoregressive Transformers)
- Large Language Models (built on autoregressive generation)