Self-Supervised Learning

Generating supervisory signals from the data itself, without manual labeling. The foundation of modern large language models and many computer vision breakthroughs.

How It Works

A pretext task is constructed from the raw data itself: for example, masking part of the input and training the model to predict it (masked language modeling in BERT), predicting the next token from the preceding ones (GPT), or learning to match different augmented views of the same example (contrastive learning in SimCLR and CLIP). Because the labels come for free from the structure of the data, pretraining can scale to unlabeled corpora.
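
To make the pretext-task idea concrete, here is a minimal sketch of masked-token label construction in plain Python. The toy token ids, the `mask_id` value, and the -100 "ignore" label are illustrative assumptions (the -100 convention mirrors common loss implementations, not a requirement), and the mask rate in the demo call is raised above the usual 15% so this tiny example actually masks something; BERT's full recipe also sometimes keeps or randomizes selected tokens instead of always masking them.

```python
import random

def make_mlm_example(tokens, mask_id, mask_prob=0.15):
    """Hide a random subset of tokens; the hidden originals become the labels."""
    inputs, labels = [], []
    for tok in tokens:
        if random.random() < mask_prob:
            inputs.append(mask_id)  # the model sees only the mask token...
            labels.append(tok)      # ...and is trained to recover the original
        else:
            inputs.append(tok)
            labels.append(-100)     # unmasked positions are ignored by the loss
    return inputs, labels

random.seed(0)
sentence = [12, 48, 7, 93, 5, 61]  # token ids standing in for words
inputs, labels = make_mlm_example(sentence, mask_id=0, mask_prob=0.3)
print(inputs)  # [12, 48, 7, 0, 5, 61]
print(labels)  # [-100, -100, -100, 93, -100, -100]
```

No human labels anything here: the "answer" for each masked position is simply the token that was there before masking.

Contrastive methods generate supervision differently: rather than reconstructing hidden input, the model learns embeddings in which two augmented views of the same example are more similar than views of different examples. Below is a NumPy sketch of an InfoNCE-style loss in the spirit of SimCLR and CLIP; the random embeddings and the temperature value are illustrative assumptions, not any library's API.

```python
import numpy as np

def info_nce(z_a, z_b, temperature=0.1):
    """Each row of z_a should match the same row of z_b, not any other row."""
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)  # unit-normalize rows
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)
    logits = z_a @ z_b.T / temperature  # pairwise cosine similarities
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))  # diagonal entries are the positive pairs

rng = np.random.default_rng(0)
view_a = rng.normal(size=(4, 8))                  # embeddings of 4 examples, view 1
view_b = view_a + 0.01 * rng.normal(size=(4, 8))  # slightly perturbed second view
print(info_nce(view_a, view_b))                   # near-zero: matching rows dominate
```

Here too the pairing is the label: that two views came from the same example is known by construction, not by annotation.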


Tags: ml, self-supervised