Perplexity (Metric)
Category: Technical Terms
Definition
Perplexity measures how well a language model predicts text: the lower the perplexity, the higher the probability the model assigned to the actual tokens. Formally, it is the exponential of the average negative log-probability per token.
How It Works
Perplexity calculates how "surprised" a model is by text. If the model assigns high probability to the actual next words, perplexity is low. If it's constantly wrong about what comes next, perplexity is high.
Think of it as a test score where lower is better. A perplexity of 10 means the model was, on average, as uncertain as if it were choosing among 10 equally likely next words; a perplexity of 100 means choosing among 100.
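The calculation above can be sketched in a few lines of Python. This is a minimal illustration, assuming you already have the probability the model assigned to each actual token in a text:

```python
import math

def perplexity(token_probs):
    """Perplexity = exp of the average negative log-probability
    the model assigned to the actual tokens."""
    if not token_probs:
        raise ValueError("need at least one token probability")
    avg_neg_log_prob = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(avg_neg_log_prob)

# Confident model: high probability on each actual token -> low perplexity
confident = perplexity([0.5, 0.4, 0.6])

# Guessing model: uniform 1/100 chance per token -> perplexity of exactly 100
guessing = perplexity([0.01, 0.01, 0.01])

print(confident, guessing)
```

Note that the guessing model's perplexity comes out to exactly 100, matching the intuition that it is choosing among 100 equally likely words at every step.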
Why It Matters
Perplexity provides an objective way to compare language models. Researchers use it to track progress and identify which training techniques work best.
It's one of the key metrics for evaluating whether a new model actually improves on existing ones.