Distillation
Category: Emerging Concepts
Definition
Distillation creates smaller, faster AI models that mimic the behavior of larger, more complex ones while maintaining most of their capabilities.
How It Works
A large "teacher" model trains a smaller "student" model by sharing its outputs on training data, often full probability distributions ("soft labels") rather than just final answers. The student learns to match these responses, capturing the teacher's knowledge in a compact form.
This transfers capabilities without copying the teacher's architecture.
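The loop above can be sketched in a few lines. This is a minimal, illustrative example, not a production recipe: the "teacher" is just a fixed logit vector for a single input, the "student" is a learnable logit vector, and the values, temperature, and learning rate are all made-up assumptions. It shows the core idea of matching the teacher's temperature-softened output distribution.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T softens the distribution,
    # exposing more of the teacher's "dark knowledge" about non-top classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical teacher logits for one input over 3 classes (made-up values).
teacher_logits = [4.0, 1.0, 0.5]
T = 2.0
soft_targets = softmax(teacher_logits, T)  # the soft labels the student mimics

# Tiny "student": a learnable logit vector for this single input.
student_logits = [0.0, 0.0, 0.0]

# Gradient descent on the cross-entropy between the soft targets and the
# student's softened output; the gradient w.r.t. each logit is p_i - q_i.
lr = 0.5
for _ in range(500):
    p = softmax(student_logits, T)
    student_logits = [z - lr * (pi - qi)
                      for z, pi, qi in zip(student_logits, p, soft_targets)]

# After training, the student's softened distribution matches the teacher's.
student_probs = softmax(student_logits, T)
```

In a real system the student is a full neural network trained over many inputs, and the distillation loss is usually blended with an ordinary hard-label loss, but the matching step is the same.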
Why It Matters
Distillation makes powerful AI accessible on phones and edge devices. It reduces costs and energy usage while maintaining performance for most tasks.
Companies use distillation to deploy AI at scale without expensive infrastructure.