Category: Technical Terms
Definition
Attention mechanisms help AI focus on relevant parts of input when making decisions.
How It Works
When reading "The cat sat on the mat," attention helps the AI understand that "cat" and "sat" are closely related, even though other words appear between them.
The AI learns which parts of input matter most for each decision.
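The idea above can be sketched in code. This is a minimal toy version of scaled dot-product attention (the form used in transformers), with random vectors standing in for learned token representations; it shows the mechanics of scoring and weighting, not a trained model:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: turns raw scores into weights that sum to 1.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Scaled dot-product attention: each query token scores every key token,
    # and the resulting weights mix the value vectors.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V, weights

# Toy example: 6 tokens for "The cat sat on the mat", 4-dim random vectors
# (a real model would use learned embeddings and separate Q/K/V projections).
rng = np.random.default_rng(0)
tokens = ["The", "cat", "sat", "on", "the", "mat"]
X = rng.normal(size=(len(tokens), 4))
out, w = attention(X, X, X)  # self-attention: every token attends to every token
print(w.shape)  # one weight per (query token, key token) pair
```

In a trained model, the row of weights for "sat" would put high weight on "cat", which is how the model links related words across any distance in the sentence.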
Why It Matters
Attention mechanisms enabled modern AI breakthroughs. They're the core technology behind transformers, which power ChatGPT, Claude, and most language AI.
Before attention, models struggled to handle long texts or to capture relationships between words that are far apart.