Definition
Gradient descent is an optimization algorithm that lets AI models learn by gradually adjusting their parameters to reduce error.
How It Works
Imagine rolling a ball down a hill to find the lowest point. Gradient descent does the same thing mathematically: it searches for the combination of parameter settings that gives the model the smallest error.
At each step, the algorithm measures the slope of the error (the gradient) and takes a small step in the downhill direction, with the step size controlled by a value called the learning rate.
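A minimal sketch of that loop in Python, minimizing a simple one-variable function; the function f, its derivative, the learning rate, and the starting point are all illustrative choices, not taken from any particular AI system.

```python
# Minimal gradient descent sketch: minimize f(x) = (x - 3)^2.
# The function, learning rate, and starting point are illustrative
# assumptions, not part of the original glossary entry.

def f(x):
    return (x - 3) ** 2          # error is lowest at x = 3

def grad_f(x):
    return 2 * (x - 3)           # derivative (slope) of f at x

x = 0.0                          # initial guess for the parameter
learning_rate = 0.1              # size of each downhill step

for step in range(50):
    x -= learning_rate * grad_f(x)   # step opposite the slope

print(x)  # converges toward 3.0, the bottom of the "hill"
```

Real models run the same update on millions of parameters at once, using the gradient of the training error with respect to every parameter.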
Why It Matters
Gradient descent (and its variants, such as stochastic gradient descent and Adam) is how most AI systems learn. Without it, neural networks couldn't adjust their millions or billions of parameters to get better at their tasks.
It's the engine behind nearly every AI training process.