Category: Safety & Ethics
Definition
Explainable AI (XAI) refers to systems that can describe how they reach their decisions in terms humans can understand.
How It Works
Instead of only giving answers, an explainable AI system also surfaces its reasoning. It might highlight which words influenced a text classification or which regions of an image mattered most to a prediction.
These explanations help people judge whether the AI's logic makes sense; a minimal sketch of this kind of attribution follows.
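As an illustration, the sketch below trains a tiny linear text classifier and explains a single prediction by reporting each word's contribution (its learned weight multiplied by its count in the input). The toy dataset, the label meanings, and the explain_prediction helper are assumptions made for this example; they are not part of any specific XAI library.

```python
# Minimal sketch of word-level attribution for a linear text classifier.
# Dataset, labels, and helper names are illustrative assumptions.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

texts = [
    "loan approved good credit history",
    "application denied missing income proof",
    "approved stable income strong credit",
    "denied high debt poor credit history",
]
labels = [1, 0, 1, 0]  # 1 = approved, 0 = denied

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(texts)
model = LogisticRegression().fit(X, labels)

def explain_prediction(text):
    """Return each input word's contribution (weight * count) to the decision."""
    x = vectorizer.transform([text])
    weights = model.coef_[0]
    contributions = {
        word: weights[idx] * x[0, idx]
        for word, idx in vectorizer.vocabulary_.items()
        if x[0, idx] != 0
    }
    # Positive values push toward "approved", negative toward "denied".
    return sorted(contributions.items(), key=lambda kv: -abs(kv[1]))

for word, score in explain_prediction("denied due to poor credit history"):
    print(f"{word:>10}: {score:+.3f}")
```

Because the model is linear, per-word contributions sum directly to the decision score, which is what makes the explanation faithful here; for deep models, attribution methods such as saliency maps or SHAP play a similar role.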
Why It Matters
When AI informs decisions such as loan approvals or medical diagnoses, people need to know why a particular outcome was reached. Explanations build trust and help catch mistakes or biases.
Many industries require explainable AI for regulatory compliance and ethical responsibility.