BERT
Google's language model that understands context by reading text in both directions simultaneously
Definition
BERT (Bidirectional Encoder Representations from Transformers) is Google's language model that understands context by reading text in both directions simultaneously.
How It Works
Unlike models that read text only left-to-right, BERT attends to an entire sentence at once, so each word's representation is shaped by the words on both sides of it. This helps it understand that "bank" means different things in "river bank" versus "savings bank," as the sketch below illustrates.
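A minimal sketch of that idea, assuming the Hugging Face transformers library and the bert-base-uncased checkpoint (neither is named in the original text): the same word "bank" gets different context-dependent vectors in the two sentences.

```python
# Hedged example: compare BERT's contextual vectors for "bank" in two sentences.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def bank_vector(sentence: str) -> torch.Tensor:
    """Return BERT's hidden-state vector for the token 'bank' in the sentence."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (sequence_length, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index("bank")]

river = bank_vector("We sat on the river bank.")
money = bank_vector("She deposited cash at the savings bank.")

# The two vectors differ because BERT reads the surrounding words on both sides.
similarity = torch.cosine_similarity(river, money, dim=0)
print(f"Cosine similarity between the two 'bank' vectors: {similarity:.2f}")
```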
BERT first learned general language patterns from millions of web pages and books (pretraining), and is then fine-tuned on labeled data for specific tasks.
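As a sketch of that pretrain-then-fine-tune step, again assuming the Hugging Face transformers library: the pretrained BERT weights are loaded and a fresh task-specific head is attached, ready to be trained on labeled examples. The model name, task, and labels here are illustrative assumptions, not details from the original text.

```python
# Hedged example: load pretrained BERT weights with a new classification head.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased",  # reuses the pretrained language knowledge
    num_labels=2,         # e.g. positive / negative sentiment (illustrative)
)

# Fine-tuning would then update these weights on labeled task data;
# here we just run one untrained-head prediction to show the setup.
inputs = tokenizer("The movie was great!", return_tensors="pt")
logits = model(**inputs).logits  # shape (1, 2): one score per label
```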
Why It Matters
BERT dramatically improved search results, translation, and question-answering systems. It showed that understanding context in both directions makes AI much better at language tasks.
Google uses BERT to power search and many other services.