BERT

Google's language model that understands context by reading text in both directions simultaneously

Definition

BERT (Bidirectional Encoder Representations from Transformers) is Google's language model, introduced in 2018, that understands context by reading text in both directions simultaneously.

How It Works

Unlike models that read text strictly left-to-right, BERT processes an entire sentence at once, so each word's representation draws on the words both before and after it. This helps it understand that "bank" means different things in "river bank" versus "savings bank," as the sketch below illustrates.
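
A minimal sketch of this behavior using the Hugging Face transformers library (not part of BERT itself, but a common way to run it). The same masked slot gets different completions depending on the words around it:

```python
# Assumes: pip install transformers torch
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# BERT attends to the words on BOTH sides of [MASK], so the same
# masked position gets different predictions depending on context.
for text in [
    "He sat on the [MASK] of the river.",
    "She deposited money at the [MASK].",
]:
    top = fill(text, top_k=3)
    print(text, "->", [p["token_str"] for p in top])
```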

BERT was pretrained on English Wikipedia and a large corpus of books, learning language patterns by predicting words that had been masked out of sentences. The pretrained model is then fine-tuned for specific tasks such as classification or question answering.
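
A hedged sketch of the pretrain-then-fine-tune pattern, again using Hugging Face transformers. The two-label sentiment setup here is an illustrative assumption, not part of BERT itself:

```python
# Assumes: pip install transformers torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load the pretrained BERT weights, then attach a fresh classification
# head for the downstream task (here: 2 example sentiment labels).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Fine-tuning would update these weights on labeled task data
# (e.g., via a Trainer or a standard training loop). This just
# runs one forward pass to show the task-specific output shape.
inputs = tokenizer("This movie was great!", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # torch.Size([1, 2])
```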

Why It Matters

BERT dramatically improved search results, translation, and question-answering systems. It showed that understanding context in both directions makes AI much better at language tasks.

Google uses BERT to power search and many other services.

