
BERT

Bidirectional Encoder Representations from Transformers - Google's influential language model.


Definition

BERT is a transformer-based model from Google that revolutionized NLP by introducing bidirectional pre-training.

  • **Key Innovation:**
  • Reads text bidirectionally, conditioning each token on both its left and right context
  • Unlike GPT, which reads only left-to-right
  • Produces richer contextual representations of words
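The difference between left-to-right and bidirectional reading comes down to the attention mask. A minimal sketch in plain Python (the function names here are illustrative, not from any library): a GPT-style causal mask lets token *i* attend only to positions at or before *i*, while a BERT-style mask lets every token attend everywhere.

```python
# Sketch of the attention-mask difference; 1 = position may be attended to.

def causal_mask(n):
    """GPT-style: token i may attend only to positions <= i."""
    return [[1 if j <= i else 0 for j in range(n)] for i in range(n)]

def bidirectional_mask(n):
    """BERT-style: every token may attend to every position."""
    return [[1] * n for _ in range(n)]

print(causal_mask(3))        # [[1, 0, 0], [1, 1, 0], [1, 1, 1]]
print(bidirectional_mask(3)) # [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
```

Because BERT's mask is full, predicting a word can use clues from both sides of it, which is exactly what the masked-language-modeling objective below exploits.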

  • **Training Objectives:**
  • Masked Language Modeling (MLM): predict randomly masked tokens from their surrounding context
  • Next Sentence Prediction (NSP): predict whether the second sentence of a pair actually follows the first in the original text
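The MLM objective can be sketched in a few lines. In BERT's recipe, about 15% of token positions are selected for prediction; of those, 80% are replaced with `[MASK]`, 10% with a random vocabulary token, and 10% are left unchanged. The helper below is an illustrative pure-Python sketch of that corruption step, not BERT's actual tokenizer or training code.

```python
import random

def mask_for_mlm(tokens, vocab, mask_prob=0.15, rng=None):
    """Apply BERT's MLM masking recipe to a token list.
    Selects ~mask_prob of positions; of those, 80% become "[MASK]",
    10% a random vocab token, 10% stay unchanged. Returns
    (corrupted, labels): labels hold the original token at selected
    positions and None elsewhere (only selected positions are scored)."""
    rng = rng or random.Random()
    corrupted, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok            # model must recover this token
            r = rng.random()
            if r < 0.8:
                corrupted[i] = "[MASK]"
            elif r < 0.9:
                corrupted[i] = rng.choice(vocab)
            # else: leave the token unchanged (keeps the model honest
            # about positions that look clean but are still scored)
    return corrupted, labels

tokens = "the cat sat on the mat".split()
corrupted, labels = mask_for_mlm(tokens, vocab=tokens, rng=random.Random(0))
```

The 10%-random and 10%-unchanged cases exist because `[MASK]` never appears at inference time; mixing in clean-looking inputs reduces that train/test mismatch.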

  • **Impact:**
  • Released in 2018; transformed NLP
  • Powers Google Search query understanding
  • Spawned many variants (RoBERTa, ALBERT, DistilBERT)

  • **Use Cases:**
  • Text classification
  • Named entity recognition
  • Question answering
  • Sentiment analysis

Examples

Google Search uses BERT to better understand search queries.
