
Embeddings

Numerical representations of data (text, images) that capture semantic meaning.

Definition

Embeddings are dense vector representations of data where similar items are mapped to nearby points in a high-dimensional space.

Why Embeddings Matter:

  • Convert unstructured data into numbers computers can process
  • Capture semantic relationships (king - man + woman ≈ queen)
  • Enable similarity search and clustering
  • Foundation for RAG and recommendation systems
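The "king - man + woman ≈ queen" relationship can be checked with plain vector arithmetic and cosine similarity. Below is a minimal sketch using made-up 3-dimensional toy vectors (real word embeddings from models like Word2Vec have hundreds of dimensions, and these specific numbers are illustrative, not from any trained model):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-d embeddings (invented for illustration only)
king  = [0.9, 0.8, 0.1]
man   = [0.5, 0.9, 0.1]
woman = [0.5, 0.1, 0.9]
queen = [0.9, 0.0, 0.9]

# king - man + woman should land near queen
analogy = [k - m + w for k, m, w in zip(king, man, woman)]
print(cosine_similarity(analogy, queen))  # close to 1.0 with these toy values
```

With real embeddings the analogy vector is rarely an exact match; instead you search for the nearest known word vector to the result.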

Types:

  • Word Embeddings: Represent individual words (Word2Vec, GloVe)
  • Sentence Embeddings: Represent entire sentences
  • Image Embeddings: Represent images (CLIP)

Common Dimensions:

  • OpenAI ada-002: 1,536 dimensions
  • BERT: 768 dimensions
  • Large models: up to 4,096+ dimensions

Examples

Searching for "automobile" and finding documents about "cars" because their embeddings are similar.
