Definition
Embeddings are dense vector representations of data where similar items are mapped to nearby points in a high-dimensional space.
**Why Embeddings Matter:**
- Convert unstructured data to numbers computers can process
- Capture semantic relationships (king - man + woman ≈ queen)
- Enable similarity search and clustering
- Foundation for RAG and recommendation systems
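The king - man + woman ≈ queen analogy can be sketched with toy, hand-picked 3-dimensional vectors. Real embeddings are learned from data and have hundreds or thousands of dimensions; every value below is made up purely for illustration:

```python
import math

# Toy 3-d "embeddings" (hand-picked for illustration, not learned vectors).
vecs = {
    "king":  [0.9, 0.8, 0.1],
    "man":   [0.5, 0.8, 0.1],
    "woman": [0.5, 0.1, 0.9],
    "queen": [0.9, 0.1, 0.9],
}

# king - man + woman: subtract the "male" direction, add the "female" one.
result = [k - m + w for k, m, w in zip(vecs["king"], vecs["man"], vecs["woman"])]

def cosine(a, b):
    """Cosine similarity: 1.0 means the vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# The arithmetic result lands closest to the vector for "queen".
closest = max(vecs, key=lambda word: cosine(result, vecs[word]))
print(closest)  # → queen
```

With these hand-picked vectors the analogy works out exactly; with real learned embeddings the result is only *approximately* nearest to "queen", which is why the relationship is written with ≈.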
**Types:**
- Word Embeddings: Represent individual words (Word2Vec, GloVe)
- Sentence Embeddings: Represent entire sentences
- Image Embeddings: Represent images (CLIP)
**Common Dimensions:**
- OpenAI ada-002: 1,536 dimensions
- BERT: 768 dimensions
- Large models: up to 4,096+ dimensions
Examples
Searching for "automobile" and finding documents about "cars" because their embeddings are similar.
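This kind of lookup is just a nearest-neighbor search by cosine similarity. A minimal sketch, using hand-made toy vectors in place of a real embedding model's output (the documents and all numbers below are invented for illustration):

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# Toy document "embeddings" (a real system would call an embedding model).
doc_embeddings = {
    "a review of new cars":          [0.95, 0.30, 0.05],
    "a recipe for banana bread":     [0.05, 0.10, 0.99],
    "train schedules for commuters": [0.60, 0.75, 0.20],
}

# Pretend this vector came from embedding the query "automobile":
# it points in nearly the same direction as the "cars" document.
query_embedding = [0.92, 0.35, 0.08]

# Return the document whose embedding is most similar to the query's.
best = max(doc_embeddings, key=lambda d: cosine(query_embedding, doc_embeddings[d]))
print(best)  # → a review of new cars
```

No keyword in the query matches the winning document; the match comes entirely from the vectors pointing in similar directions, which is what vector databases optimize at scale.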
Related Terms
Retrieval-Augmented Generation (RAG): A technique combining information retrieval with text generation to improve accuracy.
Vector Database: A database optimized for storing and searching high-dimensional vector embeddings.
Semantic Search: Search that understands meaning and intent, not just keyword matching.