## Definition
GPT is a family of large language models developed by OpenAI, known for their ability to generate human-like text.
- **GPT Evolution:**
  - GPT-1 (2018): 117M parameters, proved the concept
  - GPT-2 (2019): 1.5B parameters, strikingly fluent text generation
  - GPT-3 (2020): 175B parameters, few-shot learning breakthrough
  - GPT-4 (2023): ~1.7T parameters (estimated), multimodal
  - GPT-4o (2024): optimized for speed and multimodal interaction
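GPT-3's few-shot breakthrough means the model can pick up a task purely from examples placed in the prompt, with no weight updates. A minimal sketch of what such a prompt looks like (the translation task and example pairs here are made up for illustration):

```python
# Few-shot prompting: the task is specified entirely through
# in-context examples; the model is expected to continue the
# pattern on the final, incomplete line.
examples = [
    ("sea otter", "loutre de mer"),
    ("cheese", "fromage"),
]

def build_few_shot_prompt(examples, query):
    lines = ["Translate English to French:"]
    for src, tgt in examples:
        lines.append(f"{src} => {tgt}")
    lines.append(f"{query} =>")  # the model completes this line
    return "\n".join(lines)

prompt = build_few_shot_prompt(examples, "plush giraffe")
print(prompt)
```

Sending a prompt like this to a GPT-3-class model typically yields the continuation of the pattern (here, a French translation), which is what "few-shot learning" refers to.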
**Key Features:**
- Decoder-only transformer architecture
- Trained on diverse internet text
- Instruction-tuned versions (InstructGPT, ChatGPT)
- Available via API and the ChatGPT interface
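The "decoder-only" architecture comes down to causal self-attention: each token may only attend to itself and earlier tokens, which is what lets the model generate text left to right. A minimal single-head sketch in NumPy (real GPT layers add learned query/key/value projections, multiple heads, and residual connections):

```python
import numpy as np

def causal_self_attention(x):
    """Single-head self-attention with a causal mask, as used in
    decoder-only transformers: position i may only attend to
    positions <= i. Illustrative sketch only: x serves as queries,
    keys, and values, with no learned projections."""
    seq_len, d = x.shape
    scores = x @ x.T / np.sqrt(d)                    # (seq_len, seq_len)
    # Causal mask: block attention to future positions.
    future = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
    scores = np.where(future, -np.inf, scores)
    # Softmax over the key dimension; exp(-inf) = 0 zeroes out the future.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x, weights

x = np.random.default_rng(0).normal(size=(4, 8))
out, w = causal_self_attention(x)
```

Each row of `w` is a probability distribution over positions up to and including the current one; the strictly upper-triangular entries are zero, which is the "decoder-only" constraint.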
## Examples
ChatGPT is powered by GPT-4-class models, while the API offers a range of GPT models at different capability and price points.
## Related Terms
- **Large Language Model (LLM):** AI models trained on massive text datasets that can understand and generate human-like text.
- **Transformer:** A neural network architecture using self-attention mechanisms, the foundation of modern LLMs.
- **OpenAI:** AI research company that created GPT, ChatGPT, DALL-E, and other leading AI systems.
- **ChatGPT:** OpenAI's conversational AI interface, the most widely used AI chatbot globally.