
GPT (Generative Pre-trained Transformer)

OpenAI's series of large language models that power ChatGPT.

Definition

GPT is a family of large language models developed by OpenAI, known for their ability to generate human-like text.

**GPT Evolution:**
  • GPT-1 (2018): 117M parameters, proved the concept
  • GPT-2 (2019): 1.5B parameters, shockingly good text generation
  • GPT-3 (2020): 175B parameters, few-shot learning breakthrough
  • GPT-4 (2023): ~1.7T parameters (estimated), multimodal
  • GPT-4o (2024): optimized for speed and multimodal interaction

**Key Features:**
  • Decoder-only transformer architecture
  • Trained on diverse internet text
  • Instruction-tuned versions (InstructGPT, ChatGPT)
  • Available via API and the ChatGPT interface
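The "decoder-only" feature above refers to causal self-attention: each token can attend only to itself and earlier tokens. A minimal single-head sketch in NumPy, assuming illustrative weight matrices (this is the masking idea, not GPT's actual configuration):

```python
import numpy as np

def causal_self_attention(x, w_q, w_k, w_v):
    """Single-head self-attention with a causal (decoder-only) mask.

    x: (seq_len, d_model) token embeddings.
    w_q, w_k, w_v: illustrative projection matrices (not GPT's real weights).
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)              # (seq, seq) similarity scores
    # Mask out the strict upper triangle: no attention to future positions.
    mask = np.triu(np.ones_like(scores), k=1).astype(bool)
    scores[mask] = -np.inf
    # Row-wise softmax over the unmasked positions.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

Because of the mask, the first position can only attend to itself, which is what lets GPT-style models generate text one token at a time.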

Examples

ChatGPT is powered by GPT-4-class models, while the OpenAI API offers access to a range of GPT models.
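To make the API route concrete, here is a sketch of the JSON request body sent to OpenAI's Chat Completions endpoint (`POST https://api.openai.com/v1/chat/completions`); the model name and prompt are illustrative, and a real call would also need an API key in an `Authorization: Bearer` header:

```python
import json

# Illustrative request body for the Chat Completions endpoint.
payload = {
    "model": "gpt-4o",  # example model name; check current availability
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain GPT in one sentence."},
    ],
}

# Serialized body as it would be sent over HTTP.
body = json.dumps(payload)
```

The `messages` list is how instruction-tuned GPT models receive conversation context: a system message sets behavior, and user messages carry the prompt.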
