Definition
An epoch is one complete pass through the entire training dataset during model training.
Training Process:
- Multiple epochs are typically needed
- Each epoch: the model sees all training examples once
- Parameters are updated many times per epoch (see the training loop sketch below)
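For illustration, here is a minimal training-loop sketch in PyTorch, assuming a toy dataset and model; the tensor shapes, batch size, learning rate, and epoch count are arbitrary placeholders, not recommendations:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical toy data and model, just to make the loop runnable.
X, y = torch.randn(1000, 20), torch.randint(0, 2, (1000,))
train_loader = DataLoader(TensorDataset(X, y), batch_size=32, shuffle=True)

model = nn.Linear(20, 2)
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

num_epochs = 5
for epoch in range(num_epochs):           # one epoch = one full pass over the data
    for inputs, targets in train_loader:  # ~1000 / 32 ≈ 32 batches per epoch
        optimizer.zero_grad()
        loss = loss_fn(model(inputs), targets)
        loss.backward()
        optimizer.step()                  # parameters updated once per batch
```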
How Many Epochs?
- Depends on dataset size and complexity
- Too few: underfitting
- Too many: overfitting
- Use validation loss to decide
Typical Ranges:
- Image classification: 50-200 epochs
- LLM pre-training: 1-2 epochs (the dataset is huge)
- Fine-tuning: 3-10 epochs
Early Stopping:
- Monitor validation loss
- Stop training when it starts increasing
- Prevents overfitting (see the sketch below)
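To make the stopping rule concrete, here is a minimal sketch of patience-based early stopping. The per-epoch validation losses are made-up numbers standing in for a real training run, and the patience value of 3 is an arbitrary choice:

```python
# Made-up validation losses: they improve for a few epochs, then get worse.
val_losses = [0.90, 0.71, 0.60, 0.55, 0.53, 0.54, 0.56, 0.58, 0.61, 0.65]

best_val_loss = float("inf")
patience = 3                       # how many non-improving epochs to tolerate
epochs_without_improvement = 0

for epoch, val_loss in enumerate(val_losses):
    if val_loss < best_val_loss:
        best_val_loss = val_loss   # new best: keep training (and checkpoint the model here)
        epochs_without_improvement = 0
    else:
        epochs_without_improvement += 1
        if epochs_without_improvement >= patience:
            print(f"Stopping early after epoch {epoch} (best val loss {best_val_loss:.2f})")
            break
```

In practice, frameworks such as Keras and PyTorch Lightning provide early-stopping callbacks that implement this same logic for you.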
Examples
Training for 100 epochs means the model sees each training image 100 times.
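As a back-of-the-envelope check with assumed numbers (a hypothetical dataset of 50,000 images and a batch size of 32), the arithmetic works out as follows:

```python
dataset_size = 50_000   # assumed number of training images
batch_size = 32         # assumed batch size
num_epochs = 100

steps_per_epoch = dataset_size // batch_size   # 1,562 parameter updates per epoch
total_updates = steps_per_epoch * num_epochs   # 156,200 updates over 100 epochs
times_each_image_seen = num_epochs             # each image is seen once per epoch

print(steps_per_epoch, total_updates, times_each_image_seen)
```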