mirror of https://github.com/hpcaitech/ColossalAI
Comparison of Large Batch Training Optimization
Table of contents
- 📚 Overview
- 🚀 Quick Start

📚 Overview
This example lets you quickly try out the large-batch training optimization provided by Colossal-AI. We use a synthetic dataset throughout, so you do not need to prepare any data. You can try out the Lamb and Lars optimizers from Colossal-AI with the following import:
```python
from colossalai.nn.optimizer import Lamb, Lars
```
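To give an intuition for what Lars does, here is a minimal, dependency-free sketch of the layer-wise trust-ratio rule from the LARS paper (You et al., 2017). This is an illustrative simplification (momentum omitted, plain Python lists instead of tensors), not the Colossal-AI implementation:

```python
import math

def lars_step(w, g, lr=0.1, eta=0.001, weight_decay=1e-4):
    """One simplified LARS update for a single layer's weights `w`
    with gradient `g`.

    The layer-wise learning rate is scaled by the trust ratio
    ||w|| / (||g|| + weight_decay * ||w||), so layers whose gradients
    are large relative to their weights take proportionally smaller
    steps. This is what keeps training stable at very large batch sizes.
    """
    w_norm = math.sqrt(sum(x * x for x in w))
    g_norm = math.sqrt(sum(x * x for x in g))
    # Fall back to a ratio of 1.0 when norms are zero (e.g. fresh bias terms).
    if w_norm > 0 and g_norm > 0:
        trust_ratio = w_norm / (g_norm + weight_decay * w_norm)
    else:
        trust_ratio = 1.0
    local_lr = lr * eta * trust_ratio
    # Weight decay is applied inside the scaled update.
    return [wi - local_lr * (gi + weight_decay * wi) for wi, gi in zip(w, g)]
```

In practice you would simply construct `Lars(model.parameters(), ...)` and call `step()` as with any PyTorch-style optimizer; the sketch above only shows the per-layer scaling idea.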
🚀 Quick Start
- Install PyTorch

- Install the dependencies.

  ```bash
  pip install -r requirements.txt
  ```

- Run the training script with synthetic data.

  ```bash
  # run on 4 GPUs
  # run with lars
  colossalai run --nproc_per_node 4 train.py --config config.py --optimizer lars

  # run with lamb
  colossalai run --nproc_per_node 4 train.py --config config.py --optimizer lamb
  ```
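For comparison, Lamb combines the same trust-ratio idea with an Adam-style update, which is what made very large batches (e.g. for BERT pretraining) trainable. The following is a minimal, dependency-free sketch of one LAMB step as described by You et al. (2019); it is an illustrative simplification on plain Python lists, not the Colossal-AI implementation:

```python
import math

def lamb_step(w, g, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
              eps=1e-6, weight_decay=0.01):
    """One simplified LAMB update for a single layer.

    `m` and `v` are the first and second Adam moments, `t` is the
    1-indexed step count used for bias correction. Returns the updated
    (weights, m, v).
    """
    # Adam moment updates.
    new_m = [beta1 * mi + (1 - beta1) * gi for mi, gi in zip(m, g)]
    new_v = [beta2 * vi + (1 - beta2) * gi * gi for vi, gi in zip(v, g)]

    # Bias-corrected Adam direction plus decoupled weight decay.
    r = []
    for wi, mi, vi in zip(w, new_m, new_v):
        m_hat = mi / (1 - beta1 ** t)
        v_hat = vi / (1 - beta2 ** t)
        r.append(m_hat / (math.sqrt(v_hat) + eps) + weight_decay * wi)

    # LARS-style trust ratio rescales the whole layer's step.
    w_norm = math.sqrt(sum(x * x for x in w))
    r_norm = math.sqrt(sum(x * x for x in r))
    trust_ratio = w_norm / r_norm if w_norm > 0 and r_norm > 0 else 1.0

    new_w = [wi - lr * trust_ratio * ri for wi, ri in zip(w, r)]
    return new_w, new_m, new_v
```

The `--optimizer lars` / `--optimizer lamb` flag in `train.py` selects between the two Colossal-AI optimizers so you can compare their behavior on the same synthetic workload.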