# Auto-Parallelism with GPT2

## Requirements

Before you can launch training, you need to install the following requirements.

### Install PyTorch

```bash
# conda
conda install pytorch==1.12.0 torchvision==0.13.0 torchaudio==0.12.0 cudatoolkit=11.3 -c pytorch
# pip
pip install torch==1.12.0+cu113 torchvision==0.13.0+cu113 torchaudio==0.12.0 --extra-index-url https://download.pytorch.org/whl/cu113
```
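
To confirm the CUDA build was picked up, a quick sanity check (plain PyTorch API, nothing example-specific):

```bash
python -c "import torch; print(torch.__version__, torch.version.cuda, torch.cuda.is_available())"
```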

### Install Colossal-AI v0.1.12 from the Official Website

```bash
pip install colossalai==0.1.12+torch1.12cu11.3 -f https://release.colossalai.org
```
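
You can verify the installation the same way, since `colossalai` exposes a version string:

```bash
python -c "import colossalai; print(colossalai.__version__)"
```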

### Install transformers

```bash
pip install transformers
```
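
The example defines its GPT2 model in `gpt_modules.py`; `transformers` supplies the standard GPT2 building blocks it is based on. As a rough sketch of a comparable configuration (the hyperparameters here are illustrative assumptions, not necessarily the example's settings):

```python
from transformers import GPT2Config, GPT2LMHeadModel

# Illustrative GPT2 hyperparameters (assumptions, not the example's actual values).
config = GPT2Config(n_embd=768, n_layer=12, n_head=12, vocab_size=50257)
model = GPT2LMHeadModel(config)
```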

### Install pulp and coin-or-cbc

```bash
pip install pulp
conda install -c conda-forge coin-or-cbc
```
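
`pulp` and the CBC solver back the integer-linear-programming step of the auto-parallel strategy search. A quick way to check that pulp can find a solver on your machine (standard pulp API):

```python
import pulp

# Solvers pulp can actually invoke on this machine; expect an entry such as
# 'PULP_CBC_CMD' or 'COIN_CMD' once coin-or-cbc is installed.
print(pulp.listSolvers(onlyAvailable=True))
```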

## Dataset

For simplicity, the input data is randomly generated here.
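
A minimal sketch of what such dummy GPT2 inputs can look like (the batch size, sequence length, and function name are assumptions for illustration; see `auto_parallel_with_gpt.py` for the actual data generation):

```python
import torch

VOCAB_SIZE = 50257  # standard GPT2 vocabulary size
BATCH_SIZE = 16     # assumption for illustration
SEQ_LENGTH = 1024   # GPT2's maximum context length

def get_random_batch():
    # Random token ids stand in for tokenized text.
    input_ids = torch.randint(0, VOCAB_SIZE, (BATCH_SIZE, SEQ_LENGTH), dtype=torch.long)
    # No padding in the dummy data, so the mask is all ones.
    attention_mask = torch.ones_like(input_ids)
    return input_ids, attention_mask
```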

## Training

Run the auto-parallel GPT2 example on 4 GPUs with a dummy dataset:

```bash
colossalai run --nproc_per_node 4 auto_parallel_with_gpt.py
```
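
`colossalai run` is Colossal-AI's distributed launcher, a wrapper around the standard PyTorch launch mechanism. If the default rendezvous port is occupied on your machine, it can be overridden via the launcher's `--master_port` flag, for example:

```bash
colossalai run --nproc_per_node 4 --master_port 29500 auto_parallel_with_gpt.py
```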