# Auto-Parallelism with GPT2
## Requirements

Before launching training, install the following requirements.
### Install PyTorch

```bash
# conda
conda install pytorch==1.12.0 torchvision==0.13.0 torchaudio==0.12.0 cudatoolkit=11.3 -c pytorch

# pip
pip install torch==1.12.0+cu113 torchvision==0.13.0+cu113 torchaudio==0.12.0 --extra-index-url https://download.pytorch.org/whl/cu113
```
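To confirm the install, a quick sanity check (a minimal sketch; the expected values simply mirror the pinned versions above):

```python
import torch

# Verify the pinned build is installed and CUDA is visible.
print(torch.__version__)          # expected: 1.12.0+cu113 (pip) or 1.12.0 (conda)
print(torch.version.cuda)         # expected: 11.3
print(torch.cuda.is_available())  # expected: True on a GPU machine
```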
### Install Colossal-AI v0.2.0 from the Official Website

```bash
pip install colossalai==0.2.0+torch1.12cu11.3 -f https://release.colossalai.org
```
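You can check that the package resolved to the expected release (assuming, as is usual, that the package exposes `__version__`):

```python
import colossalai

# The wheel pinned above should report 0.2.0.
print(colossalai.__version__)
```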
### Install transformers

```bash
pip install transformers
```
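As a quick smoke test of the transformers install, you can instantiate a tiny GPT-2. This is illustrative only; the example's own model definitions live in gpt_modules.py:

```python
from transformers import GPT2Config, GPT2LMHeadModel

# A deliberately small config so the check runs in seconds.
config = GPT2Config(n_layer=2, n_head=4, n_embd=128)
model = GPT2LMHeadModel(config)
print(sum(p.numel() for p in model.parameters()), "parameters")
```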
### Install pulp and coin-or-cbc

```bash
pip install pulp
conda install -c conda-forge coin-or-cbc
```
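pulp needs a solver backend to actually solve the optimization problems the auto-parallel search produces, which is why coin-or-cbc is required alongside it. You can confirm that pulp can find the CBC solver (`PULP_CBC_CMD` should appear in the output):

```python
import pulp

# List the solver backends that pulp can actually invoke on this machine.
print(pulp.listSolvers(onlyAvailable=True))
```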
## Dataset

For simplicity, the input data is randomly generated here.
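A dummy batch for a GPT-style model is just random token ids plus an attention mask. A minimal sketch of what such a batch might look like (the shapes below are hypothetical placeholders; the actual values are set inside auto_parallel_with_gpt.py):

```python
import torch

# Hypothetical shapes for illustration; see auto_parallel_with_gpt.py for the real ones.
BATCH_SIZE, SEQ_LENGTH, VOCAB_SIZE = 8, 128, 50257

input_ids = torch.randint(0, VOCAB_SIZE, (BATCH_SIZE, SEQ_LENGTH))
attention_mask = torch.ones_like(input_ids)
```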
## Training

```bash
# Run the auto-parallel GPT-2 example on 4 GPUs with a dummy dataset.
colossalai run --nproc_per_node 4 auto_parallel_with_gpt.py
```