## Overview

This example shows how to use ColossalAI to run Hugging Face GPT training in a distributed manner.
## GPT

We use the GPT-2 model from Hugging Face Transformers. The input data is randomly generated.
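For reference, random input of this kind can be produced as in the minimal sketch below. The helper name, batch shape, and defaults are assumptions for illustration only; `train_gpt_demo.py` is the source of truth for the actual data generation.

```python
import torch

# Hypothetical sketch of the random-data approach: sample token ids uniformly
# from the GPT-2 vocabulary (50257 tokens by default). Names and shapes are
# illustrative, not taken from train_gpt_demo.py.
def get_random_batch(batch_size=4, seq_len=1024, vocab_size=50257, device="cuda"):
    input_ids = torch.randint(0, vocab_size, (batch_size, seq_len), device=device)
    attention_mask = torch.ones_like(input_ids)
    return input_ids, attention_mask
```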
## Our Modifications

We adapt the OPT training code to ColossalAI by leveraging Gemini and ZeRO DDP.
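In spirit, the adaptation looks like the sketch below. Treat it as a hedged sketch rather than the demo's exact code: the Gemini import path and constructor arguments are assumptions that have moved between ColossalAI releases, so `train_gpt_demo.py` and the version pinned in `requirements.txt` are authoritative.

```python
# A minimal sketch of wrapping a Hugging Face GPT-2 model with ColossalAI's
# Gemini ZeRO DDP. Import paths and kwargs below are ASSUMPTIONS that vary
# across ColossalAI releases; check train_gpt_demo.py and requirements.txt.
import colossalai
from colossalai.nn.optimizer import HybridAdam
from colossalai.nn.parallel import GeminiDDP  # assumed path; varies by release
from transformers import GPT2Config, GPT2LMHeadModel

colossalai.launch_from_torch(config={})  # older releases take a config dict

model = GPT2LMHeadModel(GPT2Config())
# Gemini manages parameters and gradients in chunks across GPU/CPU memory and
# shards optimizer states ZeRO-style across data-parallel ranks.
model = GeminiDDP(model, device="cuda", placement_policy="cuda")  # assumed kwargs
optimizer = HybridAdam(model.parameters(), lr=1e-3)
```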
## Quick Start

You can launch training with the following commands:

```bash
pip install -r requirements.txt
bash run.sh
```