Files in this example:

- README.md
- requirements.txt
- run.sh
- train_gpt_demo.py
# README.md

## Overview
This example shows how to use ColossalAI to run Hugging Face GPT training with Gemini and ZeRO DDP.
## GPT

We use the Hugging Face `transformers` GPT-2 model. The input data is randomly generated.
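For reference, the snippet below sketches how a randomly generated token batch can be fed to the Hugging Face GPT-2 model; the sizes (`batch_size`, `seq_len`) are illustrative and not taken from train_gpt_demo.py.

```python
import torch
from transformers import GPT2Config, GPT2LMHeadModel

# Illustrative sizes; the values used in train_gpt_demo.py may differ.
batch_size, seq_len, vocab_size = 4, 1024, 50257

model = GPT2LMHeadModel(GPT2Config(vocab_size=vocab_size))

# Random token ids stand in for real text, mirroring the demo's synthetic data.
input_ids = torch.randint(0, vocab_size, (batch_size, seq_len))
attention_mask = torch.ones_like(input_ids)

# GPT-2 computes a causal LM loss when labels are provided.
outputs = model(input_ids=input_ids, attention_mask=attention_mask, labels=input_ids)
loss = outputs.loss
```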
## Our Modifications

We adapt the OPT training code to ColossalAI by leveraging Gemini and ZeRO DDP.
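As a rough illustration (not the exact code in train_gpt_demo.py, which may target a different ColossalAI API version), the sketch below wraps the GPT-2 model with ColossalAI's `Booster` and `GeminiPlugin` so that Gemini and ZeRO DDP manage parameter sharding, memory placement, and optimizer updates:

```python
import torch
import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import GeminiPlugin
from colossalai.nn.optimizer import HybridAdam
from transformers import GPT2Config, GPT2LMHeadModel

# Initialize the distributed environment from torchrun/colossalai-run env vars.
# Note: newer ColossalAI releases drop the `config` argument.
colossalai.launch_from_torch(config={})

# Gemini provides chunk-based, ZeRO-style parameter/gradient sharding with
# automatic CPU/GPU placement; plugin arguments are left at their defaults here.
plugin = GeminiPlugin()
booster = Booster(plugin=plugin)

model = GPT2LMHeadModel(GPT2Config())
optimizer = HybridAdam(model.parameters(), lr=1e-3)

# boost() wraps the model and optimizer so Gemini/ZeRO handle them from now on.
model, optimizer, *_ = booster.boost(model, optimizer)

# One illustrative training step on a random batch.
input_ids = torch.randint(0, 50257, (2, 128), device=torch.cuda.current_device())
loss = model(input_ids=input_ids, labels=input_ids).loss
booster.backward(loss, optimizer)
optimizer.step()
optimizer.zero_grad()
```

Run such a script under a distributed launcher (e.g. `torchrun` or `colossalai run`) so that `launch_from_torch` can read the rank and world-size environment variables.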
## Quick Start

You can launch training with the following commands:

```bash
pip install -r requirements.txt
bash run.sh
```