ColossalAI/examples/tutorial/opt/zero
BoxiangW ca6e75bc28
[tutorial] edited hands-on practices (#1899)
* Add handson to ColossalAI.

* Change names of handsons and edit sequence parallel example.

* Edit wrong folder name

* resolve conflict

* delete readme
2022-11-11 17:08:17 +08:00

README.md

Overview

This example shows how to use ColossalAI to run Hugging Face GPT training with Gemini and ZeRO DDP.

GPT

We use the GPT2 model from Hugging Face Transformers. The input data is randomly generated.
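Since the inputs are synthetic, a data loader can be sketched with plain PyTorch. This is a minimal illustration, not code from `train_gpt_demo.py`; the batch shape, sequence length, and function name are assumptions for the sketch (50257 is GPT2's standard vocabulary size):

```python
import torch

# Hypothetical sketch: build a random batch of token ids for GPT2-style
# training. Shapes are illustrative, not taken from the demo script.
def get_random_batch(batch_size: int, seq_len: int, vocab_size: int = 50257):
    input_ids = torch.randint(0, vocab_size, (batch_size, seq_len))
    attention_mask = torch.ones_like(input_ids)  # no padding in random data
    return input_ids, attention_mask
```

A batch produced this way can be fed directly to `model(input_ids, attention_mask=attention_mask)` on a Transformers GPT2 model.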

Our Modifications

We adapt the OPT training code to ColossalAI by leveraging Gemini and ZeRO DDP for memory-efficient distributed training.

Quick Start

You can install the dependencies and launch training with the following commands:

```bash
pip install -r requirements.txt
bash run.sh
```