Overview

This example shows how to use ColossalAI to run Hugging Face GPT training in a distributed manner.

GPT

We use the Hugging Face Transformers GPT2 model. The input data is randomly generated.
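Since the example trains on synthetic data, no dataset download is needed. A minimal sketch of what random GPT-2 input looks like — the function name and the batch/sequence sizes here are illustrative assumptions, not values taken from train_gpt_demo.py:

```python
import torch

# Illustrative constants (assumptions, not the example's actual settings)
VOCAB_SIZE = 50257  # GPT-2 vocabulary size
SEQ_LEN = 1024      # GPT-2 maximum context length
BATCH_SIZE = 2

def get_random_batch(batch_size=BATCH_SIZE, seq_len=SEQ_LEN):
    """Generate one batch of random token IDs plus an all-ones attention mask."""
    input_ids = torch.randint(0, VOCAB_SIZE, (batch_size, seq_len))
    attention_mask = torch.ones_like(input_ids)
    return input_ids, attention_mask

ids, mask = get_random_batch()
print(ids.shape)  # torch.Size([2, 1024])
```

Random token IDs are enough to exercise the forward/backward pass and measure throughput, which is all a benchmarking example needs.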

Our Modifications

We adapt the OPT training code to ColossalAI by leveraging Gemini and ZeRO DDP.
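ZeRO's core idea is to partition optimizer states (and optionally gradients and parameters) across data-parallel ranks so each rank stores only a fraction of them. The toy single-process sketch below illustrates only that partitioning arithmetic — it is not ColossalAI's implementation, and all names in it are hypothetical:

```python
import torch

# Toy illustration of ZeRO-style state partitioning (hypothetical helper,
# not ColossalAI's Gemini/ZeRO DDP code).
def partition_params(params, world_size):
    """Assign each parameter tensor to one rank, round-robin, so each rank
    holds optimizer states for roughly 1/world_size of the parameters."""
    shards = [[] for _ in range(world_size)]
    for i, p in enumerate(params):
        shards[i % world_size].append(p)
    return shards

params = [torch.zeros(10), torch.zeros(20), torch.zeros(30), torch.zeros(40)]
shards = partition_params(params, world_size=2)
print([sum(p.numel() for p in s) for s in shards])  # [40, 60]
```

Gemini extends this by managing the partitioned tensors across GPU and CPU memory dynamically, which is what lets the example scale to models larger than a single GPU's memory.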

Quick Start

You can launch training with the following commands:

pip install -r requirements.txt
bash run.sh
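For multi-GPU runs, a script like run.sh typically invokes ColossalAI's CLI launcher; a hedged sketch of such a launch command (the GPU count is an assumption — check the actual run.sh for the flags it sets):

```shell
# Hypothetical launch configuration; the real run.sh may use different flags.
# Spawns one training process per GPU on the local node.
colossalai run --nproc_per_node 2 train_gpt_demo.py
```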