Overview

This example shows how to use ColossalAI to run huggingface GPT training in a distributed manner.

GPT

We use the huggingface transformers GPT2 model. The input data is randomly generated.
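
Because the data is synthetic, no dataset needs to be downloaded. As a rough sketch of the idea (the batch size, sequence length, and helper name below are illustrative, not taken from train_gpt_demo.py), random batches can be produced like this:

import torch

VOCAB_SIZE = 50257   # GPT2's default vocabulary size
SEQ_LEN = 1024       # illustrative sequence length
BATCH_SIZE = 8       # illustrative batch size

def get_random_batch(batch_size=BATCH_SIZE, seq_len=SEQ_LEN, device="cuda"):
    # Random token ids in [0, VOCAB_SIZE) plus an all-ones attention mask.
    input_ids = torch.randint(0, VOCAB_SIZE, (batch_size, seq_len), device=device)
    attention_mask = torch.ones_like(input_ids)
    return input_ids, attention_mask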

Our Modifications

We adapt the OPT training code to ColossalAI by leveraging Gemini and ZeRO DDP.
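
As a minimal sketch of what that wrapping looks like (the GeminiDDP import path and its arguments are assumptions based on the 0.1.x-era ColossalAI API and may differ in other releases; the actual setup lives in train_gpt_demo.py):

import colossalai
from transformers import GPT2Config, GPT2LMHeadModel
from colossalai.nn.parallel import GeminiDDP     # assumed import path (0.1.x-era API)
from colossalai.utils import get_current_device

# Read the distributed environment set up by a torchrun-style launcher.
colossalai.launch_from_torch(config={})

# Plain huggingface GPT2 model.
model = GPT2LMHeadModel(GPT2Config())

# Gemini + ZeRO DDP: parameters and gradients are managed in chunks and can be
# kept on CPU and moved to GPU on demand, depending on the placement policy.
model = GeminiDDP(model,
                  device=get_current_device(),
                  placement_policy="cpu",   # assumed example; "cuda" keeps chunks on GPU
                  pin_memory=True)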

Quick Start

You can install the dependencies and launch training with the following commands:

pip install -r requirements.txt
bash run.sh
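
run.sh is expected to launch train_gpt_demo.py on multiple GPUs through a distributed launcher; the number of GPUs and any parallelism settings (for example the tensor-parallel degree) are set inside the script, so open it and adjust those values for your hardware before running.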