## Overview
This example shows how to use ColossalAI to run Hugging Face GPT training in a distributed manner.
## GPT
We use the GPT2 model from the Hugging Face transformers library. The input data is randomly generated.
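
Concretely, the data pipeline amounts to sampling random token ids and feeding them to the model. The sketch below illustrates the idea; the model size, batch shape, and sequence length are illustrative assumptions, not values taken from this example's code:

```python
import torch
from transformers import GPT2Config, GPT2LMHeadModel

# Build GPT2 from a config; these sizes match the stock GPT2-small defaults.
config = GPT2Config(n_embd=768, n_layer=12, n_head=12)
model = GPT2LMHeadModel(config)

# Random token ids stand in for real text. Passing labels=input_ids makes the
# model compute the standard causal language-modeling loss internally.
batch_size, seq_len = 4, 128  # illustrative values
input_ids = torch.randint(0, config.vocab_size, (batch_size, seq_len))
attention_mask = torch.ones_like(input_ids)
loss = model(input_ids=input_ids, attention_mask=attention_mask, labels=input_ids).loss
loss.backward()
```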
## Our Modifications
We adapt the OPT training code to ColossalAI by leveraging Gemini and ZeRO DDP.
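
Gemini manages heterogeneous (GPU plus CPU) memory placement for ZeRO-style data parallelism. The sketch below shows roughly how the model gets wrapped; import paths and keyword arguments shift between ColossalAI releases, so treat the names here as assumptions and defer to the version pinned in requirements.txt:

```python
import colossalai
from colossalai.nn.optimizer import HybridAdam
from colossalai.utils import get_current_device
from transformers import GPT2Config, GPT2LMHeadModel

# Assumed import path; GeminiDDP has moved between ColossalAI releases.
from colossalai.nn.parallel import GeminiDDP

# Set up the distributed environment (rank/world size come from the launcher).
colossalai.launch_from_torch(config={})

model = GPT2LMHeadModel(GPT2Config())

# Gemini manages placement of parameters, gradients, and optimizer states
# across GPU and CPU memory; "auto" lets it decide at runtime. The keyword
# names are assumptions; check the ColossalAI version you installed.
model = GeminiDDP(model, device=get_current_device(),
                  placement_policy="auto", pin_memory=True)

# HybridAdam is ColossalAI's CPU/GPU hybrid Adam, commonly paired with Gemini.
optimizer = HybridAdam(model.parameters(), lr=1e-3)
```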
## Quick Start
You can launch training with the following bash script:
```bash
pip install -r requirements.txt
bash run.sh
```