# Finetune BERT on GLUE

## 🚀 Quick Start

This example provides a training script that fine-tunes BERT on the GLUE benchmark using Colossal-AI's Booster API.

- Training Arguments
  - `-t`, `--task`: GLUE task to run. Defaults to `mrpc`.
  - `-p`, `--plugin`: Plugin to use. Choices: `torch_ddp`, `torch_ddp_fp16`, `gemini`, `low_level_zero`. Defaults to `torch_ddp` (see the sketch after this list for how the choice maps to a plugin).
  - `--target_f1`: Target F1 score. An exception is raised if the target is not reached. Defaults to `None`.
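
A minimal sketch of how the `-p`/`--plugin` choice could map onto Booster plugins, using the public `colossalai.booster` API; the helper `build_booster` and the exact plugin arguments are illustrative, not copied from `finetune.py`:

```python
# Sketch only: maps the CLI plugin name to a Booster; not the exact finetune.py code.
from colossalai.booster import Booster
from colossalai.booster.plugin import GeminiPlugin, LowLevelZeroPlugin, TorchDDPPlugin

def build_booster(plugin_name: str) -> Booster:
    mixed_precision = None
    if plugin_name in ("torch_ddp", "torch_ddp_fp16"):
        plugin = TorchDDPPlugin()
        # fp16 with DDP is enabled through Booster's mixed_precision argument
        mixed_precision = "fp16" if plugin_name == "torch_ddp_fp16" else None
    elif plugin_name == "gemini":
        plugin = GeminiPlugin()  # Gemini manages mixed precision internally
    elif plugin_name == "low_level_zero":
        plugin = LowLevelZeroPlugin()
    else:
        raise ValueError(f"unknown plugin: {plugin_name}")
    return Booster(plugin=plugin, mixed_precision=mixed_precision)

booster = build_booster("torch_ddp_fp16")
```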

### Install requirements

```bash
pip install -r requirements.txt
```

### Train

```bash
# train with torch DDP with fp32
colossalai run --nproc_per_node 4 finetune.py

# train with torch DDP with mixed precision training
colossalai run --nproc_per_node 4 finetune.py -p torch_ddp_fp16

# train with gemini
colossalai run --nproc_per_node 4 finetune.py -p gemini

# train with low level zero
colossalai run --nproc_per_node 4 finetune.py -p low_level_zero
```
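
Under the hood, these commands drive a Booster training loop. The sketch below shows the general flow under stated assumptions: `train_dataset` is a placeholder for a tokenized GLUE/MRPC dataset, and the hyperparameters are illustrative rather than the exact values in `finetune.py`. `prepare_dataloader` is the plugin method that builds a dataloader with the distributed sampler for the chosen parallelism.

```python
# Sketch of the Booster training flow; train_dataset is a hypothetical placeholder.
import colossalai
import torch
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin
from transformers import BertForSequenceClassification

colossalai.launch_from_torch(config={})  # set up the distributed env under `colossalai run`

plugin = TorchDDPPlugin()
booster = Booster(plugin=plugin)

model = BertForSequenceClassification.from_pretrained("bert-base-uncased")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# Build a distributed dataloader via the plugin
train_dataloader = plugin.prepare_dataloader(
    train_dataset, batch_size=32, shuffle=True, drop_last=True)

# boost() wraps the components so forward/backward run under the plugin
model, optimizer, _, train_dataloader, _ = booster.boost(
    model, optimizer, dataloader=train_dataloader)

model.train()
for batch in train_dataloader:
    loss = model(**batch).loss
    booster.backward(loss, optimizer)  # plugin-aware backward (handles fp16/ZeRO)
    optimizer.step()
    optimizer.zero_grad()
```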

The expected F1 scores are:

| Model | Single-GPU Baseline FP32 | Booster DDP with FP32 | Booster DDP with FP16 | Booster Gemini | Booster Low Level Zero |
| :---: | :---: | :---: | :---: | :---: | :---: |
| bert-base-uncased | 0.86 | 0.88 | 0.87 | 0.88 | 0.89 |
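
For reference, MRPC F1 is conventionally computed with the standard GLUE metric. A small sketch using the Hugging Face `evaluate` library (this mirrors standard GLUE evaluation and is not necessarily the exact code in `finetune.py`):

```python
# Toy example of the GLUE/MRPC metric, which reports both accuracy and F1.
import evaluate

metric = evaluate.load("glue", "mrpc")
metric.add_batch(predictions=[1, 0, 1, 1], references=[1, 0, 0, 1])  # toy batch
print(metric.compute())  # {'accuracy': 0.75, 'f1': 0.8}
```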