Finetune BERT on GLUE

🚀 Quick Start

This example provides a training script that finetunes BERT on the GLUE dataset.

  • Training Arguments
    • -t, --task: GLUE task to run. Defaults to mrpc.
    • -p, --plugin: Plugin to use. Choices: torch_ddp, torch_ddp_fp16, gemini, low_level_zero. Defaults to torch_ddp (see the sketch after this list).
    • --target_f1: Target F1 score. An exception is raised if the target is not reached. Defaults to None.
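
The plugin names above correspond to ColossalAI Booster plugins. Below is a minimal sketch, assuming the colossalai.booster API, of how the -p/--plugin choice might be mapped to a Booster; it is not the exact finetune.py code, and the helper name build_booster and the default plugin arguments are illustrative assumptions.

# Sketch (assumption, not the exact finetune.py code): mapping the -p/--plugin
# choice onto a ColossalAI Booster. build_booster is a hypothetical helper;
# plugin constructors are used with their default arguments here.
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin, GeminiPlugin, LowLevelZeroPlugin

def build_booster(plugin_name: str) -> Booster:
    if plugin_name == "torch_ddp":
        return Booster(plugin=TorchDDPPlugin())
    if plugin_name == "torch_ddp_fp16":
        # same DDP plugin, with fp16 mixed precision handled by the Booster
        return Booster(plugin=TorchDDPPlugin(), mixed_precision="fp16")
    if plugin_name == "gemini":
        return Booster(plugin=GeminiPlugin())
    if plugin_name == "low_level_zero":
        return Booster(plugin=LowLevelZeroPlugin())
    raise ValueError(f"unknown plugin: {plugin_name}")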

Install requirements

pip install -r requirements.txt

Train

# train with torch DDP with fp32
colossalai run --nproc_per_node 4 finetune.py

# train with torch DDP with mixed precision training
colossalai run --nproc_per_node 4 finetune.py -p torch_ddp_fp16

# train with gemini
colossalai run --nproc_per_node 4 finetune.py -p gemini

# train with low level zero
colossalai run --nproc_per_node 4 finetune.py -p low_level_zero
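
Whichever plugin is selected, training follows the same Booster pattern. The snippet below is a condensed, self-contained sketch of that pattern with a stand-in linear model rather than BERT; the model, optimizer, and data are placeholders, and the launch call signature may differ between ColossalAI versions.

# Condensed sketch of the Booster training pattern (illustrative; uses a tiny
# stand-in model instead of BERT). Launch it with `colossalai run --nproc_per_node N ...`.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin

colossalai.launch_from_torch(config={})  # newer versions may take no config argument

booster = Booster(plugin=TorchDDPPlugin())

model = nn.Linear(16, 2)                                   # placeholder for the BERT classifier
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
criterion = nn.CrossEntropyLoss()
dataset = TensorDataset(torch.randn(64, 16), torch.randint(0, 2, (64,)))
train_loader = DataLoader(dataset, batch_size=8, shuffle=True)

# boost() wraps the objects so the chosen plugin (DDP, Gemini, ZeRO, ...) takes effect
model, optimizer, criterion, train_loader, _ = booster.boost(
    model, optimizer, criterion=criterion, dataloader=train_loader
)

for inputs, labels in train_loader:
    loss = criterion(model(inputs.cuda()), labels.cuda())
    booster.backward(loss, optimizer)                      # plugin-aware backward pass
    optimizer.step()
    optimizer.zero_grad()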

The expected F1 scores are:

| Model | Single-GPU Baseline (FP32) | Booster DDP with FP32 | Booster DDP with FP16 | Booster Gemini | Booster Low Level Zero |
| --- | --- | --- | --- | --- | --- |
| bert-base-uncased | 0.86 | 0.88 | 0.87 | 0.88 | 0.89 |