ColossalAI/examples/language/bert
wukong1992 a55fb00c18
[booster] update bert example, using booster api (#3885)
1 year ago

Overview

This directory includes two parts: fine-tuning Hugging Face BERT and ALBERT models with the Booster API, and benchmarking BERT and ALBERT models with different Booster plugins.

Finetune

bash test_ci.sh

Benchmark

bash benchmark.sh

The benchmark currently reports these metrics: peak CUDA memory usage, throughput, and the number of model parameters. If you need custom metrics, you can add them in benchmark_utils.py.
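As a rough illustration of how a custom metric could be computed (this is a standalone sketch, not the actual API of benchmark_utils.py; the function name and parameters are hypothetical), throughput in samples per second can be measured by timing repeated training steps:

```python
import time


def measure_throughput(step_fn, num_steps: int, samples_per_step: int) -> float:
    """Run `step_fn` `num_steps` times and return throughput in samples/s.

    `step_fn` is a stand-in for one training step; in a real benchmark it
    would be a forward/backward pass over a batch of `samples_per_step`
    samples.
    """
    start = time.perf_counter()
    for _ in range(num_steps):
        step_fn()
    elapsed = time.perf_counter() - start
    return num_steps * samples_per_step / elapsed
```

A metric like this would be recorded per plugin configuration, which is how the throughput columns in the tables below are typically produced.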

Results

BERT

plugin           max CUDA mem   throughput (samples/s)   params
ddp              21.44 GB       3.0                      82M
ddp_fp16         16.26 GB       11.3                     82M
gemini           11.0 GB        12.9                     82M
low_level_zero   11.29 GB       14.7                     82M

ALBERT

plugin           max CUDA mem   throughput (samples/s)   params
ddp              OOM            -                        -
ddp_fp16         OOM            -                        -
gemini           69.39 GB       1.3                      208M
low_level_zero   56.89 GB       1.4                      208M