# OPT

Meta recently released Open Pretrained Transformer (OPT), a 175-billion-parameter AI language model, which enables AI practitioners to perform various downstream tasks and application deployments.

The following example from Colossal-AI demonstrates fine-tuning causal language modelling at low cost.

We use the pre-trained weights of the OPT model provided by the Hugging Face Hub and fine-tune on the raw WikiText-2 dataset (no tokens were replaced before tokenization). This training script is adapted from the Hugging Face Language Modelling examples.
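
For context, the snippet below is a minimal sketch (not part of this example's scripts) of how such pre-trained weights and the raw WikiText-2 dataset are typically loaded with the Hugging Face `transformers` and `datasets` libraries. The model size `facebook/opt-125m` is only an illustrative assumption; the training script may use a different OPT variant.

```python
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load an OPT checkpoint from the Hugging Face Hub (size chosen for illustration).
tokenizer = AutoTokenizer.from_pretrained("facebook/opt-125m")
model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

# "wikitext-2-raw-v1" is the raw variant, i.e. no tokens were replaced.
raw_datasets = load_dataset("wikitext", "wikitext-2-raw-v1")

def tokenize(examples):
    return tokenizer(examples["text"])

# Tokenize the raw text column for causal language modelling.
tokenized_datasets = raw_datasets.map(tokenize, batched=True, remove_columns=["text"])
```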

## Our Modifications

We adapt the OPT training code to ColossalAI by leveraging Gemini and ZeRO DDP.
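
As a rough illustration, the sketch below shows how a Hugging Face OPT model can be wrapped with ColossalAI's Booster API and the Gemini plugin (which combines heterogeneous memory management with ZeRO-style partitioning). This is an assumption-laden sketch: the actual `train_gemini_opt.py` may use a different ColossalAI API depending on its version, and the model size, optimizer, and learning rate shown here are placeholders.

```python
# Minimal sketch, assuming ColossalAI's Booster API with the Gemini plugin.
# The real training script may differ; values below are illustrative only.
import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import GeminiPlugin
from colossalai.nn.optimizer import HybridAdam
from transformers import AutoModelForCausalLM

# Initialize distributed training (run via `torchrun` or `colossalai run`).
# Older ColossalAI versions require a `config` dict; newer ones accept no argument.
colossalai.launch_from_torch(config={})

model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")
optimizer = HybridAdam(model.parameters(), lr=2e-5)

# Gemini manages parameters/gradients across GPU and CPU memory and applies
# ZeRO-style partitioning across data-parallel ranks.
plugin = GeminiPlugin()
booster = Booster(plugin=plugin)
model, optimizer, *_ = booster.boost(model, optimizer)
```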

## Quick Start

You can launch training by using the following bash script:

```bash
bash ./run_gemini.sh
```