ColossalAI/examples/language/gpt/hybridparallelism
Latest commit 8fd25d6e09 by Wenxuan Tan: [Feature] Split cross-entropy computation in SP (#5959), 3 months ago
File           Last commit                                                               Age
benchmark.py   [Feature] Split cross-entropy computation in SP (#5959)                   3 months ago
data.py        [hotfix] Fix examples no pad token & auto parallel codegen bug; (#5606)   7 months ago
finetune.py    [Feature]: support FP8 communication in DDP, FSDP, Gemini (#5928)         4 months ago
run.sh         [example] add gpt2 HybridParallelPlugin example (#4653)                   1 year ago
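
The files above make up a GPT2 fine-tuning example built on ColossalAI's HybridParallelPlugin and Booster APIs. Below is a minimal sketch of how such a script typically wires the plugin up. The parallel degrees (tp_size, pp_size), microbatch count, and GPT2 model construction here are illustrative assumptions, not necessarily the settings finetune.py actually uses, and the launch call signature varies across ColossalAI versions.

    # Sketch of HybridParallelPlugin setup (assumed hyperparameters, not the example's exact config)
    import torch
    import colossalai
    from colossalai.booster import Booster
    from colossalai.booster.plugin import HybridParallelPlugin
    from transformers import GPT2Config, GPT2LMHeadModel

    # Expects torchrun to have populated the distributed env vars;
    # older ColossalAI versions required a `config` argument here.
    colossalai.launch_from_torch()

    plugin = HybridParallelPlugin(
        tp_size=2,           # tensor-parallel degree (assumed)
        pp_size=2,           # pipeline-parallel degree (assumed)
        num_microbatches=4,  # microbatches per pipeline step (assumed)
        precision="fp16",
    )
    booster = Booster(plugin=plugin)

    model = GPT2LMHeadModel(GPT2Config())
    optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

    # HuggingFace models return the loss on their output object
    def criterion(outputs, inputs):
        return outputs.loss

    # boost() wraps model/optimizer/criterion for the chosen parallel layout
    model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

A script like this is started with torchrun (e.g. `torchrun --standalone --nproc_per_node=4 finetune.py`), which run.sh presumably wraps; check run.sh for the example's actual launch command and arguments.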