ColossalAI/examples/language
Latest commit: c54c4fcd15 by botbw, "[hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048)", 3 months ago
| Name                     | Last commit                                                            | Last updated |
|--------------------------|------------------------------------------------------------------------|--------------|
| bert                     | [Feature]: support FP8 communication in DDP, FSDP, Gemini (#5928)      | 4 months ago |
| commons                  | [example] make gpt example directory more clear (#2353)                | 2 years ago  |
| deepseek                 | [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048)      | 3 months ago |
| gpt                      | [Feature] Split cross-entropy computation in SP (#5959)                | 3 months ago |
| grok-1                   | [misc] refactor launch API and tensor constructor (#5666)              | 7 months ago |
| llama                    | [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048)      | 3 months ago |
| mixtral                  | [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048)      | 3 months ago |
| opt                      | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016)      | 3 months ago |
| palm                     | [misc] refactor launch API and tensor constructor (#5666)              | 7 months ago |
| __init__.py              | [example] add gpt2 benchmark example script. (#5295)                   | 9 months ago |
| data_utils.py            | [devops] remove post commit ci (#5566)                                 | 8 months ago |
| model_utils.py           | [example] add gpt2 benchmark example script. (#5295)                   | 9 months ago |
| performance_evaluator.py | [Feature] Split cross-entropy computation in SP (#5959)                | 3 months ago |