ColossalAI/examples/language

Latest commit: 64f74a157e by flybird11111 — [NPU]support npu (#6089), 5 days ago
| Name | Last commit | Age |
|---|---|---|
| bert | [NPU]support npu (#6089) | 5 days ago |
| commons | [example] make gpt example directory more clear (#2353) | 2 years ago |
| deepseek | [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048) | 3 months ago |
| gpt | [Feature] Split cross-entropy computation in SP (#5959) | 3 months ago |
| grok-1 | [misc] refactor launch API and tensor constructor (#5666) | 7 months ago |
| llama | [NPU]support npu (#6089) | 5 days ago |
| mixtral | [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048) | 3 months ago |
| opt | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 3 months ago |
| palm | [misc] refactor launch API and tensor constructor (#5666) | 7 months ago |
| __init__.py | [example]add gpt2 benchmark example script. (#5295) | 9 months ago |
| data_utils.py | [devops] remove post commit ci (#5566) | 8 months ago |
| model_utils.py | [example]add gpt2 benchmark example script. (#5295) | 9 months ago |
| performance_evaluator.py | [Feature] Split cross-entropy computation in SP (#5959) | 3 months ago |