ColossalAI/tests
Latest commit: 4fa6b9509c by botbw, "[moe] add parallel strategy for shared_expert && fix test for deepseek (#6063)", 2 months ago
kit                  [Feature] Split cross-entropy computation in SP (#5959)                          3 months ago
test_analyzer
test_auto_parallel
test_autochunk
test_booster         [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016)                3 months ago
test_checkpoint_io   [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016)                3 months ago
test_cluster
test_config
test_device
test_fp8             [fp8] Disable all_gather intranode. Disable Redundant all_gather fp8 (#6059)     3 months ago
test_fx              [hotfix] fix testcase in test_fx/test_tracer (#5779)                             6 months ago
test_infer           [release] update version (#6041)                                                 3 months ago
test_lazy            [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016)                3 months ago
test_legacy          [FP8] rebase main (#5963)                                                        4 months ago
test_lora            [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016)                3 months ago
test_moe             [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048)                3 months ago
test_optimizer       [MoE/ZeRO] Moe refactor with zero refactor (#5821)                               5 months ago
test_pipeline        [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016)                3 months ago
test_shardformer     [moe] add parallel strategy for shared_expert && fix test for deepseek (#6063)   2 months ago
test_smoothquant
test_tensor
test_zero            [FP8] rebase main (#5963)                                                        4 months ago
__init__.py