Making large AI models cheaper, faster and more accessible
Latest commit: ff14144d9c by botbw: [tmp] add write_tensor (4 weeks ago)
_C Clean up 6 months ago
_analyzer [test] Fix/fix testcase (#5770) 6 months ago
accelerator [hotfix] fix typo change MoECheckpintIO to MoECheckpointIO (#5335) 9 months ago
amp [npu] change device to accelerator api (#5239) 11 months ago
auto_parallel [pre-commit.ci] pre-commit autoupdate (#5572) 5 months ago
autochunk [hotfix] Fix examples no pad token & auto parallel codegen bug; (#5606) 7 months ago
booster [shardformer] optimize seq parallelism (#6086) 1 month ago
checkpoint_io [colossalai/checkpoint_io/...] fix bug in load_state_dict_into_model; format error msg (#6020) 3 months ago
cli [devops] fix extention building (#5427) 9 months ago
cluster [FP8] rebase main (#5963) 4 months ago
context [Fix]: implement thread-safety singleton to avoid deadlock for very large-scale training scenarios (#5625) 7 months ago
device [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694) 6 months ago
fx [test] Fix/fix testcase (#5770) 6 months ago
inference [shardformer] fix linear 1d row and support uneven splits for fused qkv linear (#6084) 1 month ago
interface [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694) 6 months ago
kernel [pre-commit.ci] auto fixes from pre-commit.com hooks 2 months ago
lazy [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) 3 months ago
legacy [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) 3 months ago
logging [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) 3 months ago
moe [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048) 2 months ago
nn [misc] fix dist logger (#5782) 6 months ago
pipeline [fp8] hotfix backward hook (#6053) 2 months ago
quantization [fp8] Disable all_gather intranode. Disable Redundant all_gather fp8 (#6059) 2 months ago
shardformer [Coati] Train DPO using PP (#6054) 1 month ago
tensor [fp8] support fp8 amp for hybrid parallel plugin (#5975) 4 months ago
testing [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) 3 months ago
utils [tmp] add write_tensor 4 weeks ago
zero [fp8] Disable all_gather intranode. Disable Redundant all_gather fp8 (#6059) 2 months ago
__init__.py [devops] remove post commit ci (#5566) 8 months ago
initialize.py [fp8] hotfix backward hook (#6053) 2 months ago