Making large AI models cheaper, faster and more accessible
Directory listing of the `colossalai/` package. Latest commit: ff14144d9c by botbw, "[tmp] add write_tensor" (1 month ago).
| Name | Last commit | Age |
| --- | --- | --- |
| _C | Clean up | 6 months ago |
| _analyzer | [test] Fix/fix testcase (#5770) | 6 months ago |
| accelerator | | |
| amp | | |
| auto_parallel | [pre-commit.ci] pre-commit autoupdate (#5572) | 5 months ago |
| autochunk | | |
| booster | [shardformer] optimize seq parallelism (#6086) | 1 month ago |
| checkpoint_io | [colossalai/checkpoint_io/...] fix bug in load_state_dict_into_model; format error msg (#6020) | 3 months ago |
| cli | | |
| cluster | [FP8] rebase main (#5963) | 4 months ago |
| context | | |
| device | | |
| fx | [test] Fix/fix testcase (#5770) | 6 months ago |
| inference | [shardformer] fix linear 1d row and support uneven splits for fused qkv linear (#6084) | 1 month ago |
| interface | | |
| kernel | [pre-commit.ci] auto fixes from pre-commit.com hooks | 2 months ago |
| lazy | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 3 months ago |
| legacy | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 3 months ago |
| logging | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 3 months ago |
| moe | [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048) | 2 months ago |
| nn | [misc] fix dist logger (#5782) | 6 months ago |
| pipeline | [fp8] hotfix backward hook (#6053) | 2 months ago |
| quantization | [fp8] Disable all_gather intranode. Disable Redundant all_gather fp8 (#6059) | 2 months ago |
| shardformer | [Coati] Train DPO using PP (#6054) | 1 month ago |
| tensor | [fp8] support fp8 amp for hybrid parallel plugin (#5975) | 4 months ago |
| testing | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 3 months ago |
| utils | [tmp] add write_tensor | 1 month ago |
| zero | [fp8] Disable all_gather intranode. Disable Redundant all_gather fp8 (#6059) | 2 months ago |
| __init__.py | | |
| initialize.py | [fp8] hotfix backward hook (#6053) | 2 months ago |
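The subpackages above map onto ColossalAI's public training API: `initialize.py` sets up the distributed environment, and `booster/` provides the `Booster` interface through which a model is wired into a parallelism/precision plugin. As a minimal sketch of how these pieces fit together (the model, optimizer, and training data here are hypothetical placeholders, and exact signatures may vary between releases):

```python
# Minimal sketch, not taken from this repo's docs: boosting a toy model with
# the Booster API from the `booster/` subpackage. Assumes one GPU and a
# torchrun launch so the distributed env vars are set, e.g.:
#   torchrun --nproc_per_node=1 train.py
import torch

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin

# Set up the distributed environment (entry point lives in initialize.py).
# Note: older releases required a config dict argument here.
colossalai.launch_from_torch()

# Placeholder model/optimizer/criterion; any torch.nn.Module works.
model = torch.nn.Linear(32, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = torch.nn.CrossEntropyLoss()

# The plugin selects the parallelism/precision strategy; TorchDDPPlugin is
# the simplest choice. boost() wraps model and optimizer accordingly.
booster = Booster(plugin=TorchDDPPlugin())
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

# One hypothetical training step on random data.
x = torch.randn(8, 32).cuda()
y = torch.randint(0, 2, (8,)).cuda()
loss = criterion(model(x), y)
booster.backward(loss, optimizer)  # use booster.backward, not loss.backward
optimizer.step()
optimizer.zero_grad()
```

Other plugins under `booster/` (for example the hybrid-parallel and ZeRO-backed ones) build on the `shardformer`, `pipeline`, and `zero` subpackages listed above, so swapping the plugin changes the parallelization strategy without changing the training loop.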