Making large AI models cheaper, faster and more accessible
Latest commit `c2e8f61592` by Hongxin Liu: [checkpointio] fix hybrid plugin model save (#6106), 3 weeks ago
| Name | Last commit | Updated |
| --- | --- | --- |
| _C | Clean up | 6 months ago |
| _analyzer | [test] Fix/fix testcase (#5770) | 6 months ago |
| accelerator | [misc] fit torch api upgradation and remove legecy import (#6093) | 1 month ago |
| amp | [npu] change device to accelerator api (#5239) | 11 months ago |
| auto_parallel | [pre-commit.ci] pre-commit autoupdate (#5572) | 5 months ago |
| autochunk | [hotfix] Fix examples no pad token & auto parallel codegen bug; (#5606) | 7 months ago |
| booster | pre-commit fix | 1 month ago |
| checkpoint_io | [checkpointio] fix hybrid plugin model save (#6106) | 3 weeks ago |
| cli | [devops] fix extention building (#5427) | 9 months ago |
| cluster | [FP8] rebase main (#5963) | 4 months ago |
| context | [Fix]: implement thread-safety singleton to avoid deadlock for very large-scale training scenarios (#5625) | 7 months ago |
| device | [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694) | 6 months ago |
| fx | [test] Fix/fix testcase (#5770) | 6 months ago |
| inference | [shardformer] fix linear 1d row and support uneven splits for fused qkv linear (#6084) | 1 month ago |
| interface | [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694) | 6 months ago |
| kernel | [misc] fit torch api upgradation and remove legecy import (#6093) | 1 month ago |
| lazy | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 3 months ago |
| legacy | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 3 months ago |
| logging | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 3 months ago |
| moe | [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048) | 2 months ago |
| nn | [misc] fix dist logger (#5782) | 6 months ago |
| pipeline | [misc] fit torch api upgradation and remove legecy import (#6093) | 1 month ago |
| quantization | [fp8] add fallback and make compile option configurable (#6092) | 1 month ago |
| shardformer | [Ring Attention] Improve comments (#6085) | 1 month ago |
| tensor | [fp8] support fp8 amp for hybrid parallel plugin (#5975) | 4 months ago |
| testing | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 3 months ago |
| utils | [checkpointio] fix hybrid plugin model save (#6106) | 3 weeks ago |
| zero | [checkpointio] fix hybrid plugin model save (#6106) | 3 weeks ago |
| __init__.py | [devops] remove post commit ci (#5566) | 8 months ago |
| initialize.py | [fp8] hotfix backward hook (#6053) | 2 months ago |
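For orientation, several of these directories correspond to the library's public training workflow: `booster/` wraps the model and optimizer, `zero/` provides the ZeRO sharding plugins, and `checkpoint_io/` handles (sharded) checkpoint save/load, the area touched by the #6106 fix above. The following is a minimal sketch of how they fit together, assuming the standard Booster API (`Booster`, `LowLevelZeroPlugin`, `booster.save_model`); exact signatures may vary between Colossal-AI releases.

```python
# Minimal sketch (not taken from this repo's docs) of the Booster workflow,
# tying together booster/, zero/, and checkpoint_io/. Assumes a recent
# Colossal-AI release; signatures may differ between versions.
import torch

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import LowLevelZeroPlugin

# Reads rank/world size from the environment set up by torchrun.
colossalai.launch_from_torch()

model = torch.nn.Linear(16, 16)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

# ZeRO stage-2 sharding lives under zero/; the plugin wires it into the booster.
plugin = LowLevelZeroPlugin(stage=2)
booster = Booster(plugin=plugin)

# boost() returns (model, optimizer, criterion, dataloader, lr_scheduler);
# only the first two are needed here.
model, optimizer, *_ = booster.boost(model, optimizer)

# Checkpointing is delegated to checkpoint_io/; shard=True writes one file
# per shard plus an index instead of a single monolithic state dict.
booster.save_model(model, "ckpt", shard=True)
```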