Making large AI models cheaper, faster and more accessible
Latest commit: cf519dac6a by Hongxin Liu, "[optim] hotfix adam load (#6146)", 2 days ago
| Directory / File | Last commit | Updated |
| --- | --- | --- |
| _C | Clean up | 6 months ago |
| _analyzer | [test] Fix/fix testcase (#5770) | 6 months ago |
| accelerator | [misc] fit torch api upgradation and remove legecy import (#6093) | 1 month ago |
| amp | [Zerobubble] merge main. (#6142) | 2 days ago |
| auto_parallel | [pre-commit.ci] pre-commit autoupdate (#5572) | 5 months ago |
| autochunk | [hotfix] Fix examples no pad token & auto parallel codegen bug; (#5606) | 7 months ago |
| booster | [optim] hotfix adam load (#6146) | 2 days ago |
| checkpoint_io | [async io] supoort async io (#6137) | 3 days ago |
| cli | [cli] support run as module option (#6135) | 1 week ago |
| cluster | Revert "[moe] implement submesh initialization" | 4 months ago |
| context | [Fix]: implement thread-safety singleton to avoid deadlock for very large-scale training scenarios (#5625) | 7 months ago |
| device | [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694) | 6 months ago |
| fx | [test] Fix/fix testcase (#5770) | 6 months ago |
| inference | [shardformer] fix linear 1d row and support uneven splits for fused qkv linear (#6084) | 1 month ago |
| interface | [Zerobubble] merge main. (#6142) | 2 days ago |
| kernel | [misc] fit torch api upgradation and remove legecy import (#6093) | 1 month ago |
| lazy | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 3 months ago |
| legacy | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 3 months ago |
| logging | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 3 months ago |
| moe | [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048) | 2 months ago |
| nn | [optim] hotfix adam load (#6146) | 2 days ago |
| pipeline | [Zerobubble] merge main. (#6142) | 2 days ago |
| quantization | [fp8] add fallback and make compile option configurable (#6092) | 1 month ago |
| shardformer | [Zerobubble] merge main. (#6142) | 2 days ago |
| tensor | [fp8] support fp8 amp for hybrid parallel plugin (#5975) | 4 months ago |
| testing | [optim] hotfix adam load (#6146) | 2 days ago |
| utils | [optim] hotfix adam load (#6146) | 2 days ago |
| zero | [Zerobubble] merge main. (#6142) | 2 days ago |
| __init__.py | [devops] remove post commit ci (#5566) | 8 months ago |
| initialize.py | [fp8] hotfix backward hook (#6053) | 2 months ago |