Making large AI models cheaper, faster and more accessible
duanjunwen 41fdd2139b [fix] rm unused comments 4 days ago
_C Clean up 6 months ago
_analyzer [test] Fix/fix testcase (#5770) 6 months ago
accelerator [misc] fit torch api upgradation and remove legecy import (#6093) 1 month ago
amp [feat] zerobubble support moehybridplugin; 2 months ago
auto_parallel [pre-commit.ci] pre-commit autoupdate (#5572) 5 months ago
autochunk
booster [fix] fix hybridparall use_fp8 config 3 weeks ago
checkpoint_io [checkpointio] fix hybrid plugin model save (#6106) 3 weeks ago
cli
cluster [FP8] rebase main (#5963) 4 months ago
context
device
fx [test] Fix/fix testcase (#5770) 6 months ago
inference [shardformer] fix linear 1d row and support uneven splits for fused qkv linear (#6084) 1 month ago
interface [feat] update optimizer bwd 2 months ago
kernel [misc] fit torch api upgradation and remove legecy import (#6093) 1 month ago
lazy [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) 3 months ago
legacy [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) 3 months ago
logging [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) 3 months ago
moe [zerobubble] rebase main (#6075) 1 month ago
nn [misc] fix dist logger (#5782) 6 months ago
pipeline [fix] rm unused comments 4 days ago
quantization [fp8] add fallback and make compile option configurable (#6092) 1 month ago
shardformer [fix] fix test_lora in llama policy 7 days ago
tensor [zerobubble] rebase main (#6075) 1 month ago
testing [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) 3 months ago
utils [checkpointio] fix hybrid plugin model save (#6106) 3 weeks ago
zero Merge branch 'main' into dev/zero_bubble 3 weeks ago
__init__.py
initialize.py [zerobubble] rebase main (#6075) 1 month ago