ColossalAI/colossalai
Latest commit eb69e640e5 by flybird11111: [async io]supoort async io (#6137), 6 days ago
Name            Last commit                                                                              Age
_C
_analyzer
accelerator     [misc] fit torch api upgradation and remove legecy import (#6093)                        1 month ago
amp             [plugin] support get_grad_norm (#6115)                                                   3 weeks ago
auto_parallel
autochunk
booster         [async io]supoort async io (#6137)                                                       6 days ago
checkpoint_io   [async io]supoort async io (#6137)                                                       6 days ago
cli             [cli] support run as module option (#6135)                                               2 weeks ago
cluster
context
device
fx
inference       [shardformer] fix linear 1d row and support uneven splits for fused qkv linear (#6084)   2 months ago
interface       [plugin] support get_grad_norm (#6115)                                                   3 weeks ago
kernel          [misc] fit torch api upgradation and remove legecy import (#6093)                        1 month ago
lazy
legacy
logging
moe             [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048)                        3 months ago
nn
pipeline        [misc] fit torch api upgradation and remove legecy import (#6093)                        1 month ago
quantization    [fp8] add fallback and make compile option configurable (#6092)                          1 month ago
shardformer     [hotfix] fix flash attn window_size err (#6132)                                          2 weeks ago
tensor
testing         [async io]supoort async io (#6137)                                                       6 days ago
utils           [async io]supoort async io (#6137)                                                       6 days ago
zero            [async io]supoort async io (#6137)                                                       6 days ago
__init__.py
initialize.py   [fp8] hotfix backward hook (#6053)                                                       3 months ago