Directory listing of the ColossalAI source tree (`colossalai/`), with the last commit that touched each entry:

| Name | Last commit message | Last commit date |
|------|---------------------|------------------|
| `_C` | Clean up | 2024-06-07 09:09:29 +00:00 |
| `_analyzer` | [test] Fix/fix testcase (#5770) | 2024-06-03 15:26:01 +08:00 |
| `accelerator` | [misc] fit torch api upgradation and remove legecy import (#6093) | 2024-10-18 16:48:52 +08:00 |
| `amp` | [plugin] support get_grad_norm (#6115) | 2024-11-05 18:12:47 +08:00 |
| `auto_parallel` | [pre-commit.ci] pre-commit autoupdate (#5572) | 2024-07-01 17:16:41 +08:00 |
| `autochunk` | [hotfix] Fix examples no pad token & auto parallel codegen bug; (#5606) | 2024-04-18 18:15:50 +08:00 |
| `booster` | [checkpointio] support async model save (#6131) | 2024-11-19 14:51:39 +08:00 |
| `checkpoint_io` | [checkpointio] support async model save (#6131) | 2024-11-19 14:51:39 +08:00 |
| `cli` | [cli] support run as module option (#6135) | 2024-11-14 18:10:37 +08:00 |
| `cluster` | [FP8] rebase main (#5963) | 2024-08-06 16:29:37 +08:00 |
| `context` | [Fix]: implement thread-safety singleton to avoid deadlock for very large-scale training scenarios (#5625) | 2024-04-25 14:45:52 +08:00 |
| `device` | [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694) | 2024-05-14 13:52:45 +08:00 |
| `fx` | [test] Fix/fix testcase (#5770) | 2024-06-03 15:26:01 +08:00 |
| `inference` | [shardformer] fix linear 1d row and support uneven splits for fused qkv linear (#6084) | 2024-10-10 14:34:45 +08:00 |
| `interface` | [plugin] support get_grad_norm (#6115) | 2024-11-05 18:12:47 +08:00 |
| `kernel` | [misc] fit torch api upgradation and remove legecy import (#6093) | 2024-10-18 16:48:52 +08:00 |
| `lazy` | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 2024-08-22 09:21:34 +08:00 |
| `legacy` | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 2024-08-22 09:21:34 +08:00 |
| `logging` | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 2024-08-22 09:21:34 +08:00 |
| `moe` | [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048) | 2024-09-10 17:30:53 +08:00 |
| `nn` | [misc] fix dist logger (#5782) | 2024-06-05 15:04:22 +08:00 |
| `pipeline` | [misc] fit torch api upgradation and remove legecy import (#6093) | 2024-10-18 16:48:52 +08:00 |
| `quantization` | [fp8] add fallback and make compile option configurable (#6092) | 2024-10-18 13:55:31 +08:00 |
| `shardformer` | [hotfix] fix flash attn window_size err (#6132) | 2024-11-14 17:11:35 +08:00 |
| `tensor` | [fp8] support fp8 amp for hybrid parallel plugin (#5975) | 2024-08-07 18:21:08 +08:00 |
| `testing` | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 2024-08-22 09:21:34 +08:00 |
| `utils` | [checkpointio] support async model save (#6131) | 2024-11-19 14:51:39 +08:00 |
| `zero` | [fix] multi-node backward slowdown (#6134) | 2024-11-14 17:45:49 +08:00 |
| `__init__.py` | [devops] remove post commit ci (#5566) | 2024-04-08 15:09:40 +08:00 |
| `initialize.py` | [fp8] hotfix backward hook (#6053) | 2024-09-11 16:11:25 +08:00 |
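For orientation only, the sketch below shows how a few of the listed pieces (`initialize.py`, `booster`, `booster/plugin`, and, indirectly, `checkpoint_io`) are typically wired together in a training script. It is a minimal sketch assuming a recent ColossalAI release run under `torchrun` on a CUDA device; exact signatures and plugin names may differ between versions, so treat it as an illustration rather than the canonical API.

```python
# Hedged sketch of a ColossalAI training loop. Assumes a recent release where
# colossalai.launch_from_torch() takes no config argument; details may vary by version.
import torch
import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin


def main():
    # Set up the distributed environment from torchrun's environment variables
    # (this is what initialize.py provides).
    colossalai.launch_from_torch()

    model = torch.nn.Linear(16, 4)
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
    criterion = torch.nn.MSELoss()

    # A plugin from booster/plugin decides the parallelism strategy; plain DDP here.
    booster = Booster(plugin=TorchDDPPlugin())
    model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

    x = torch.randn(8, 16).cuda()
    y = torch.randn(8, 4).cuda()
    loss = criterion(model(x), y)
    booster.backward(loss, optimizer)  # backward is routed through the booster
    optimizer.step()

    # Model checkpointing goes through checkpoint_io under the hood.
    booster.save_model(model, "model.pt")


if __name__ == "__main__":
    main()
```

Such a script would normally be launched with `torchrun --nproc_per_node=1 train.py` or through the `cli` entry point (`colossalai run`); swapping the plugin (e.g. for a ZeRO or hybrid-parallel plugin) is how the other parallelism backends in this tree are selected.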