Commit Graph

204 Commits (e0c68ab6d3d64f401208d6ec66815995cee233c3)

Author | SHA1 | Message | Date
duanjunwen | e0c68ab6d3 | [Zerobubble] merge main. (#6142) | 1 week ago
flybird11111 | eb69e640e5 | [async io]supoort async io (#6137) | 1 week ago
Wang Binluo | 8e08c27e19 | [ckpt] Add async ckpt api (#6136) | 1 week ago
Hongxin Liu | d4a436051d | [checkpointio] support async model save (#6131) | 1 week ago
Hongxin Liu | a2596519fd | [zero] support extra dp (#6123) | 2 weeks ago
Hongxin Liu | a15ab139ad | [plugin] support get_grad_norm (#6115) | 3 weeks ago
BurkeHulk | 6d6cafabe2 | pre-commit fix | 1 month ago
BurkeHulk | b10339df7c | fix lora ckpt save format (ColoTensor to Tensor) | 1 month ago
Wang Binluo | dcd41d0973 | Merge pull request #6071 from wangbluo/ring_attention | 1 month ago
wangbluo | 6be9862aaf | fix | 1 month ago
wangbluo | 3dc08c8a5a | fix | 1 month ago
Hongxin Liu | dc2cdaf3e8 | [shardformer] optimize seq parallelism (#6086) | 2 months ago
Guangyao Zhang | bdb125f83f | [doc] FP8 training and communication document (#6050) | 2 months ago
Hongxin Liu | 13946c4448 | [fp8] hotfix backward hook (#6053) | 3 months ago
botbw | c54c4fcd15 | [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048) | 3 months ago
Hongxin Liu | 26e553937b | [fp8] fix linear hook (#6046) | 3 months ago
Hongxin Liu | cc1b0efc17 | [plugin] hotfix zero plugin (#6036) | 3 months ago
pre-commit-ci[bot] | 80d24ae519 | [pre-commit.ci] auto fixes from pre-commit.com hooks | 3 months ago
Hongxin Liu | caab4a307f | Merge branch 'main' into feature/fp8_comm | 3 months ago
pre-commit-ci[bot] | a292554179 | [pre-commit.ci] auto fixes from pre-commit.com hooks | 3 months ago
wangbluo | 971b16a74f | fix | 3 months ago
Wang Binluo | eea37da6fa | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 3 months ago
Hongxin Liu | 0d3b0bd864 | [plugin] add cast inputs option for zero (#6003) (#6022) | 3 months ago
Edenzzzz | dcc44aab8d | [misc] Use dist logger in plugins (#6011) | 3 months ago
Edenzzzz | f5c84af0b0 | [Feature] Zigzag Ring attention (#5905) | 3 months ago
flybird11111 | 0a51319113 | [fp8] zero support fp8 linear. (#6006) | 3 months ago
Wang Binluo | 3f09a6145f | [fp8] add use_fp8 option for MoeHybridParallelPlugin (#6009) | 3 months ago
Hongxin Liu | 406f984063 | [plugin] add cast inputs option for zero (#6003) | 3 months ago
Tong Li | ceb1e262e7 | fix sync condition (#6000) | 4 months ago
flybird11111 | f1a3a326c4 | [fp8]Moe support fp8 communication (#5977) | 4 months ago
Hongxin Liu | 8241c0c054 | [fp8] support gemini plugin (#5978) | 4 months ago
Hanks | b480eec738 | [Feature]: support FP8 communication in DDP, FSDP, Gemini (#5928) | 4 months ago
Hongxin Liu | ccabcf6485 | [fp8] support fp8 amp for hybrid parallel plugin (#5975) | 4 months ago
flybird11111 | 0c10afd372 | [FP8] rebase main (#5963) | 4 months ago
ver217 | ae486ce005 | [fp8] add fp8 comm for low level zero | 4 months ago
Wang Binluo | 75c963686f | [lora] lora support hybrid parallel plugin (#5956) | 4 months ago
botbw | d1d1ab871e | [moe] solve dp axis issue | 4 months ago
botbw | 65daa87627 | [doc] add MoeHybridParallelPlugin docstring | 4 months ago
hxwang | 7bedd03739 | [moe] remove force_overlap_comm flag and add warning instead | 4 months ago
hxwang | f7c5485ed6 | [chore] docstring | 4 months ago
haze188 | 70793ce9ed | [misc] fix ci failure: change default value to false in moe plugin | 4 months ago
hxwang | 606b0891ed | [chore] change moe_pg_mesh to private | 4 months ago
hxwang | cb01c0d5ce | [moe] refactor mesh assignment | 4 months ago
hxwang | 6c39f0b144 | [test] add check | 4 months ago
botbw | 96d0fbc531 | [bug] fix: somehow logger hangs the program | 4 months ago
hxwang | 70c9924d0d | [chore] solve moe ckpt test failure and some other arg pass failure | 4 months ago
hxwang | 46037c2ccd | [chore] minor fix after rebase | 4 months ago
hxwang | 803878b2fd | [moe] full test for deepseek and mixtral (pp + sp to fix) | 4 months ago
haze188 | 2cddeac717 | moe sp + ep bug fix | 4 months ago