Commit Graph

2157 Commits (dabc2e7430bb5d6bad1fed7629b534c10ae8c609)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| botbw | 4fa6b9509c | [moe] add parallel strategy for shared_expert && fix test for deepseek (#6063) | 2 months ago |
| wangbluo | 10e4f7da72 | fix | 2 months ago |
| Wang Binluo | 37e35230ff | Merge pull request #6061 from wangbluo/sp_fix | 3 months ago |
| wangbluo | 827ef3ee9a | fix | 3 months ago |
| Guangyao Zhang | bdb125f83f | [doc] FP8 training and communication document (#6050) | 3 months ago |
| Guangyao Zhang | f20b066c59 | [fp8] Disable all_gather intranode. Disable Redundant all_gather fp8 (#6059) | 3 months ago |
| wangbluo | b582319273 | fix | 3 months ago |
| wangbluo | 0ad3129cb9 | fix | 3 months ago |
| wangbluo | 0b14a5512e | fix | 3 months ago |
| botbw | 696fced0d7 | [fp8] fix missing fp8_comm flag in mixtral (#6057) | 3 months ago |
| wangbluo | dc032172c3 | fix | 3 months ago |
| wangbluo | f393867cff | fix | 3 months ago |
| wangbluo | 6eb8832366 | fix | 3 months ago |
| wangbluo | 683179cefd | fix | 3 months ago |
| wangbluo | 0a01e2a453 | fix the attn | 3 months ago |
| pre-commit-ci[bot] | 216d54e374 | [pre-commit.ci] auto fixes from pre-commit.com hooks | 3 months ago |
| wangbluo | fdd84b9087 | fix the sp | 3 months ago |
| Hongxin Liu | 13946c4448 | [fp8] hotfix backward hook (#6053) | 3 months ago |
| botbw | c54c4fcd15 | [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048) | 3 months ago |
| Wenxuan Tan | 8fd25d6e09 | [Feature] Split cross-entropy computation in SP (#5959) | 3 months ago |
| Hanks | 5ce6dd75bf | [fp8] disable all_to_all_fp8 in intranode (#6045) | 3 months ago |
| Hongxin Liu | 26e553937b | [fp8] fix linear hook (#6046) | 3 months ago |
| Hongxin Liu | c3b5caff0e | [fp8] optimize all-gather (#6043) | 3 months ago |
| Gao, Ruiyuan | e9032fb0b2 | [colossalai/checkpoint_io/...] fix bug in load_state_dict_into_model; format error msg (#6020) | 3 months ago |
| Guangyao Zhang | e96a0761ea | [FP8] unsqueeze scale to make it compatible with torch.compile (#6040) | 3 months ago |
| Hongxin Liu | cc1b0efc17 | [plugin] hotfix zero plugin (#6036) | 3 months ago |
| Hongxin Liu | 17904cb5bf | Merge pull request #6012 from hpcaitech/feature/fp8_comm | 3 months ago |
| pre-commit-ci[bot] | 80d24ae519 | [pre-commit.ci] auto fixes from pre-commit.com hooks | 3 months ago |
| wangbluo | dae39999d7 | fix | 3 months ago |
| Wenxuan Tan | 7cf9df07bc | [Hotfix] Fix llama fwd replacement bug (#6031) | 3 months ago |
| Hongxin Liu | caab4a307f | Merge branch 'main' into feature/fp8_comm | 3 months ago |
| pre-commit-ci[bot] | a292554179 | [pre-commit.ci] auto fixes from pre-commit.com hooks | 3 months ago |
| wangbluo | 971b16a74f | fix | 3 months ago |
| Wang Binluo | eea37da6fa | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 3 months ago |
| Hongxin Liu | 0d3b0bd864 | [plugin] add cast inputs option for zero (#6003) (#6022) | 3 months ago |
| Edenzzzz | dcc44aab8d | [misc] Use dist logger in plugins (#6011) | 3 months ago |
| Edenzzzz | f1c3266a94 | overlap kv comm with output rescale (#6017) | 3 months ago |
| Hongxin Liu | 26493b97d3 | [misc] update compatibility (#6008) | 3 months ago |
| Edenzzzz | f5c84af0b0 | [Feature] Zigzag Ring attention (#5905) | 3 months ago |
| flybird11111 | 0a51319113 | [fp8] zero support fp8 linear. (#6006) | 3 months ago |
| Wang Binluo | 3f09a6145f | [fp8] add use_fp8 option for MoeHybridParallelPlugin (#6009) | 3 months ago |
| Haze188 | 887d2d579b | [misc] Bypass the huggingface bug to solve the mask mismatch problem (#5991) | 4 months ago |
| botbw | 1a2e90dcc1 | [fp8] linear perf enhancement | 4 months ago |
| Hongxin Liu | 406f984063 | [plugin] add cast inputs option for zero (#6003) | 4 months ago |
| botbw | 88fa096d78 | [fp8] update torch.compile for linear_fp8 to >= 2.4.0 (#6004) | 4 months ago |
| flybird11111 | 597b206001 | [fp8] support asynchronous FP8 communication (#5997) | 4 months ago |
| Tong Li | ceb1e262e7 | fix sync condition (#6000) | 4 months ago |
| Hongxin Liu | 0978080a69 | [fp8] refactor fp8 linear with compile (#5993) | 4 months ago |
| Wang Binluo | b2483c8e31 | [fp8] support hybrid parallel plugin (#5982) | 4 months ago |
| flybird11111 | f1a3a326c4 | [fp8]Moe support fp8 communication (#5977) | 4 months ago |