Commit Graph

2193 Commits (89a9a600bc4802c912b0ed48d48f70bbcdd8142b)

Author | SHA1 | Message | Date
wangbluo | 0a01e2a453 | fix the attn | 3 months ago
pre-commit-ci[bot] | 216d54e374 | [pre-commit.ci] auto fixes from pre-commit.com hooks | 3 months ago
wangbluo | fdd84b9087 | fix the sp | 3 months ago
Hongxin Liu | 13946c4448 | [fp8] hotfix backward hook (#6053) | 3 months ago
botbw | c54c4fcd15 | [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048) | 3 months ago
Wenxuan Tan | 8fd25d6e09 | [Feature] Split cross-entropy computation in SP (#5959) | 3 months ago
Hanks | 5ce6dd75bf | [fp8] disable all_to_all_fp8 in intranode (#6045) | 3 months ago
Hongxin Liu | 26e553937b | [fp8] fix linear hook (#6046) | 3 months ago
Hongxin Liu | c3b5caff0e | [fp8] optimize all-gather (#6043) | 3 months ago
Gao, Ruiyuan | e9032fb0b2 | [colossalai/checkpoint_io/...] fix bug in load_state_dict_into_model; format error msg (#6020) | 3 months ago
Guangyao Zhang | e96a0761ea | [FP8] unsqueeze scale to make it compatible with torch.compile (#6040) | 3 months ago
Hongxin Liu | cc1b0efc17 | [plugin] hotfix zero plugin (#6036) | 3 months ago
Hongxin Liu | 17904cb5bf | Merge pull request #6012 from hpcaitech/feature/fp8_comm | 3 months ago
pre-commit-ci[bot] | 80d24ae519 | [pre-commit.ci] auto fixes from pre-commit.com hooks | 3 months ago
wangbluo | dae39999d7 | fix | 3 months ago
Wenxuan Tan | 7cf9df07bc | [Hotfix] Fix llama fwd replacement bug (#6031) | 3 months ago
Hongxin Liu | caab4a307f | Merge branch 'main' into feature/fp8_comm | 3 months ago
pre-commit-ci[bot] | a292554179 | [pre-commit.ci] auto fixes from pre-commit.com hooks | 3 months ago
wangbluo | 971b16a74f | fix | 3 months ago
Wang Binluo | eea37da6fa | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 3 months ago
Hongxin Liu | 0d3b0bd864 | [plugin] add cast inputs option for zero (#6003) (#6022) | 3 months ago
Edenzzzz | dcc44aab8d | [misc] Use dist logger in plugins (#6011) | 3 months ago
Edenzzzz | f1c3266a94 | overlap kv comm with output rescale (#6017) | 3 months ago
Hongxin Liu | 26493b97d3 | [misc] update compatibility (#6008) | 3 months ago
Edenzzzz | f5c84af0b0 | [Feature] Zigzag Ring attention (#5905) | 4 months ago
flybird11111 | 0a51319113 | [fp8] zero support fp8 linear. (#6006) | 4 months ago
Wang Binluo | 3f09a6145f | [fp8] add use_fp8 option for MoeHybridParallelPlugin (#6009) | 4 months ago
Haze188 | 887d2d579b | [misc] Bypass the huggingface bug to solve the mask mismatch problem (#5991) | 4 months ago
botbw | 1a2e90dcc1 | [fp8] linear perf enhancement | 4 months ago
Hongxin Liu | 406f984063 | [plugin] add cast inputs option for zero (#6003) | 4 months ago
botbw | 88fa096d78 | [fp8] update torch.compile for linear_fp8 to >= 2.4.0 (#6004) | 4 months ago
flybird11111 | 597b206001 | [fp8] support asynchronous FP8 communication (#5997) | 4 months ago
Tong Li | ceb1e262e7 | fix sync condition (#6000) | 4 months ago
Hongxin Liu | 0978080a69 | [fp8] refactor fp8 linear with compile (#5993) | 4 months ago
Wang Binluo | b2483c8e31 | [fp8] support hybrid parallel plugin (#5982) | 4 months ago
flybird11111 | f1a3a326c4 | [fp8]Moe support fp8 communication (#5977) | 4 months ago
Edenzzzz | b4d2377d4c | [Hotfix] Avoid fused RMSnorm import error without apex (#5985) | 4 months ago
botbw | e4aadeee20 | [fp8] use torch compile (torch >= 2.3.0) (#5979) | 4 months ago
Hongxin Liu | 8241c0c054 | [fp8] support gemini plugin (#5978) | 4 months ago
Hanks | b480eec738 | [Feature]: support FP8 communication in DDP, FSDP, Gemini (#5928) | 4 months ago
flybird11111 | 7739629b9d | fix (#5976) | 4 months ago
Hongxin Liu | ccabcf6485 | [fp8] support fp8 amp for hybrid parallel plugin (#5975) | 4 months ago
Hongxin Liu | 76ea16466f | [fp8] add fp8 linear (#5967) | 4 months ago
flybird11111 | afb26de873 | [fp8]support all2all fp8 (#5953) | 4 months ago
flybird11111 | 0c10afd372 | [FP8] rebase main (#5963) | 4 months ago
Guangyao Zhang | 53cb9606bd | [Feature] llama shardformer fp8 support (#5938) | 4 months ago
ver217 | ae486ce005 | [fp8] add fp8 comm for low level zero | 4 months ago
Wang Binluo | 75c963686f | [lora] lora support hybrid parallel plugin (#5956) | 4 months ago
botbw | 62cdac6b7b | [chore] remove redundant test case, print string & reduce test tokens | 4 months ago
botbw | d1d1ab871e | [moe] solve dp axis issue | 4 months ago
botbw | 65daa87627 | [doc] add MoeHybridParallelPlugin docstring | 4 months ago
hxwang | 7bedd03739 | [moe] remove force_overlap_comm flag and add warning instead | 4 months ago
hxwang | f7c5485ed6 | [chore] docstring | 4 months ago
haze188 | 7e737df5ad | [misc] remove useless condition | 4 months ago
haze188 | 70793ce9ed | [misc] fix ci failure: change default value to false in moe plugin | 4 months ago
hxwang | 606b0891ed | [chore] change moe_pg_mesh to private | 4 months ago
hxwang | 5b4c12381b | Revert "[moe] implement submesh initialization" | 4 months ago
hxwang | cb01c0d5ce | [moe] refactor mesh assignment | 4 months ago
haze188 | 034020bd04 | [misc] remove debug/print code | 4 months ago
hxwang | c3dc9b4dba | [deepseek] replace attn (a workaround for bug in transformers) | 4 months ago
hxwang | 6c39f0b144 | [test] add check | 4 months ago
haze188 | b2952a5982 | [moe] deepseek moe sp support | 4 months ago
botbw | 96d0fbc531 | [bug] fix: somehow logger hangs the program | 4 months ago
hxwang | 067e18f7e9 | [test] fix test: test_zero1_2 | 4 months ago
hxwang | 74b03de3f9 | [moe] remove ops | 4 months ago
hxwang | 70c9924d0d | [chore] solve moe ckpt test failure and some other arg pass failure | 4 months ago
hxwang | 46037c2ccd | [chore] minor fix after rebase | 4 months ago
hxwang | 803878b2fd | [moe] full test for deepseek and mixtral (pp + sp to fix) | 4 months ago
hxwang | 7077d38d5a | [moe] finalize test (no pp) | 4 months ago
haze188 | 2cddeac717 | moe sp + ep bug fix | 4 months ago
hxwang | 877d94bb8c | [moe] init moe plugin comm setting with sp | 4 months ago
hxwang | 09d6280d3e | [chore] minor fix | 4 months ago
Haze188 | 404b16faf3 | [Feature] MoE Ulysses Support (#5918) | 4 months ago
hxwang | 3e2b6132b7 | [moe] clean legacy code | 4 months ago
hxwang | 74eccac0db | [moe] test deepseek | 4 months ago
botbw | dc583aa576 | [moe] implement tp | 4 months ago
hxwang | 102b784a10 | [chore] arg pass & remove drop token | 4 months ago
botbw | 8dbb86899d | [chore] trivial fix | 4 months ago
botbw | 014faf6c5a | [chore] manually revert unintended commit | 4 months ago
botbw | 9b9b76bdcd | [moe] add mixtral dp grad scaling when not all experts are activated | 4 months ago
botbw | e28e05345b | [moe] implement submesh initialization | 4 months ago
haze188 | 5ed5e8cfba | solve hang when parallel mode = pp + dp | 4 months ago
botbw | 13b48ac0aa | [zero] solve hang | 4 months ago
botbw | b5bfeb2efd | [moe] implement transit between non moe tp and ep | 4 months ago
botbw | 37443cc7e4 | [test] pass mixtral shardformer test | 4 months ago
hxwang | 46c069b0db | [zero] solve hang | 4 months ago
hxwang | 0fad23c691 | [chore] handle non member group | 4 months ago
hxwang | a249e71946 | [test] mixtra pp shard test | 4 months ago
hxwang | 8ae8525bdf | [moe] fix plugin | 4 months ago
hxwang | 0b76b57cd6 | [test] add mixtral transformer test | 4 months ago
hxwang | f9b6fcf81f | [test] add mixtral for sequence classification | 4 months ago
Hongxin Liu | 060892162a | [zero] hotfix update master params (#5951) | 4 months ago
Runyu Lu | bcf0181ecd | [Feat] Distrifusion Acceleration Support for Diffusion Inference (#5895) | 4 months ago
Hongxin Liu | 7b38964e3a | [shardformer] hotfix attn mask (#5947) | 4 months ago
Hongxin Liu | 9664b1bc19 | [shardformer] hotfix attn mask (#5945) | 4 months ago
Edenzzzz | 2069472e96 | [Hotfix] Fix ZeRO typo #5936 | 4 months ago
Hongxin Liu | 5fd0592767 | [fp8] support all-gather flat tensor (#5932) | 4 months ago
Gao, Ruiyuan | 5fb958cc83 | [FIX BUG] convert env param to int in (#5934) | 4 months ago
Insu Jang | a521ffc9f8 | Add n_fused as an input from native_module (#5894) | 4 months ago
Hongxin Liu | e86127925a | [plugin] support all-gather overlap for hybrid parallel (#5919) | 4 months ago