Commit Graph

180 Commits (80a8ca916a740e913cbedf60caeadc0bab5cb4fa)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Wang Binluo | dcd41d0973 | Merge pull request #6071 from wangbluo/ring_attention | 1 month ago |
| wangbluo | fd92789af2 | fix | 1 month ago |
| wangbluo | 6be9862aaf | fix | 1 month ago |
| wangbluo | 3dc08c8a5a | fix | 1 month ago |
| wangbluo | 8ff7d0c780 | fix | 1 month ago |
| wangbluo | 3201377e94 | fix | 1 month ago |
| wangbluo | 23199e34cc | fix | 1 month ago |
| wangbluo | 703bb5c18d | fix the test | 2 months ago |
| wangbluo | 4e0e99bb6a | fix the test | 2 months ago |
| Hongxin Liu | dc2cdaf3e8 | [shardformer] optimize seq parallelism (#6086) | 2 months ago |
| Hongxin Liu | 646b3c5a90 | [shardformer] fix linear 1d row and support uneven splits for fused qkv linear (#6084) | 2 months ago |
| botbw | 4fa6b9509c | [moe] add parallel strategy for shared_expert && fix test for deepseek (#6063) | 2 months ago |
| botbw | c54c4fcd15 | [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048) | 3 months ago |
| Wenxuan Tan | 8fd25d6e09 | [Feature] Split cross-entropy computation in SP (#5959) | 3 months ago |
| Edenzzzz | f5c84af0b0 | [Feature] Zigzag Ring attention (#5905) | 3 months ago |
| botbw | 62cdac6b7b | [chore] remove redundant test case, print string & reduce test tokens | 4 months ago |
| haze188 | 70793ce9ed | [misc] fix ci failure: change default value to false in moe plugin | 4 months ago |
| haze188 | 12d043ca00 | [misc] remove incompatible test config | 4 months ago |
| hxwang | cb01c0d5ce | [moe] refactor mesh assignment | 4 months ago |
| haze188 | 59bcf56c60 | [misc] skip redunant test | 4 months ago |
| hxwang | c3dc9b4dba | [deepseek] replace attn (a workaround for bug in transformers) | 4 months ago |
| hxwang | 6c39f0b144 | [test] add check | 4 months ago |
| haze188 | b2952a5982 | [moe] deepseek moe sp support | 4 months ago |
| hxwang | 067e18f7e9 | [test] fix test: test_zero1_2 | 4 months ago |
| hxwang | 70c9924d0d | [chore] solve moe ckpt test failure and some other arg pass failure | 4 months ago |
| hxwang | 46037c2ccd | [chore] minor fix after rebase | 4 months ago |
| hxwang | 803878b2fd | [moe] full test for deepseek and mixtral (pp + sp to fix) | 4 months ago |
| haze188 | 2cddeac717 | moe sp + ep bug fix | 4 months ago |
| hxwang | 877d94bb8c | [moe] init moe plugin comm setting with sp | 4 months ago |
| hxwang | 09d6280d3e | [chore] minor fix | 4 months ago |
| Haze188 | 404b16faf3 | [Feature] MoE Ulysses Support (#5918) | 4 months ago |
| botbw | e28e05345b | [moe] implement submesh initialization | 4 months ago |
| haze188 | 5ed5e8cfba | solve hang when parallel mode = pp + dp | 4 months ago |
| botbw | 13b48ac0aa | [zero] solve hang | 4 months ago |
| botbw | b5bfeb2efd | [moe] implement transit between non moe tp and ep | 4 months ago |
| botbw | 37443cc7e4 | [test] pass mixtral shardformer test | 4 months ago |
| hxwang | 46c069b0db | [zero] solve hang | 4 months ago |
| hxwang | a249e71946 | [test] mixtra pp shard test | 4 months ago |
| hxwang | 0b76b57cd6 | [test] add mixtral transformer test | 4 months ago |
| Guangyao Zhang | 1c961b20f3 | [ShardFormer] fix qwen2 sp (#5903) | 4 months ago |
| Guangyao Zhang | 669849d74b | [ShardFormer] Add Ulysses Sequence Parallelism support for Command-R, Qwen2 and ChatGLM (#5897) | 5 months ago |
| Edenzzzz | fbf33ecd01 | [Feature] Enable PP + SP for llama (#5868) | 5 months ago |
| Haze188 | 416580b314 | [MoE/ZeRO] Moe refactor with zero refactor (#5821) | 5 months ago |
| Guangyao Zhang | d9d5e7ea1f | [shardformer] Support the T5ForTokenClassification model (#5816) | 5 months ago |
| Guangyao Zhang | fd1dc417d8 | [shardformer] Change atol in test command-r weight-check to pass pytest (#5835) | 5 months ago |
| GuangyaoZhang | fe2e74c03a | fix precommit | 5 months ago |
| GuangyaoZhang | 98da648a4a | Fix Code Factor check | 5 months ago |
| GuangyaoZhang | f656d61778 | change command | 5 months ago |
| Edenzzzz | 8795bb2e80 | Support 4d parallel + flash attention (#5789) | 5 months ago |
| Guangyao Zhang | aac941ef78 | [test] fix qwen2 pytest distLarge (#5797) | 6 months ago |