Commit Graph

165 Commits (887d2d579b522cadab12571f2357d9e2cbd23aed)

Author SHA1 Message Date
botbw 62cdac6b7b [chore] remove redundant test case, print string & reduce test tokens
4 months ago
haze188 70793ce9ed [misc] fix ci failure: change default value to false in moe plugin
4 months ago
haze188 12d043ca00 [misc] remove incompatible test config
4 months ago
hxwang cb01c0d5ce [moe] refactor mesh assignment
4 months ago
haze188 59bcf56c60 [misc] skip redundant test
4 months ago
hxwang c3dc9b4dba [deepseek] replace attn (a workaround for bug in transformers)
4 months ago
hxwang 6c39f0b144 [test] add check
4 months ago
haze188 b2952a5982 [moe] deepseek moe sp support
4 months ago
hxwang 067e18f7e9 [test] fix test: test_zero1_2
4 months ago
hxwang 70c9924d0d [chore] solve moe ckpt test failure and some other arg pass failure
4 months ago
hxwang 46037c2ccd [chore] minor fix after rebase
4 months ago
hxwang 803878b2fd [moe] full test for deepseek and mixtral (pp + sp to fix)
4 months ago
haze188 2cddeac717 moe sp + ep bug fix
4 months ago
hxwang 877d94bb8c [moe] init moe plugin comm setting with sp
4 months ago
hxwang 09d6280d3e [chore] minor fix
4 months ago
Haze188 404b16faf3 [Feature] MoE Ulysses Support (#5918)
4 months ago
botbw e28e05345b [moe] implement submesh initialization
4 months ago
haze188 5ed5e8cfba solve hang when parallel mode = pp + dp
4 months ago
botbw 13b48ac0aa [zero] solve hang
4 months ago
botbw b5bfeb2efd [moe] implement transit between non moe tp and ep
4 months ago
botbw 37443cc7e4 [test] pass mixtral shardformer test
4 months ago
hxwang 46c069b0db [zero] solve hang
4 months ago
hxwang a249e71946 [test] mixtral pp shard test
4 months ago
hxwang 0b76b57cd6 [test] add mixtral transformer test
4 months ago
Guangyao Zhang 1c961b20f3 [ShardFormer] fix qwen2 sp (#5903)
4 months ago
Guangyao Zhang 669849d74b [ShardFormer] Add Ulysses Sequence Parallelism support for Command-R, Qwen2 and ChatGLM (#5897)
5 months ago
Edenzzzz fbf33ecd01 [Feature] Enable PP + SP for llama (#5868)
5 months ago
Haze188 416580b314 [MoE/ZeRO] Moe refactor with zero refactor (#5821)
5 months ago
Guangyao Zhang d9d5e7ea1f [shardformer] Support the T5ForTokenClassification model (#5816)
5 months ago
Guangyao Zhang fd1dc417d8 [shardformer] Change atol in test command-r weight-check to pass pytest (#5835)
5 months ago
GuangyaoZhang fe2e74c03a fix precommit
5 months ago
GuangyaoZhang 98da648a4a Fix Code Factor check
5 months ago
GuangyaoZhang f656d61778 change command
5 months ago
Edenzzzz 8795bb2e80 Support 4d parallel + flash attention (#5789)
5 months ago
Guangyao Zhang aac941ef78 [test] fix qwen2 pytest distLarge (#5797)
6 months ago
Haze188 e22b82755d [CI/tests] simplify some test case to reduce testing time (#5755)
6 months ago
Hongxin Liu 68359ed1e1 [release] update version (#5752)
6 months ago
Edenzzzz 5f8c0a0ac3 [Feature] auto-cast optimizers to distributed version (#5746)
6 months ago
Edenzzzz 43995ee436 [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694)
7 months ago
Wang Binluo a3cc68ca93 [Shardformer] Support the Qwen2 model (#5699)
7 months ago
flybird11111 77ec773388 [zero] remove registered gradients hooks (#5687)
7 months ago
Hongxin Liu 7f8b16635b [misc] refactor launch API and tensor constructor (#5666)
7 months ago
Hongxin Liu 1b387ca9fe [shardformer] refactor pipeline grad ckpt config (#5646)
7 months ago
Hongxin Liu bbb2c21f16 [shardformer] fix chatglm implementation (#5644)
7 months ago
flybird11111 148506c828 [coloattention] modify coloattention (#5627)
7 months ago
Wang Binluo 0d0a582033 [shardformer] update transformers (#5583)
7 months ago
flybird11111 a0ad587c24 [shardformer] refactor embedding resize (#5603)
7 months ago
Zhongkai Zhao 8e412a548e [shardformer] Sequence Parallelism Optimization (#5533)
8 months ago
Wenhao Chen e614aa34f3 [shardformer, pipeline] add `gradient_checkpointing_ratio` and heterogeneous shard policy for llama (#5508)
8 months ago
Insu Jang 00525f7772 [shardformer] fix pipeline forward error if custom layer distribution is used (#5189)
8 months ago