1201 Commits (main)

Author SHA1 Message Date
hxwang 7077d38d5a [moe] finalize test (no pp) 4 months ago
haze188 2cddeac717 moe sp + ep bug fix 4 months ago
hxwang 877d94bb8c [moe] init moe plugin comm setting with sp 4 months ago
hxwang 09d6280d3e [chore] minor fix 4 months ago
Haze188 404b16faf3 [Feature] MoE Ulysses Support (#5918) 4 months ago
hxwang 3e2b6132b7 [moe] clean legacy code 4 months ago
hxwang 74eccac0db [moe] test deepseek 4 months ago
botbw dc583aa576 [moe] implement tp 4 months ago
botbw 0b5bbe9ce4 [test] add mixtral modelling test 4 months ago
hxwang 102b784a10 [chore] arg pass & remove drop token 4 months ago
botbw 9b9b76bdcd [moe] add mixtral dp grad scaling when not all experts are activated 4 months ago
botbw e28e05345b [moe] implement submesh initialization 4 months ago
haze188 5ed5e8cfba solve hang when parallel mode = pp + dp 4 months ago
haze188 fe24789eb1 [misc] solve booster hang by rename the variable 4 months ago
botbw 13b48ac0aa [zero] solve hang 4 months ago
botbw b5bfeb2efd [moe] implement transit between non moe tp and ep 4 months ago
botbw 37443cc7e4 [test] pass mixtral shardformer test 4 months ago
hxwang 46c069b0db [zero] solve hang 4 months ago
hxwang a249e71946 [test] mixtra pp shard test 4 months ago
hxwang 0b76b57cd6 [test] add mixtral transformer test 4 months ago
Hongxin Liu 5fd0592767 [fp8] support all-gather flat tensor (#5932) 4 months ago
GuangyaoZhang 5a310b9ee1 fix rebase 4 months ago
GuangyaoZhang 457a0de79f shardformer fp8 4 months ago
Guangyao Zhang 1c961b20f3 [ShardFormer] fix qwen2 sp (#5903) 4 months ago
Hongxin Liu c068ef0fa0 [zero] support all-gather overlap (#5898) 4 months ago
Guangyao Zhang 669849d74b [ShardFormer] Add Ulysses Sequence Parallelism support for Command-R, Qwen2 and ChatGLM (#5897) 4 months ago
Edenzzzz fbf33ecd01 [Feature] Enable PP + SP for llama (#5868) 5 months ago
Haze188 3420921101 [shardformer] DeepseekMoE support (#5871) 5 months ago
pre-commit-ci[bot] 7c2f79fa98 [pre-commit.ci] pre-commit autoupdate (#5572) 5 months ago
Haze188 416580b314 [MoE/ZeRO] Moe refactor with zero refactor (#5821) 5 months ago
Guangyao Zhang d9d5e7ea1f [shardformer] Support the T5ForTokenClassification model (#5816) 5 months ago
Edenzzzz 2a25a2aff7 [Feature] optimize PP overlap (#5735) 5 months ago
Guangyao Zhang fd1dc417d8 [shardformer] Change atol in test command-r weight-check to pass pytest (#5835) 5 months ago
GuangyaoZhang fe2e74c03a fix precommit 5 months ago
GuangyaoZhang 98da648a4a Fix Code Factor check 5 months ago
GuangyaoZhang f656d61778 change command 5 months ago
Edenzzzz 8795bb2e80 Support 4d parallel + flash attention (#5789) 5 months ago
GuangyaoZhang 9a290ab013 fix precommit 5 months ago
pre-commit-ci[bot] 2a7fa2e7d0 [pre-commit.ci] auto fixes from pre-commit.com hooks 5 months ago
GuangyaoZhang 1016bb3257 Fix Code Factor check 5 months ago
GuangyaoZhang 94fbde6055 change command 5 months ago
flybird11111 2ddf624a86 [shardformer] upgrade transformers to 4.39.3 (#5815) 5 months ago
Li Xingjian 8554585a5f [Inference] Fix flash-attn import and add model test (#5794) 5 months ago
Guangyao Zhang aac941ef78 [test] fix qwen2 pytest distLarge (#5797) 5 months ago
Hongxin Liu 587bbf4c6d [test] fix chatglm test kit (#5793) 5 months ago
char-1ee b303976a27 Fix test import 5 months ago
char-1ee 5f398fc000 Pass inference model shard configs for module init 6 months ago
duanjunwen 10a19e22c6 [hotfix] fix testcase in test_fx/test_tracer (#5779) 6 months ago
botbw 80c3c8789b [Test/CI] remove test cases to reduce CI duration (#5753) 6 months ago
Edenzzzz 79f7a7b211 [misc] Accelerate CI for zero and dist optim (#5758) 6 months ago