Commit Graph

3688 Commits (0ad3129cb95cb74f13dead3f8369837ea997b499)

Author SHA1 Message Date
Wang Binluo 75c963686f
[lora] lora support hybrid parallel plugin (#5956)
* lora support hybrid plugin

* fix

* fix

* fix

* fix
2024-08-02 10:36:58 +08:00
Tong Li 19d1510ea2
[feat] Dist Loader for Eval (#5950)
* support auto distributed data loader

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* support auto distributed data loader

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* fix tp error

* remove unused parameters

* remove unused

* update inference

* update docs

* update inference

---------

Co-authored-by: Michelle <qianranma8@gmail.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-02 10:06:25 +08:00
botbw 62cdac6b7b [chore] remove redundant test case, print string & reduce test tokens 2024-08-01 10:06:59 +08:00
botbw d1d1ab871e [moe] solve dp axis issue 2024-08-01 10:06:59 +08:00
botbw 65daa87627 [doc] add MoeHybridParallelPlugin docstring 2024-08-01 10:06:59 +08:00
hxwang 7bedd03739 [moe] remove force_overlap_comm flag and add warning instead 2024-08-01 10:06:59 +08:00
hxwang f7c5485ed6 [chore] docstring 2024-08-01 10:06:59 +08:00
haze188 7e737df5ad [misc] remove useless condition 2024-08-01 10:06:59 +08:00
haze188 70793ce9ed [misc] fix ci failure: change default value to false in moe plugin 2024-08-01 10:06:59 +08:00
haze188 12d043ca00 [misc] remove incompatible test config 2024-08-01 10:06:59 +08:00
hxwang 606b0891ed [chore] change moe_pg_mesh to private 2024-08-01 10:06:59 +08:00
hxwang 5b4c12381b Revert "[moe] implement submesh initialization"
This reverts commit 2f9bce6686.
2024-08-01 10:06:59 +08:00
hxwang cb01c0d5ce [moe] refactor mesh assignment 2024-08-01 10:06:59 +08:00
haze188 034020bd04 [misc] remove debug/print code 2024-08-01 10:06:59 +08:00
haze188 59bcf56c60 [misc] skip redundant test 2024-08-01 10:06:59 +08:00
hxwang c3dc9b4dba [deepseek] replace attn (a workaround for bug in transformers) 2024-08-01 10:06:59 +08:00
hxwang 6c39f0b144 [test] add check 2024-08-01 10:06:59 +08:00
haze188 b2952a5982 [moe] deepseek moe sp support 2024-08-01 10:06:59 +08:00
botbw 96d0fbc531 [bug] fix: somehow logger hangs the program 2024-08-01 10:06:59 +08:00
hxwang 067e18f7e9 [test] fix test: test_zero1_2 2024-08-01 10:06:59 +08:00
hxwang 74b03de3f9 [moe] remove ops 2024-08-01 10:06:59 +08:00
hxwang 70c9924d0d [chore] solve moe ckpt test failure and some other arg pass failure 2024-08-01 10:06:59 +08:00
pre-commit-ci[bot] 52d346f2a5 [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
2024-08-01 10:06:59 +08:00
hxwang 46037c2ccd [chore] minor fix after rebase 2024-08-01 10:06:59 +08:00
hxwang 803878b2fd [moe] full test for deepseek and mixtral (pp + sp to fix) 2024-08-01 10:06:59 +08:00
hxwang 7077d38d5a [moe] finalize test (no pp) 2024-08-01 10:06:59 +08:00
haze188 2cddeac717 moe sp + ep bug fix 2024-08-01 10:06:59 +08:00
hxwang 877d94bb8c [moe] init moe plugin comm setting with sp 2024-08-01 10:06:59 +08:00
hxwang 09d6280d3e [chore] minor fix 2024-08-01 10:06:59 +08:00
Haze188 404b16faf3 [Feature] MoE Ulysses Support (#5918)
* moe sp support

* moe sp bug solve

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-01 10:06:59 +08:00
hxwang 3e2b6132b7 [moe] clean legacy code 2024-08-01 10:06:59 +08:00
hxwang 74eccac0db [moe] test deepseek 2024-08-01 10:06:59 +08:00
botbw dc583aa576 [moe] implement tp 2024-08-01 10:06:59 +08:00
botbw 0b5bbe9ce4 [test] add mixtral modelling test 2024-08-01 10:06:59 +08:00
hxwang 102b784a10 [chore] arg pass & remove drop token 2024-08-01 10:06:59 +08:00
botbw 8dbb86899d [chore] trivial fix 2024-08-01 10:06:59 +08:00
botbw 014faf6c5a [chore] manually revert unintended commit 2024-08-01 10:06:59 +08:00
botbw 9b9b76bdcd [moe] add mixtral dp grad scaling when not all experts are activated 2024-08-01 10:06:59 +08:00
botbw e28e05345b [moe] implement submesh initialization 2024-08-01 10:06:59 +08:00
haze188 5ed5e8cfba solve hang when parallel mode = pp + dp 2024-08-01 10:06:59 +08:00
haze188 fe24789eb1 [misc] solve booster hang by renaming the variable 2024-08-01 10:06:59 +08:00
botbw 13b48ac0aa [zero] solve hang 2024-08-01 10:06:59 +08:00
botbw b5bfeb2efd [moe] implement transit between non moe tp and ep 2024-08-01 10:06:59 +08:00
botbw 37443cc7e4 [test] pass mixtral shardformer test 2024-08-01 10:06:59 +08:00
hxwang 46c069b0db [zero] solve hang 2024-08-01 10:06:59 +08:00
hxwang 0fad23c691 [chore] handle non member group 2024-08-01 10:06:59 +08:00
hxwang a249e71946 [test] mixtral pp shard test 2024-08-01 10:06:59 +08:00
hxwang 8ae8525bdf [moe] fix plugin 2024-08-01 10:06:59 +08:00
hxwang 0b76b57cd6 [test] add mixtral transformer test 2024-08-01 10:06:59 +08:00
hxwang f9b6fcf81f [test] add mixtral for sequence classification 2024-08-01 10:06:59 +08:00