64 Commits (main)

Author SHA1 Message Date
botbw c54c4fcd15 [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048) 2 months ago
flybird11111 0c10afd372 [FP8] rebase main (#5963) 4 months ago
hxwang cb01c0d5ce [moe] refactor mesh assignment 4 months ago
hxwang 70c9924d0d [chore] solve moe ckpt test failure and some other arg pass failure 4 months ago
pre-commit-ci[bot] 52d346f2a5 [pre-commit.ci] auto fixes from pre-commit.com hooks 4 months ago
hxwang 803878b2fd [moe] full test for deepseek and mixtral (pp + sp to fix) 4 months ago
hxwang 7077d38d5a [moe] finalize test (no pp) 4 months ago
hxwang 877d94bb8c [moe] init moe plugin comm setting with sp 4 months ago
hxwang 3e2b6132b7 [moe] clean legacy code 4 months ago
hxwang 74eccac0db [moe] test deepseek 4 months ago
botbw dc583aa576 [moe] implement tp 4 months ago
botbw 0b5bbe9ce4 [test] add mixtral modelling test 4 months ago
hxwang 102b784a10 [chore] arg pass & remove drop token 4 months ago
botbw 9b9b76bdcd [moe] add mixtral dp grad scaling when not all experts are activated 4 months ago
haze188 fe24789eb1 [misc] solve booster hang by rename the variable 4 months ago
botbw 13b48ac0aa [zero] solve hang 4 months ago
hxwang 46c069b0db [zero] solve hang 4 months ago
hxwang 0b76b57cd6 [test] add mixtral transformer test 4 months ago
Haze188 3420921101 [shardformer] DeepseekMoE support (#5871) 5 months ago
Haze188 416580b314 [MoE/ZeRO] Moe refactor with zero refactor (#5821) 5 months ago
Hongxin Liu 7f8b16635b [misc] refactor launch API and tensor constructor (#5666) 7 months ago
Wenhao Chen bb0a668fee [hotfix] set return_outputs=False in examples and polish code (#5404) 8 months ago
ver217 06db94fbc9 [moe] fix tests 10 months ago
Xuanlei Zhao 7d8e0338a4 [moe] init mixtral impl 10 months ago
Hongxin Liu d202cc28c0 [npu] change device to accelerator api (#5239) 11 months ago
Wenhao Chen 3c08f17348 [hotfix]: modify create_ep_hierarchical_group and add test (#5032) 1 year ago
Wenhao Chen 724441279b [moe]: fix ep/tp tests, add hierarchical all2all (#4982) 1 year ago
Xuanlei Zhao f71e63b0f3 [moe] support optimizer checkpoint (#5015) 1 year ago
Xuanlei Zhao dc003c304c [moe] merge moe into main (#4978) 1 year ago
Hongxin Liu 079bf3cb26 [misc] update pre-commit and run all files (#4752) 1 year ago
Hongxin Liu b5f9e37c70 [legacy] clean up legacy code (#4743) 1 year ago
Hongxin Liu 8accecd55b [legacy] move engine to legacy (#4560) 1 year ago
digger-yu 1f73609adb [CI] fix typo with tests/ etc. (#3727) 2 years ago
Frank Lee 80eba05b0a [test] refactor tests with spawn (#3452) 2 years ago
ver217 933048ad3e [test] reorganize zero/gemini tests (#3445) 2 years ago
ver217 26b7aac0be [zero] reorganize zero/gemini folder structure (#3424) 2 years ago
HELSON 1a1d68b053 [moe] add checkpoint for moe models (#3354) 2 years ago
Jiarui Fang 1e885329f4 [test] align model name with the file name. (#2045) 2 years ago
HELSON 95c35f73bd [moe] initialize MoE groups by ProcessGroup (#1640) 2 years ago
HELSON a088022efc [moe] fix moe bugs (#1633) 2 years ago
HELSON f7f2248771 [moe] fix MoE bugs (#1628) 2 years ago
Frank Lee 5a1a095b92 [test] refactored with the new rerun decorator (#763) 3 years ago
ver217 e396bb71f2 [zero] add tensor placement policies (#743) 3 years ago
HELSON 22c4b88d56 [zero] refactor ShardedParamV2 for convenience (#742) 3 years ago
Jiarui Fang 53cb584808 [utils] correct cpu memory used and capacity in the context of multi-process (#726) 3 years ago
HELSON b9b469ea50 [moe] add checkpoint for moe zero test (#729) 3 years ago
Jiarui Fang 193dc8dacb [refactor] refactor the memory utils (#715) 3 years ago
HELSON a9b8300d54 [zero] improve adaptability for not-shard parameters (#708) 3 years ago
HELSON ee112fe1da [zero] adapt zero hooks for unsharded module (#699) 3 years ago
HELSON d7ecaf362b [zero] fix init bugs in zero context (#686) 3 years ago