Commit Graph

111 Commits (2d642eea0f92c7f7c1fb7bef3abdfdb0cb61d1bf)

Author              SHA1        Message (Date)
botbw               62cdac6b7b  [chore] remove redundant test case, print string & reduce test tokens (4 months ago)
hxwang              cb01c0d5ce  [moe] refactor mesh assignment (4 months ago)
haze188             034020bd04  [misc] remove debug/print code (4 months ago)
haze188             b2952a5982  [moe] deepseek moe sp support (4 months ago)
hxwang              70c9924d0d  [chore] solve moe ckpt test failure and some other arg pass failure (4 months ago)
hxwang              803878b2fd  [moe] full test for deepseek and mixtral (pp + sp to fix) (4 months ago)
hxwang              877d94bb8c  [moe] init moe plugin comm setting with sp (4 months ago)
Haze188             404b16faf3  [Feature] MoE Ulysses Support (#5918) (4 months ago)
hxwang              3e2b6132b7  [moe] clean legacy code (4 months ago)
hxwang              74eccac0db  [moe] test deepseek (4 months ago)
botbw               dc583aa576  [moe] implement tp (4 months ago)
hxwang              102b784a10  [chore] arg pass & remove drop token (4 months ago)
botbw               9b9b76bdcd  [moe] add mixtral dp grad scaling when not all experts are activated (4 months ago)
botbw               b5bfeb2efd  [moe] implement transit between non moe tp and ep (4 months ago)
hxwang              0b76b57cd6  [test] add mixtral transformer test (4 months ago)
Hongxin Liu         9664b1bc19  [shardformer] hotfix attn mask (#5945) (4 months ago)
Guangyao Zhang      1c961b20f3  [ShardFormer] fix qwen2 sp (#5903) (4 months ago)
Guangyao Zhang      669849d74b  [ShardFormer] Add Ulysses Sequence Parallelism support for Command-R, Qwen2 and ChatGLM (#5897) (5 months ago)
Edenzzzz            fbf33ecd01  [Feature] Enable PP + SP for llama (#5868) (5 months ago)
Haze188             3420921101  [shardformer] DeepseekMoE support (#5871) (5 months ago)
Edenzzzz            eb24fcd914  [Hotfix] Fix OPT gradient checkpointing forward (5 months ago)
pre-commit-ci[bot]  7c2f79fa98  [pre-commit.ci] pre-commit autoupdate (#5572) (5 months ago)
Jianghai            8ab46b4000  [Shardformer] change qwen2 modeling into gradient checkpointing style (#5874) (5 months ago)
Haze188             416580b314  [MoE/ZeRO] Moe refactor with zero refactor (#5821) (5 months ago)
flybird11111        773d9f964a  [shardformer]delete xformers (#5859) (5 months ago)
Guangyao Zhang      d9d5e7ea1f  [shardformer] Support the T5ForTokenClassification model (#5816) (5 months ago)
GuangyaoZhang       d84d68601a  change 'xxx if xxx else None' to 'xxx or None' (5 months ago)
GuangyaoZhang       a83a2336e8  rebase master llama change (5 months ago)
GuangyaoZhang       363cde6957  merge model and attention forward (5 months ago)
GuangyaoZhang       fe2e74c03a  fix precommit (5 months ago)
GuangyaoZhang       f656d61778  change command (5 months ago)
GuangyaoZhang       0b81163bc0  Copy llama to command (5 months ago)
Edenzzzz            8795bb2e80  Support 4d parallel + flash attention (#5789) (5 months ago)
flybird11111        2ddf624a86  [shardformer] upgrade transformers to 4.39.3 (#5815) (5 months ago)
Hongxin Liu         aa125bcc91  [shardformer] fix modeling of bloom and falcon (#5796) (6 months ago)
Hongxin Liu         73e88a5553  [shardformer] fix import (#5788) (6 months ago)
flybird11111        50b4c8e8cf  [hotfix] fix llama flash attention forward (#5777) (6 months ago)
Haze188             22ce873c3f  [Shardformer] Add parallel output for shardformer models(bloom, falcon) (#5702) (6 months ago)
Wang Binluo         537f6a3855  [Shardformer]fix the num_heads assert for llama model and qwen model (#5704) (7 months ago)
Wang Binluo         a3cc68ca93  [Shardformer] Support the Qwen2 model (#5699) (7 months ago)
wangbluo            4e50cce26b  fix the mistral model (7 months ago)
wangbluo            a8408b4d31  remove comment code (7 months ago)
pre-commit-ci[bot]  ca56b93d83  [pre-commit.ci] auto fixes from pre-commit.com hooks (7 months ago)
wangbluo            108ddfb795  add parallel_output for the opt model (7 months ago)
wangbluo            2632916329  remove useless code (7 months ago)
wangbluo            9efc79ef24  add parallel output for mistral model (7 months ago)
flybird11111        6af6d6fc9f  [shardformer] support bias_gelu_jit_fused for models (#5647) (7 months ago)
Hongxin Liu         1b387ca9fe  [shardformer] refactor pipeline grad ckpt config (#5646) (7 months ago)
Hongxin Liu         bbb2c21f16  [shardformer] fix chatglm implementation (#5644) (7 months ago)
flybird11111        5d88ef1aaf  [shardformer] remove useless code (#5645) (7 months ago)