268 Commits (f5c84af0b01bcd2e993d38dc628793f7f0a8ba64)

Author SHA1 Message Date
Edenzzzz f5c84af0b0 [Feature] Zigzag Ring attention (#5905) 3 months ago
Haze188 887d2d579b [misc] Bypass the huggingface bug to solve the mask mismatch problem (#5991) 3 months ago
Edenzzzz b4d2377d4c [Hotfix] Avoid fused RMSnorm import error without apex (#5985) 4 months ago
Wang Binluo 75c963686f [lora] lora support hybrid parallel plugin (#5956) 4 months ago
botbw 62cdac6b7b [chore] remove redundant test case, print string & reduce test tokens 4 months ago
haze188 7e737df5ad [misc] remove useless condition 4 months ago
haze188 70793ce9ed [misc] fix ci failure: change default value to false in moe plugin 4 months ago
hxwang cb01c0d5ce [moe] refactor mesh assignment 4 months ago
haze188 034020bd04 [misc] remove debug/print code 4 months ago
hxwang c3dc9b4dba [deepseek] replace attn (a workaround for bug in transformers) 4 months ago
haze188 b2952a5982 [moe] deepseek moe sp support 4 months ago
hxwang 70c9924d0d [chore] solve moe ckpt test failure and some other arg pass failure 4 months ago
hxwang 803878b2fd [moe] full test for deepseek and mixtral (pp + sp to fix) 4 months ago
hxwang 877d94bb8c [moe] init moe plugin comm setting with sp 4 months ago
hxwang 09d6280d3e [chore] minor fix 4 months ago
Haze188 404b16faf3 [Feature] MoE Ulysses Support (#5918) 4 months ago
hxwang 3e2b6132b7 [moe] clean legacy code 4 months ago
hxwang 74eccac0db [moe] test deepseek 4 months ago
botbw dc583aa576 [moe] implement tp 4 months ago
hxwang 102b784a10 [chore] arg pass & remove drop token 4 months ago
botbw 9b9b76bdcd [moe] add mixtral dp grad scaling when not all experts are activated 4 months ago
botbw b5bfeb2efd [moe] implement transit between non moe tp and ep 4 months ago
botbw 37443cc7e4 [test] pass mixtral shardformer test 4 months ago
hxwang 46c069b0db [zero] solve hang 4 months ago
hxwang 0b76b57cd6 [test] add mixtral transformer test 4 months ago
hxwang f9b6fcf81f [test] add mixtral for sequence classification 4 months ago
Hongxin Liu 7b38964e3a [shardformer] hotfix attn mask (#5947) 4 months ago
Hongxin Liu 9664b1bc19 [shardformer] hotfix attn mask (#5945) 4 months ago
Insu Jang a521ffc9f8 Add n_fused as an input from native_module (#5894) 4 months ago
Guangyao Zhang 1c961b20f3 [ShardFormer] fix qwen2 sp (#5903) 4 months ago
Guangyao Zhang 669849d74b [ShardFormer] Add Ulysses Sequence Parallelism support for Command-R, Qwen2 and ChatGLM (#5897) 5 months ago
Edenzzzz fbf33ecd01 [Feature] Enable PP + SP for llama (#5868) 5 months ago
Edenzzzz 8ec24b6a4d [Hotfix] Fix CUDA_DEVICE_MAX_CONNECTIONS for comm overlap 5 months ago
Haze188 3420921101 [shardformer] DeepseekMoE support (#5871) 5 months ago
Wang Binluo 6cd4c32be4 [shardformer] fix the moe (#5883) 5 months ago
Edenzzzz eb24fcd914 [Hotfix] Fix OPT gradient checkpointing forward 5 months ago
pre-commit-ci[bot] 7c2f79fa98 [pre-commit.ci] pre-commit autoupdate (#5572) 5 months ago
Jianghai 8ab46b4000 [Shardformer] change qwen2 modeling into gradient checkpointing style (#5874) 5 months ago
Haze188 416580b314 [MoE/ZeRO] Moe refactor with zero refactor (#5821) 5 months ago
flybird11111 773d9f964a [shardformer] delete xformers (#5859) 5 months ago
Runyu Lu 3c7cda0c9a [Inference] Lazy Init Support (#5785) 5 months ago
Guangyao Zhang d9d5e7ea1f [shardformer] Support the T5ForTokenClassification model (#5816) 5 months ago
GuangyaoZhang d84d68601a change 'xxx if xxx else None' to 'xxx or None' 5 months ago
pre-commit-ci[bot] 996c65077e [pre-commit.ci] auto fixes from pre-commit.com hooks 5 months ago
GuangyaoZhang a83a2336e8 rebase master llama change 5 months ago
GuangyaoZhang 363cde6957 merge model and attention forward 5 months ago
GuangyaoZhang 7a2b08646f Remove CohereLayerNorm and use existing layernorm 5 months ago
GuangyaoZhang fe2e74c03a fix precommit 5 months ago
GuangyaoZhang f656d61778 change command 5 months ago
GuangyaoZhang 0b81163bc0 Copy llama to command 5 months ago