1186 Commits (162251ab7844e4116a36d6e0fec2ac7ccd03f74d)

Author SHA1 Message Date
Hongxin Liu dc2cdaf3e8 [shardformer] optimize seq parallelism (#6086) 1 month ago
Hongxin Liu 646b3c5a90 [shardformer] fix linear 1d row and support uneven splits for fused qkv linear (#6084) 1 month ago
botbw 4fa6b9509c [moe] add parallel strategy for shared_expert && fix test for deepseek (#6063) 2 months ago
Guangyao Zhang f20b066c59 [fp8] Disable all_gather intranode. Disable Redundant all_gather fp8 (#6059) 2 months ago
botbw c54c4fcd15 [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048) 2 months ago
Wenxuan Tan 8fd25d6e09 [Feature] Split cross-entropy computation in SP (#5959) 2 months ago
Hongxin Liu b3db1058ec [release] update version (#6041) 2 months ago
Wang Binluo eea37da6fa [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) 3 months ago
wangbluo 2e4cbe3a2d fix 3 months ago
Hongxin Liu 26493b97d3 [misc] update compatibility (#6008) 3 months ago
Edenzzzz f5c84af0b0 [Feature] Zigzag Ring attention (#5905) 3 months ago
flybird11111 20722a8c93 [fp8]update reduce-scatter test (#6002) 3 months ago
flybird11111 597b206001 [fp8] support asynchronous FP8 communication (#5997) 3 months ago
Hongxin Liu 8241c0c054 [fp8] support gemini plugin (#5978) 4 months ago
Hanks b480eec738 [Feature]: support FP8 communication in DDP, FSDP, Gemini (#5928) 4 months ago
Hongxin Liu ccabcf6485 [fp8] support fp8 amp for hybrid parallel plugin (#5975) 4 months ago
Hongxin Liu 76ea16466f [fp8] add fp8 linear (#5967) 4 months ago
flybird11111 afb26de873 [fp8]support all2all fp8 (#5953) 4 months ago
flybird11111 0c10afd372 [FP8] rebase main (#5963) 4 months ago
Guangyao Zhang 53cb9606bd [Feature] llama shardformer fp8 support (#5938) 4 months ago
ver217 91e596d017 [test] add zero fp8 test case 4 months ago
Wang Binluo 75c963686f
[lora] lora support hybrid parallel plugin (#5956) 4 months ago
botbw 62cdac6b7b [chore] remove redundant test case, print string & reduce test tokens 4 months ago
haze188 70793ce9ed [misc] fix ci failure: change default value to false in moe plugin 4 months ago
haze188 12d043ca00 [misc] remove incompatible test config 4 months ago
hxwang cb01c0d5ce [moe] refactor mesh assignment 4 months ago
haze188 034020bd04 [misc] remove debug/print code 4 months ago
haze188 59bcf56c60 [misc] skip redunant test 4 months ago
hxwang c3dc9b4dba [deepseek] replace attn (a workaround for bug in transformers) 4 months ago
hxwang 6c39f0b144 [test] add check 4 months ago
haze188 b2952a5982 [moe] deepseek moe sp support 4 months ago
hxwang 067e18f7e9 [test] fix test: test_zero1_2 4 months ago
hxwang 70c9924d0d [chore] solve moe ckpt test failure and some other arg pass failure 4 months ago
pre-commit-ci[bot] 52d346f2a5 [pre-commit.ci] auto fixes from pre-commit.com hooks 4 months ago
hxwang 46037c2ccd [chore] minor fix after rebase 4 months ago
hxwang 803878b2fd [moe] full test for deepseek and mixtral (pp + sp to fix) 4 months ago
hxwang 7077d38d5a [moe] finalize test (no pp) 4 months ago
haze188 2cddeac717 moe sp + ep bug fix 4 months ago
hxwang 877d94bb8c [moe] init moe plugin comm setting with sp 4 months ago
hxwang 09d6280d3e [chore] minor fix 4 months ago
Haze188 404b16faf3 [Feature] MoE Ulysses Support (#5918) 4 months ago
hxwang 3e2b6132b7 [moe] clean legacy code 4 months ago
hxwang 74eccac0db [moe] test deepseek 4 months ago
botbw dc583aa576 [moe] implement tp 4 months ago
botbw 0b5bbe9ce4 [test] add mixtral modelling test 4 months ago
hxwang 102b784a10 [chore] arg pass & remove drop token 4 months ago
botbw 9b9b76bdcd [moe] add mixtral dp grad scaling when not all experts are activated 4 months ago
botbw e28e05345b [moe] implement submesh initialization 4 months ago
haze188 5ed5e8cfba solve hang when parallel mode = pp + dp 4 months ago
haze188 fe24789eb1 [misc] solve booster hang by rename the variable 4 months ago