104 Commits (ckpt)

Author SHA1 Message Date
Tong Li 4c8e85ee0d [Coati] Train DPO using PP (#6054) 1 month ago
botbw 4fa6b9509c [moe] add parallel strategy for shared_expert && fix test for deepseek (#6063) 2 months ago
botbw 696fced0d7 [fp8] fix missing fp8_comm flag in mixtral (#6057) 2 months ago
botbw c54c4fcd15 [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048) 2 months ago
Wenxuan Tan 8fd25d6e09 [Feature] Split cross-entropy computation in SP (#5959) 2 months ago
Wang Binluo eea37da6fa [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) 3 months ago
wangbluo 698c8b9804 fix 3 months ago
wangbluo 193030f696 fix 3 months ago
wangbluo eb5ba40def fix the merge 3 months ago
wangbluo 53823118f2 fix 3 months ago
pre-commit-ci[bot] 81272e9d00 [pre-commit.ci] auto fixes from pre-commit.com hooks 3 months ago
Edenzzzz f5c84af0b0 [Feature] Zigzag Ring attention (#5905) 3 months ago
Haze188 887d2d579b [misc] Bypass the huggingface bug to solve the mask mismatch problem (#5991) 3 months ago
Wang Binluo b2483c8e31 [fp8] support hybrid parallel plugin (#5982) 3 months ago
flybird11111 f1a3a326c4 [fp8]Moe support fp8 communication (#5977) 4 months ago
flybird11111 0c10afd372 [FP8] rebase main (#5963) 4 months ago
Guangyao Zhang 53cb9606bd [Feature] llama shardformer fp8 support (#5938) 4 months ago
botbw 62cdac6b7b [chore] remove redundant test case, print string & reduce test tokens 4 months ago
hxwang cb01c0d5ce [moe] refactor mesh assignment 4 months ago
haze188 034020bd04 [misc] remove debug/print code 4 months ago
haze188 b2952a5982 [moe] deepseek moe sp support 4 months ago
hxwang 70c9924d0d [chore] solve moe ckpt test failure and some other arg pass failure 4 months ago
hxwang 803878b2fd [moe] full test for deepseek and mixtral (pp + sp to fix) 4 months ago
hxwang 877d94bb8c [moe] init moe plugin comm setting with sp 4 months ago
Haze188 404b16faf3 [Feature] MoE Ulysses Support (#5918) 4 months ago
hxwang 3e2b6132b7 [moe] clean legacy code 4 months ago
hxwang 74eccac0db [moe] test deepseek 4 months ago
botbw dc583aa576 [moe] implement tp 4 months ago
hxwang 102b784a10 [chore] arg pass & remove drop token 4 months ago
botbw 9b9b76bdcd [moe] add mixtral dp grad scaling when not all experts are activated 4 months ago
botbw b5bfeb2efd [moe] implement transit between non moe tp and ep 4 months ago
hxwang 0b76b57cd6 [test] add mixtral transformer test 4 months ago
Hongxin Liu 9664b1bc19 [shardformer] hotfix attn mask (#5945) 4 months ago
GuangyaoZhang 6a20f07b80 remove all to all 4 months ago
GuangyaoZhang 5a310b9ee1 fix rebase 4 months ago
GuangyaoZhang 457a0de79f shardformer fp8 4 months ago
Guangyao Zhang 1c961b20f3 [ShardFormer] fix qwen2 sp (#5903) 4 months ago
Guangyao Zhang 669849d74b [ShardFormer] Add Ulysses Sequence Parallelism support for Command-R, Qwen2 and ChatGLM (#5897) 5 months ago
Edenzzzz fbf33ecd01 [Feature] Enable PP + SP for llama (#5868) 5 months ago
Haze188 3420921101 [shardformer] DeepseekMoE support (#5871) 5 months ago
Edenzzzz eb24fcd914 [Hotfix] Fix OPT gradient checkpointing forward 5 months ago
pre-commit-ci[bot] 7c2f79fa98 [pre-commit.ci] pre-commit autoupdate (#5572) 5 months ago
Jianghai 8ab46b4000 [Shardformer] change qwen2 modeling into gradient checkpointing style (#5874) 5 months ago
Haze188 416580b314 [MoE/ZeRO] Moe refactor with zero refactor (#5821) 5 months ago
flybird11111 773d9f964a [shardformer]delete xformers (#5859) 5 months ago
Guangyao Zhang d9d5e7ea1f [shardformer] Support the T5ForTokenClassification model (#5816) 5 months ago
GuangyaoZhang d84d68601a change 'xxx if xxx else None' to 'xxx or None' 5 months ago
pre-commit-ci[bot] 996c65077e [pre-commit.ci] auto fixes from pre-commit.com hooks 5 months ago
GuangyaoZhang a83a2336e8 rebase master llama change 5 months ago
GuangyaoZhang 363cde6957 merge model and attention forward 5 months ago