281 Commits (d383449fc4300ae3caf9cf481fc87bb4757f00a4)

Author SHA1 Message Date
wangbluo dae39999d7 fix 3 months ago
Wenxuan Tan 7cf9df07bc [Hotfix] Fix llama fwd replacement bug (#6031) 3 months ago
Wang Binluo eea37da6fa [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) 3 months ago
wangbluo 698c8b9804 fix 3 months ago
wangbluo 6aface9316 fix 3 months ago
wangbluo 193030f696 fix 3 months ago
wangbluo eb5ba40def fix the merge 3 months ago
wangbluo 2ee6235cfa fix 3 months ago
wangbluo f7acfa1bd5 fix 3 months ago
wangbluo 53823118f2 fix 3 months ago
wangbluo 88b3f0698c fix the merge 3 months ago
Edenzzzz f1c3266a94 overlap kv comm with output rescale (#6017) 3 months ago
wangbluo 1a5847e6d1 fix the merge 3 months ago
wangbluo 02636c5bef fix the merge 3 months ago
pre-commit-ci[bot] 81272e9d00 [pre-commit.ci] auto fixes from pre-commit.com hooks 3 months ago
Edenzzzz f5c84af0b0 [Feature] Zigzag Ring attention (#5905) 3 months ago
Haze188 887d2d579b [misc] Bypass the huggingface bug to solve the mask mismatch problem (#5991) 3 months ago
Wang Binluo b2483c8e31 [fp8] support hybrid parallel plugin (#5982) 3 months ago
flybird11111 f1a3a326c4 [fp8] Moe support fp8 communication (#5977) 4 months ago
Edenzzzz b4d2377d4c [Hotfix] Avoid fused RMSnorm import error without apex (#5985) 4 months ago
flybird11111 0c10afd372 [FP8] rebase main (#5963) 4 months ago
Guangyao Zhang 53cb9606bd [Feature] llama shardformer fp8 support (#5938) 4 months ago
Wang Binluo 75c963686f [lora] lora support hybrid parallel plugin (#5956) 4 months ago
botbw 62cdac6b7b [chore] remove redundant test case, print string & reduce test tokens 4 months ago
haze188 7e737df5ad [misc] remove useless condition 4 months ago
haze188 70793ce9ed [misc] fix ci failure: change default value to false in moe plugin 4 months ago
hxwang cb01c0d5ce [moe] refactor mesh assignment 4 months ago
haze188 034020bd04 [misc] remove debug/print code 4 months ago
hxwang c3dc9b4dba [deepseek] replace attn (a workaround for bug in transformers) 4 months ago
haze188 b2952a5982 [moe] deepseek moe sp support 4 months ago
hxwang 70c9924d0d [chore] solve moe ckpt test failure and some other arg pass failure 4 months ago
hxwang 803878b2fd [moe] full test for deepseek and mixtral (pp + sp to fix) 4 months ago
hxwang 877d94bb8c [moe] init moe plugin comm setting with sp 4 months ago
hxwang 09d6280d3e [chore] minor fix 4 months ago
Haze188 404b16faf3 [Feature] MoE Ulysses Support (#5918) 4 months ago
hxwang 3e2b6132b7 [moe] clean legacy code 4 months ago
hxwang 74eccac0db [moe] test deepseek 4 months ago
botbw dc583aa576 [moe] implement tp 4 months ago
hxwang 102b784a10 [chore] arg pass & remove drop token 4 months ago
botbw 9b9b76bdcd [moe] add mixtral dp grad scaling when not all experts are activated 4 months ago
botbw b5bfeb2efd [moe] implement transit between non moe tp and ep 4 months ago
botbw 37443cc7e4 [test] pass mixtral shardformer test 4 months ago
hxwang 46c069b0db [zero] solve hang 4 months ago
hxwang 0b76b57cd6 [test] add mixtral transformer test 4 months ago
hxwang f9b6fcf81f [test] add mixtral for sequence classification 4 months ago
Hongxin Liu 7b38964e3a [shardformer] hotfix attn mask (#5947) 4 months ago
Hongxin Liu 9664b1bc19 [shardformer] hotfix attn mask (#5945) 4 months ago
Insu Jang a521ffc9f8 Add n_fused as an input from native_module (#5894) 4 months ago
GuangyaoZhang 5b969fd831 fix shardformer fp8 communication training degradation 4 months ago
GuangyaoZhang 6a20f07b80 remove all to all 4 months ago