Commit Graph

3764 Commits (cf519dac6a5799b8f314aac6f510e2a98d3af9c6)

Author        SHA1        Date          Message
hxwang        803878b2fd  4 months ago  [moe] full test for deepseek and mixtral (pp + sp to fix)
hxwang        7077d38d5a  4 months ago  [moe] finalize test (no pp)
haze188       2cddeac717  4 months ago  moe sp + ep bug fix
hxwang        877d94bb8c  4 months ago  [moe] init moe plugin comm setting with sp
hxwang        09d6280d3e  4 months ago  [chore] minor fix
Haze188       404b16faf3  4 months ago  [Feature] MoE Ulysses Support (#5918)
hxwang        3e2b6132b7  4 months ago  [moe] clean legacy code
hxwang        74eccac0db  4 months ago  [moe] test deepseek
botbw         dc583aa576  4 months ago  [moe] implement tp
botbw         0b5bbe9ce4  4 months ago  [test] add mixtral modelling test
hxwang        102b784a10  4 months ago  [chore] arg pass & remove drop token
botbw         8dbb86899d  4 months ago  [chore] trivial fix
botbw         014faf6c5a  4 months ago  [chore] manually revert unintended commit
botbw         9b9b76bdcd  4 months ago  [moe] add mixtral dp grad scaling when not all experts are activated
botbw         e28e05345b  4 months ago  [moe] implement submesh initialization
haze188       5ed5e8cfba  4 months ago  solve hang when parallel mode = pp + dp
haze188       fe24789eb1  4 months ago  [misc] solve booster hang by rename the variable
botbw         13b48ac0aa  4 months ago  [zero] solve hang
botbw         b5bfeb2efd  4 months ago  [moe] implement transit between non moe tp and ep
botbw         37443cc7e4  4 months ago  [test] pass mixtral shardformer test
hxwang        46c069b0db  4 months ago  [zero] solve hang
hxwang        0fad23c691  4 months ago  [chore] handle non member group
hxwang        a249e71946  4 months ago  [test] mixtra pp shard test
hxwang        8ae8525bdf  4 months ago  [moe] fix plugin
hxwang        0b76b57cd6  4 months ago  [test] add mixtral transformer test
hxwang        f9b6fcf81f  4 months ago  [test] add mixtral for sequence classification
Tong Li       1aeb5e8847  4 months ago  [hotfix] Remove unused plan section (#5957)
YeAnbang      66fbf2ecb7  4 months ago  Update README.md (#5958)
YeAnbang      30f4e31a33  4 months ago  [Chat] Fix lora (#5946)
Hongxin Liu   09c5f72595  4 months ago  [release] update version (#5952)
Hongxin Liu   060892162a  4 months ago  [zero] hotfix update master params (#5951)
Runyu Lu      bcf0181ecd  4 months ago  [Feat] Distrifusion Acceleration Support for Diffusion Inference (#5895)
Hongxin Liu   7b38964e3a  4 months ago  [shardformer] hotfix attn mask (#5947)
Hongxin Liu   9664b1bc19  4 months ago  [shardformer] hotfix attn mask (#5945)
YeAnbang      c8332b9cb5  4 months ago  Merge pull request #5922 from hpcaitech/kto
YeAnbang      6fd9e86864  4 months ago  fix style
YeAnbang      de1bf08ed0  4 months ago  fix style
YeAnbang      8a3ff4f315  4 months ago  fix style
zhurunhua     ad35a987d3  4 months ago  [Feature] Add a switch to control whether the model checkpoint needs to be saved after each epoch ends (#5941)
Edenzzzz      2069472e96  4 months ago  [Hotfix] Fix ZeRO typo #5936
Hongxin Liu   5fd0592767  4 months ago  [fp8] support all-gather flat tensor (#5932)
Gao, Ruiyuan  5fb958cc83  4 months ago  [FIX BUG] convert env param to int in (#5934)
Insu Jang     a521ffc9f8  4 months ago  Add n_fused as an input from native_module (#5894)
YeAnbang      9688e19b32  4 months ago  remove real data path
YeAnbang      b0e15d563e  4 months ago  remove real data path
YeAnbang      12fe8b5858  4 months ago  refactor evaluation
YeAnbang      c5f582f666  4 months ago  fix test data
zhurunhua     4ec17a7cdf  4 months ago  [FIX BUG] UnboundLocalError: cannot access local variable 'default_conversation' where it is not associated with a value (#5931)
YeAnbang      150505cbb8  4 months ago  Merge branch 'kto' of https://github.com/hpcaitech/ColossalAI into kto
YeAnbang      d49550fb49  4 months ago  refactor tokenization