Commit Graph

3763 Commits (5caad13055e802e2665f1d70593116103a72395a)

Author SHA1 Message Date
hxwang 7077d38d5a [moe] finalize test (no pp) 2024-08-01 10:06:59 +08:00
haze188 2cddeac717 moe sp + ep bug fix 2024-08-01 10:06:59 +08:00
hxwang 877d94bb8c [moe] init moe plugin comm setting with sp 2024-08-01 10:06:59 +08:00
hxwang 09d6280d3e [chore] minor fix 2024-08-01 10:06:59 +08:00
Haze188 404b16faf3 [Feature] MoE Ulysses Support (#5918)
* moe sp support

* moe sp bug solve

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-01 10:06:59 +08:00
hxwang 3e2b6132b7 [moe] clean legacy code 2024-08-01 10:06:59 +08:00
hxwang 74eccac0db [moe] test deepseek 2024-08-01 10:06:59 +08:00
botbw dc583aa576 [moe] implement tp 2024-08-01 10:06:59 +08:00
botbw 0b5bbe9ce4 [test] add mixtral modelling test 2024-08-01 10:06:59 +08:00
hxwang 102b784a10 [chore] arg pass & remove drop token 2024-08-01 10:06:59 +08:00
botbw 8dbb86899d [chore] trivial fix 2024-08-01 10:06:59 +08:00
botbw 014faf6c5a [chore] manually revert unintended commit 2024-08-01 10:06:59 +08:00
botbw 9b9b76bdcd [moe] add mixtral dp grad scaling when not all experts are activated 2024-08-01 10:06:59 +08:00
botbw e28e05345b [moe] implement submesh initialization 2024-08-01 10:06:59 +08:00
haze188 5ed5e8cfba solve hang when parallel mode = pp + dp 2024-08-01 10:06:59 +08:00
haze188 fe24789eb1 [misc] solve booster hang by renaming the variable 2024-08-01 10:06:59 +08:00
botbw 13b48ac0aa [zero] solve hang 2024-08-01 10:06:59 +08:00
botbw b5bfeb2efd [moe] implement transit between non moe tp and ep 2024-08-01 10:06:59 +08:00
botbw 37443cc7e4 [test] pass mixtral shardformer test 2024-08-01 10:06:59 +08:00
hxwang 46c069b0db [zero] solve hang 2024-08-01 10:06:59 +08:00
hxwang 0fad23c691 [chore] handle non member group 2024-08-01 10:06:59 +08:00
hxwang a249e71946 [test] mixtral pp shard test 2024-08-01 10:06:59 +08:00
hxwang 8ae8525bdf [moe] fix plugin 2024-08-01 10:06:59 +08:00
hxwang 0b76b57cd6 [test] add mixtral transformer test 2024-08-01 10:06:59 +08:00
hxwang f9b6fcf81f [test] add mixtral for sequence classification 2024-08-01 10:06:59 +08:00
Tong Li 1aeb5e8847 [hotfix] Remove unused plan section (#5957)
* remove readme

* fix readme

* update
2024-07-31 17:47:46 +08:00
YeAnbang 66fbf2ecb7 Update README.md (#5958) 2024-07-31 17:44:09 +08:00
YeAnbang 30f4e31a33 [Chat] Fix lora (#5946)
* fix merging

* remove filepath

* fix style
2024-07-31 14:10:17 +08:00
Hongxin Liu 09c5f72595 [release] update version (#5952) 2024-07-31 10:04:50 +08:00
Hongxin Liu 060892162a [zero] hotfix update master params (#5951) 2024-07-30 13:36:00 +08:00
Runyu Lu bcf0181ecd [Feat] Distrifusion Acceleration Support for Diffusion Inference (#5895)
* Distrifusion Support source

* comp comm overlap optimization

* sd3 benchmark

* pixart distrifusion bug fix

* sd3 bug fix and benchmark

* generation bug fix

* naming fix

* add docstring, fix counter and shape error

* add reference

* readme and requirement
2024-07-30 10:43:26 +08:00
Hongxin Liu 7b38964e3a [shardformer] hotfix attn mask (#5947) 2024-07-29 19:10:06 +08:00
Hongxin Liu 9664b1bc19 [shardformer] hotfix attn mask (#5945) 2024-07-29 13:58:27 +08:00
YeAnbang c8332b9cb5 Merge pull request #5922 from hpcaitech/kto
[Chat] Add KTO
2024-07-29 13:27:00 +08:00
YeAnbang 6fd9e86864 fix style 2024-07-29 01:29:18 +00:00
YeAnbang de1bf08ed0 fix style 2024-07-26 10:07:15 +00:00
YeAnbang 8a3ff4f315 fix style 2024-07-26 09:55:15 +00:00
zhurunhua ad35a987d3 [Feature] Add a switch to control whether the model checkpoint needs to be saved after each epoch ends (#5941)
* Add a switch to control whether the model checkpoint needs to be saved after each epoch ends

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-07-26 11:15:20 +08:00
Edenzzzz 2069472e96 [Hotfix] Fix ZeRO typo #5936
Co-authored-by: Edenzzzz <wtan45@wisc.edu>
2024-07-25 09:59:58 +08:00
Hongxin Liu 5fd0592767 [fp8] support all-gather flat tensor (#5932) 2024-07-24 16:55:20 +08:00
Gao, Ruiyuan 5fb958cc83 [FIX BUG] convert env param to int in (#5934) 2024-07-24 10:30:40 +08:00
Insu Jang a521ffc9f8 Add n_fused as an input from native_module (#5894) 2024-07-23 23:15:39 +08:00
YeAnbang 9688e19b32 remove real data path 2024-07-22 06:13:02 +00:00
YeAnbang b0e15d563e remove real data path 2024-07-22 06:11:38 +00:00
YeAnbang 12fe8b5858 refactor evaluation 2024-07-22 05:57:39 +00:00
YeAnbang c5f582f666 fix test data 2024-07-22 01:31:32 +00:00
zhurunhua 4ec17a7cdf [FIX BUG] UnboundLocalError: cannot access local variable 'default_conversation' where it is not associated with a value (#5931)
* cannot access local variable 'default_conversation' where it is not associated with a value

set default value for 'default_conversation'

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-07-21 19:46:01 +08:00
YeAnbang 150505cbb8 Merge branch 'kto' of https://github.com/hpcaitech/ColossalAI into kto 2024-07-19 10:11:05 +00:00
YeAnbang d49550fb49 refactor tokenization 2024-07-19 10:10:48 +00:00
Tong Li d08c99be0d Merge branch 'main' into kto 2024-07-19 15:23:31 +08:00