Guangyao Zhang
53cb9606bd
[Feature] llama shardformer fp8 support (#5938)
* add llama shardformer fp8
* Llama Shardformer Parity
* fix typo
* fix all reduce
* fix pytest failure
* fix reduce op and move function to fp8.py
* fix typo
4 months ago
Hanks
c297e21bea
Merge pull request #5961 from ver217/feature/zeor-fp8
[fp8] add fp8 comm for low level zero
4 months ago
YeAnbang
fe71917851
Merge pull request #5962 from hpcaitech/colossalchat
[Chat] Support overall loss, update KTO logging
4 months ago
YeAnbang
0b2d55c4ab
Support overall loss, update KTO logging
4 months ago
ver217
91e596d017
[test] add zero fp8 test case
4 months ago
ver217
ae486ce005
[fp8] add fp8 comm for low level zero
4 months ago
Wang Binluo
75c963686f
[lora] lora support hybrid parallel plugin (#5956)
* lora support hybrid plugin
* fix
* fix
* fix
* fix
4 months ago
Tong Li
19d1510ea2
[feat] Dist Loader for Eval (#5950)
* support auto distributed data loader
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* support auto distributed data loader
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* fix tp error
* remove unused parameters
* remove unused
* update inference
* update docs
* update inference
---------
Co-authored-by: Michelle <qianranma8@gmail.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
4 months ago
botbw
62cdac6b7b
[chore] remove redundant test case, print string & reduce test tokens
4 months ago
botbw
d1d1ab871e
[moe] solve dp axis issue
4 months ago
botbw
65daa87627
[doc] add MoeHybridParallelPlugin docstring
4 months ago
hxwang
7bedd03739
[moe] remove force_overlap_comm flag and add warning instead
4 months ago
hxwang
f7c5485ed6
[chore] docstring
4 months ago
haze188
7e737df5ad
[misc] remove useless condition
4 months ago
haze188
70793ce9ed
[misc] fix ci failure: change default value to false in moe plugin
4 months ago
haze188
12d043ca00
[misc] remove incompatible test config
4 months ago
hxwang
606b0891ed
[chore] change moe_pg_mesh to private
4 months ago
hxwang
5b4c12381b
Revert "[moe] implement submesh initialization"
This reverts commit 2f9bce6686.
4 months ago
hxwang
cb01c0d5ce
[moe] refactor mesh assignment
4 months ago
haze188
034020bd04
[misc] remove debug/print code
4 months ago
haze188
59bcf56c60
[misc] skip redundant test
4 months ago
hxwang
c3dc9b4dba
[deepseek] replace attn (a workaround for a bug in transformers)
4 months ago
hxwang
6c39f0b144
[test] add check
4 months ago
haze188
b2952a5982
[moe] deepseek moe sp support
4 months ago
botbw
96d0fbc531
[bug] fix: somehow logger hangs the program
4 months ago
hxwang
067e18f7e9
[test] fix test: test_zero1_2
4 months ago
hxwang
74b03de3f9
[moe] remove ops
4 months ago
hxwang
70c9924d0d
[chore] solve moe ckpt test failure and some other arg pass failure
4 months ago
pre-commit-ci[bot]
52d346f2a5
[pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
4 months ago
hxwang
46037c2ccd
[chore] minor fix after rebase
4 months ago
hxwang
803878b2fd
[moe] full test for deepseek and mixtral (pp + sp to fix)
4 months ago
hxwang
7077d38d5a
[moe] finalize test (no pp)
4 months ago
haze188
2cddeac717
moe sp + ep bug fix
4 months ago
hxwang
877d94bb8c
[moe] init moe plugin comm setting with sp
4 months ago
hxwang
09d6280d3e
[chore] minor fix
4 months ago
Haze188
404b16faf3
[Feature] MoE Ulysses Support (#5918)
* moe sp support
* moe sp bug solve
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
---------
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
4 months ago
hxwang
3e2b6132b7
[moe] clean legacy code
4 months ago
hxwang
74eccac0db
[moe] test deepseek
4 months ago
botbw
dc583aa576
[moe] implement tp
4 months ago
botbw
0b5bbe9ce4
[test] add mixtral modelling test
4 months ago
hxwang
102b784a10
[chore] arg pass & remove drop token
4 months ago
botbw
8dbb86899d
[chore] trivial fix
4 months ago
botbw
014faf6c5a
[chore] manually revert unintended commit
4 months ago
botbw
9b9b76bdcd
[moe] add mixtral dp grad scaling when not all experts are activated
4 months ago
botbw
e28e05345b
[moe] implement submesh initialization
4 months ago
haze188
5ed5e8cfba
solve hang when parallel mode = pp + dp
4 months ago
haze188
fe24789eb1
[misc] solve booster hang by renaming the variable
4 months ago
botbw
13b48ac0aa
[zero] solve hang
4 months ago
botbw
b5bfeb2efd
[moe] implement transit between non moe tp and ep
4 months ago
botbw
37443cc7e4
[test] pass mixtral shardformer test
4 months ago