HELSON | 0f2d219162 | [MOE] add MOEGPT model (#510) | 2022-03-24 17:39:21 +08:00
Jiarui Fang | a445e118cf | [polish] polish singleton and global context (#500) | 2022-03-23 18:03:39 +08:00
HELSON | c9023d4078 | [MOE] support PR-MOE (#488) | 2022-03-22 16:48:22 +08:00
HELSON | 7544347145 | [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469) | 2022-03-21 13:35:04 +08:00
HELSON | dbdc9a7783 | added Multiply Jitter and capacity factor eval for MOE (#434) | 2022-03-16 16:47:44 +08:00
1SAA | 82023779bb | Added TPExpert for special situation | 2022-03-11 15:50:28 +08:00
1SAA | 219df6e685 | Optimized MoE layer and fixed some bugs; Decreased moe tests; Added FFNExperts and ViTMoE model | 2022-03-11 15:50:28 +08:00
HELSON | 1ff5be36c2 | Added moe parallel example (#140) | 2022-01-17 15:34:04 +08:00
HELSON | dceae85195 | Added MoE parallel (#127) | 2022-01-07 15:08:36 +08:00