Hongxin Liu | da39d21b71 | 2024-02-07 19:21:02 +08:00
[moe] support mixtral (#5309)
* [moe] add mixtral block for single expert
* [moe] mixtral block fwd support uneven ep
* [moe] mixtral block bwd support uneven ep
* [moe] add mixtral moe layer
* [moe] simplify replace
* [moe] support save sharded mixtral
* [moe] support load sharded mixtral
* [moe] support save sharded optim
* [moe] integrate moe manager into plugin
* [moe] fix optimizer load
* [moe] fix mixtral layer
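
For orientation, here is a minimal single-process sketch of the Mixtral-style sparse MoE block the commits above add: top-2 gating over SwiGLU experts. Class and argument names are illustrative only, and the expert-parallel paths (the uneven-EP forward/backward) and sharded checkpointing from this PR are omitted.

```python
# Illustrative sketch of a Mixtral-style sparse MoE block (top-2 routing
# over SwiGLU experts). Names are hypothetical, not ColossalAI's API.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixtralExpert(nn.Module):
    def __init__(self, hidden: int, intermediate: int):
        super().__init__()
        self.w1 = nn.Linear(hidden, intermediate, bias=False)  # gate projection
        self.w3 = nn.Linear(hidden, intermediate, bias=False)  # up projection
        self.w2 = nn.Linear(intermediate, hidden, bias=False)  # down projection

    def forward(self, x):
        return self.w2(F.silu(self.w1(x)) * self.w3(x))  # SwiGLU

class SparseMoeBlock(nn.Module):
    def __init__(self, hidden: int, intermediate: int,
                 n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(hidden, n_experts, bias=False)
        self.experts = nn.ModuleList(
            [MixtralExpert(hidden, intermediate) for _ in range(n_experts)]
        )

    def forward(self, x):
        b, s, h = x.shape
        tokens = x.reshape(-1, h)                       # flatten to (b*s, h)
        logits = self.gate(tokens)                      # (b*s, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # top-2 routing
        weights = F.softmax(weights, dim=-1)            # renormalize top-2 scores
        out = torch.zeros_like(tokens)
        for e, expert in enumerate(self.experts):
            # tokens routed to expert e, and which of their top-k slots hit it
            token_ids, k_ids = (idx == e).nonzero(as_tuple=True)
            if token_ids.numel():
                out[token_ids] += (weights[token_ids, k_ids].unsqueeze(1)
                                   * expert(tokens[token_ids]))
        return out.reshape(b, s, h)

moe = SparseMoeBlock(hidden=64, intermediate=128)
print(moe(torch.randn(2, 5, 64)).shape)  # torch.Size([2, 5, 64])
```

In the expert-parallel version, `self.experts` would hold only the locally owned experts (possibly an uneven share per rank) and tokens would be exchanged via all-to-all before and after the expert computation.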
digger yu | bce9499ed3 | 2024-01-25 13:56:27 +08:00
fix some typos (#5307)
Wenhao Chen | 3c08f17348 | 2023-11-17 10:53:00 +08:00
[hotfix]: modify create_ep_hierarchical_group and add test (#5032)
* feat: modify create_ep_hierarchical_group args
* test: add ep tests
* fix: remove get_process_group_ranks
* fix: fix src_rank
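
Hierarchical EP grouping, as in #5032, splits expert-parallel communication into an intra-node stage (fast NVLink all-to-all) and an inter-node stage. A hedged sketch of that kind of group construction with `torch.distributed` follows; the function signature is illustrative and does not reproduce the actual `create_ep_hierarchical_group` args this PR modifies.

```python
# Sketch of hierarchical expert-parallel group creation: one intra-node
# group per node, one inter-node group per local rank index. Assumes
# dist.init_process_group has already been called on every rank.
import torch.distributed as dist

def create_hierarchical_ep_groups(world_size: int, ranks_per_node: int):
    """Return (intra_group, inter_group) for the calling rank.

    Every rank must call this collectively, since dist.new_group is a
    collective over the default process group.
    """
    rank = dist.get_rank()
    n_nodes = world_size // ranks_per_node
    intra_group = inter_group = None

    # Intra-node groups: consecutive ranks sharing a node.
    for node in range(n_nodes):
        ranks = list(range(node * ranks_per_node, (node + 1) * ranks_per_node))
        g = dist.new_group(ranks)
        if rank in ranks:
            intra_group = g

    # Inter-node groups: ranks holding the same local index across nodes.
    for local in range(ranks_per_node):
        ranks = [node * ranks_per_node + local for node in range(n_nodes)]
        g = dist.new_group(ranks)
        if rank in ranks:
            inter_group = g

    return intra_group, inter_group
```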
Xuanlei Zhao | f71e63b0f3 | 2023-11-08 15:07:03 +00:00
[moe] support optimizer checkpoint (#5015)
* Refactor MoE Manager setup method
* unshard optim ckpt
* optim io
* update transformer version
* update requirements
* update ckpt
* update ckpt
* update ckpt
* fix engine
* fix engine
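
At its core, MoE optimizer checkpointing as in #5015 means each expert-parallel rank persists the optimizer states for the expert parameters it owns. A minimal sketch under that assumption (the file layout and helper names are hypothetical, not ColossalAI's checkpoint format):

```python
# Sketch of per-rank sharded optimizer checkpointing: each EP rank saves
# and restores only its own optimizer-state shard, since expert states
# differ across ranks. Assumes torch.distributed is initialized.
import os
import torch
import torch.distributed as dist

def save_sharded_optim(optimizer, ckpt_dir: str):
    rank = dist.get_rank()
    os.makedirs(ckpt_dir, exist_ok=True)
    # Each rank writes its own shard under a rank-specific filename.
    torch.save(optimizer.state_dict(),
               os.path.join(ckpt_dir, f"optim.rank{rank}.pt"))

def load_sharded_optim(optimizer, ckpt_dir: str):
    rank = dist.get_rank()
    state = torch.load(os.path.join(ckpt_dir, f"optim.rank{rank}.pt"),
                       map_location="cpu")
    optimizer.load_state_dict(state)
```

Producing an unsharded checkpoint (the "unshard optim ckpt" item above) would additionally gather the expert states from all EP ranks before writing a single file.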
Xuanlei Zhao | dc003c304c | 2023-11-02 02:21:24 +00:00
[moe] merge moe into main (#4978)
* update moe module
* support openmoe
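
MoE models such as OpenMoE are typically trained with an auxiliary load-balancing loss on the router so that tokens spread evenly across experts. A hedged sketch of the standard Switch-Transformer-style formulation follows; it is not necessarily the exact loss used in this PR.

```python
# Switch-style load-balancing auxiliary loss: penalizes correlation between
# the fraction of tokens dispatched to each expert and the mean router
# probability per expert. Standard formulation, not ColossalAI's exact code.
import torch
import torch.nn.functional as F

def load_balancing_loss(router_logits: torch.Tensor, top_k: int = 2) -> torch.Tensor:
    """router_logits: (num_tokens, num_experts)."""
    n_experts = router_logits.size(-1)
    probs = F.softmax(router_logits, dim=-1)
    _, idx = probs.topk(top_k, dim=-1)
    # Fraction of routing assignments dispatched to each expert.
    dispatch = F.one_hot(idx, n_experts).float().mean(dim=(0, 1))
    # Mean router probability assigned to each expert.
    importance = probs.mean(dim=0)
    return n_experts * torch.sum(dispatch * importance)

print(load_balancing_loss(torch.randn(16, 8)))  # scalar aux loss
```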