ColossalAI/colossalai/moe
Hongxin Liu da39d21b71 [moe] support mixtral (#5309)
* [moe] add mixtral block for single expert

* [moe] mixtral block fwd support uneven ep

* [moe] mixtral block bwd support uneven ep

* [moe] add mixtral moe layer

* [moe] simplify replace

* [moe] support save sharded mixtral

* [moe] support load sharded mixtral

* [moe] support save sharded optim

* [moe] integrate moe manager into plugin

* [moe] fix optimizer load

* [moe] fix mixtral layer
2024-02-07 19:21:02 +08:00
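The commits above add a Mixtral-style sparse MoE block: a top-k router over per-expert gated MLPs. The sketch below is a minimal, self-contained PyTorch illustration of that general structure only; the names (MixtralStyleMLP, SparseMoEBlock, top_k) are assumptions for illustration, it is not ColossalAI's actual implementation, and it omits the expert-parallel (uneven EP) dispatch and sharded checkpointing that the commits also cover.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MixtralStyleMLP(nn.Module):
    """Single expert: a SwiGLU-style gated MLP, as used per expert in Mixtral.
    Illustrative only, not ColossalAI's class."""

    def __init__(self, hidden_size: int, intermediate_size: int):
        super().__init__()
        self.w1 = nn.Linear(hidden_size, intermediate_size, bias=False)
        self.w3 = nn.Linear(hidden_size, intermediate_size, bias=False)
        self.w2 = nn.Linear(intermediate_size, hidden_size, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.w2(F.silu(self.w1(x)) * self.w3(x))


class SparseMoEBlock(nn.Module):
    """Illustrative sparse MoE layer: route each token to its top-k experts
    and combine the expert outputs with renormalized routing weights."""

    def __init__(self, hidden_size: int, intermediate_size: int,
                 num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(hidden_size, num_experts, bias=False)
        self.experts = nn.ModuleList(
            [MixtralStyleMLP(hidden_size, intermediate_size) for _ in range(num_experts)]
        )

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        batch, seq, hidden = hidden_states.shape
        x = hidden_states.view(-1, hidden)                    # (tokens, hidden)
        routing = F.softmax(self.gate(x), dim=-1)             # (tokens, experts)
        topk_w, topk_idx = routing.topk(self.top_k, dim=-1)   # (tokens, k)
        topk_w = topk_w / topk_w.sum(dim=-1, keepdim=True)    # renormalize over chosen experts

        out = torch.zeros_like(x)
        for expert_id, expert in enumerate(self.experts):
            # Gather all tokens that picked this expert in any of their k slots.
            token_ids, k_slot = (topk_idx == expert_id).nonzero(as_tuple=True)
            if token_ids.numel() == 0:
                continue                                       # no token routed here
            expert_out = expert(x[token_ids])
            out.index_add_(0, token_ids,
                           expert_out * topk_w[token_ids, k_slot].unsqueeze(-1))
        return out.view(batch, seq, hidden)


# Quick shape check
block = SparseMoEBlock(hidden_size=64, intermediate_size=128, num_experts=4, top_k=2)
assert block(torch.randn(2, 5, 64)).shape == (2, 5, 64)
```

Looping over experts and gathering the tokens routed to each (rather than looping over tokens) keeps each expert's work as one dense matmul, which is also the unit of computation that expert parallelism spreads across ranks.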
File              Last commit message                       Last commit date
__init__.py       [moe] init mixtral impl                   2024-02-07 19:21:02 +08:00
_operation.py     [moe] support mixtral (#5309)             2024-02-07 19:21:02 +08:00
checkpoint.py     [moe] init mixtral impl                   2024-02-07 19:21:02 +08:00
experts.py        [moe] init mixtral impl                   2024-02-07 19:21:02 +08:00
layers.py         [moe] init mixtral impl                   2024-02-07 19:21:02 +08:00
load_balance.py   [moe] merge moe into main (#4978)         2023-11-02 02:21:24 +00:00
loss.py           [moe] merge moe into main (#4978)         2023-11-02 02:21:24 +00:00
manager.py        fix some typo (#5307)                     2024-01-25 13:56:27 +08:00
routers.py        [moe] update capacity computing (#5253)   2024-02-07 19:21:02 +08:00
utils.py          [moe] init mixtral impl                   2024-02-07 19:21:02 +08:00
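For context on the routers.py entry above ("update capacity computing"): capacity-constrained MoE routers give each expert a fixed per-batch token budget derived from a capacity factor, and tokens routed beyond that budget are dropped or rerouted. The sketch below shows the commonly used formula; the exact formula, names, and defaults in ColossalAI's routers.py may differ.

```python
import math


def expert_capacity(num_tokens: int, num_experts: int,
                    capacity_factor: float = 1.25, min_capacity: int = 4) -> int:
    """Per-expert token budget for a capacity-constrained MoE router.

    With perfectly balanced routing each expert would receive
    num_tokens / num_experts tokens; the capacity factor adds slack for
    imbalance, and min_capacity keeps small batches from rounding to zero.
    This is the commonly used formula, not necessarily ColossalAI's exact one.
    """
    capacity = math.ceil(num_tokens / num_experts * capacity_factor)
    return max(capacity, min_capacity)


# Example: 4096 tokens routed across 8 experts with 25% slack
print(expert_capacity(4096, 8, capacity_factor=1.25))  # 640
```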