ColossalAI/applications/ColossalMoE/tests
Latest commit da39d21b71 by Hongxin Liu: [moe] support mixtral (#5309)
* [moe] add mixtral block for single expert

* [moe] mixtral block fwd support uneven ep

* [moe] mixtral block bwd support uneven ep

* [moe] add mixtral moe layer

* [moe] simplify replace

* [moe] support save sharded mixtral

* [moe] support load sharded mixtral

* [moe] support save sharded optim

* [moe] integrate moe manager into plugin

* [moe] fix optimizer load

* [moe] fix mixtral layer
Committed: 2024-02-07 19:21:02 +08:00
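The log above mentions forward and backward support for "uneven ep", i.e. an expert-parallel group whose size does not evenly divide the number of Mixtral experts. The snippet below is a minimal illustrative sketch of that partitioning idea only; it does not use ColossalAI's actual API, and all names in it are hypothetical.

```python
# Sketch of uneven expert-parallel (EP) partitioning: experts of a Mixtral
# MoE block are split across EP ranks, and the split need not be even.
# Hypothetical helper for illustration; not ColossalAI's implementation.

def split_experts_unevenly(num_experts: int, ep_size: int) -> list[int]:
    """Return how many local experts each EP rank holds.

    Lower-numbered ranks absorb the remainder, so the expert count does not
    have to be divisible by the EP group size.
    """
    base, remainder = divmod(num_experts, ep_size)
    return [base + (1 if rank < remainder else 0) for rank in range(ep_size)]


if __name__ == "__main__":
    # Mixtral-8x7B has 8 experts per MoE layer; with an EP group of 3 ranks
    # the assignment is uneven: ranks 0 and 1 hold 3 experts, rank 2 holds 2.
    print(split_experts_unevenly(num_experts=8, ep_size=3))  # [3, 3, 2]
```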
__init__.py [moe] init mixtral impl 2024-02-07 19:21:02 +08:00
test_mixtral_layer.py [moe] support mixtral (#5309) 2024-02-07 19:21:02 +08:00
test_moe_checkpoint.py [moe] support mixtral (#5309) 2024-02-07 19:21:02 +08:00