ColossalAI/colossalai/zero
Hongxin Liu da39d21b71 [moe] support mixtral (#5309)
* [moe] add mixtral block for single expert
* [moe] mixtral block fwd support uneven ep
* [moe] mixtral block bwd support uneven ep
* [moe] add mixtral moe layer
* [moe] simplify replace
* [moe] support save sharded mixtral
* [moe] support load sharded mixtral
* [moe] support save sharded optim
* [moe] integrate moe manager into plugin
* [moe] fix optimizer load
* [moe] fix mixtral layer
2024-02-07 19:21:02 +08:00
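
The commit above adds sharded save/load for the Mixtral MoE model and its optimizer states. As a rough illustration of what sharded checkpointing looks like from the user side, here is a minimal sketch using the Booster checkpoint I/O. The plugin choice, paths, and toy model are placeholders, not the MoE-specific setup from this commit, and argument names may differ between ColossalAI releases.

```python
# Hedged sketch: sharded checkpoint save/load through the Booster API.
# Assumes a torchrun launch and a CUDA device; the plugin and model here
# are stand-ins, not the MoE plugin added by this commit.
import torch
import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import LowLevelZeroPlugin
from colossalai.nn.optimizer import HybridAdam

colossalai.launch_from_torch(config={})          # expects torchrun to set up ranks

model = torch.nn.Linear(512, 512)                # stand-in for a Mixtral-style model
optimizer = HybridAdam(model.parameters(), lr=1e-3)

booster = Booster(plugin=LowLevelZeroPlugin(stage=2))
model, optimizer, *_ = booster.boost(model, optimizer)

# Sharded save: the checkpoint is split across multiple files under the directory.
booster.save_model(model, "ckpt/model", shard=True)
booster.save_optimizer(optimizer, "ckpt/optim", shard=True)

# Sharded load: shards are read back and redistributed to the current ranks.
booster.load_model(model, "ckpt/model")
booster.load_optimizer(optimizer, "ckpt/optim")
```

With `shard=True`, each file typically holds only part of the state dict plus an index describing the layout, which is what allows large MoE models like Mixtral to be checkpointed without gathering everything on one rank.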
gemini [checkpointio] fix gemini and hybrid parallel optim checkpoint (#5347) 2024-02-01 16:13:06 +08:00
low_level [moe] support mixtral (#5309) 2024-02-07 19:21:02 +08:00
__init__.py [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088) 2023-11-28 16:54:42 +08:00
wrapper.py [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
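
The gemini and low_level subdirectories hold the two ZeRO implementations (chunk-based Gemini for stage-3-style sharding, and the low-level sharded optimizer for stages 1/2), while wrapper.py appears to expose convenience wrappers around them. Below is a minimal sketch of how those wrappers might be used; the names zero_model_wrapper / zero_optim_wrapper and the zero_stage parameter are assumptions based on this layout and may not match every release.

```python
# Hedged sketch, not verbatim API: function names and the zero_stage argument
# are assumptions; check the installed ColossalAI version. Requires torchrun
# and a CUDA device.
import torch
import colossalai
from colossalai.nn.optimizer import HybridAdam
from colossalai.zero import zero_model_wrapper, zero_optim_wrapper

colossalai.launch_from_torch(config={})

model = torch.nn.Linear(1024, 1024).cuda()

# Stages 1/2 are assumed to route to the low_level sharded optimizer,
# stage 3 to Gemini's chunk-based memory management.
model = zero_model_wrapper(model, zero_stage=2)
optimizer = zero_optim_wrapper(model, HybridAdam(model.parameters(), lr=1e-3))

# Training step: ZeRO partitions optimizer states (and, at stage 2, gradients)
# across data-parallel ranks to reduce per-GPU memory.
loss = model(torch.randn(8, 1024, device="cuda")).sum()
optimizer.backward(loss)   # the ZeRO optimizer handles scaling and reduction
optimizer.step()
optimizer.zero_grad()
```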