ColossalAI/colossalai/checkpoint_io
Hongxin Liu da39d21b71 [moe] support mixtral (#5309)
* [moe] add mixtral block for single expert

* [moe] mixtral block fwd support uneven ep

* [moe] mixtral block bwd support uneven ep

* [moe] add mixtral moe layer

* [moe] simplify replace

* [moe] support save sharded mixtral

* [moe] support load sharded mixtral

* [moe] support save sharded optim

* [moe] integrate moe manager into plugin

* [moe] fix optimizer load

* [moe] fix mixtral layer
2024-02-07 19:21:02 +08:00
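
The commit above adds sharded save/load of Mixtral (MoE) checkpoints, backed by the checkpoint IO in this directory and wired into the booster plugin. A minimal sketch of that workflow follows; it must be launched with torchrun, and the plugin name (MoeHybridParallelPlugin), its ep_size argument, and direct boosting of the Hugging Face Mixtral classes are assumptions inferred from the commit messages rather than verified against this exact revision.

```python
# Run with e.g.: torchrun --nproc_per_node 4 this_script.py
import colossalai
import torch
from colossalai.booster import Booster
from colossalai.booster.plugin import MoeHybridParallelPlugin  # assumed plugin name
from transformers import MixtralConfig, MixtralForCausalLM

colossalai.launch_from_torch(config={})  # config dict was still required at this point in time

# A deliberately tiny Mixtral so the sketch is cheap to run.
config = MixtralConfig(
    vocab_size=1024, hidden_size=64, intermediate_size=128, num_hidden_layers=2,
    num_attention_heads=4, num_key_value_heads=4,
    num_local_experts=4, num_experts_per_tok=2,
)
model = MixtralForCausalLM(config)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

plugin = MoeHybridParallelPlugin(tp_size=1, pp_size=1, ep_size=2)  # argument names assumed
booster = Booster(plugin=plugin)
model, optimizer, *_ = booster.boost(model, optimizer)

# Sharded save: weight and optimizer-state shards are written alongside an index file.
booster.save_model(model, "ckpt/model", shard=True, size_per_shard=1024)
booster.save_optimizer(optimizer, "ckpt/optim", shard=True)

# Loading reverses the process on a boosted model/optimizer.
booster.load_model(model, "ckpt/model")
booster.load_optimizer(optimizer, "ckpt/optim")
```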
__init__.py [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
checkpoint_io_base.py [moe] support mixtral (#5309) 2024-02-07 19:21:02 +08:00
general_checkpoint_io.py [shardformer] fix master param sync for hybrid plugin/rewrite unwrapping logic (#4758) 2023-09-20 18:29:37 +08:00
hybrid_parallel_checkpoint_io.py [checkpointio] fix gemini and hybrid parallel optim checkpoint (#5347) 2024-02-01 16:13:06 +08:00
index_file.py [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
utils.py [shardformer] Fix serialization error with Tensor Parallel state saving (#5018) 2023-11-09 17:00:25 +08:00
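
For reference, the non-distributed entry point to this directory is GeneralCheckpointIO (with the abstract interface in checkpoint_io_base.py and the sharding index handled by index_file.py). A minimal single-process sketch is shown below; the exact argument names of save_model/load_model are assumptions about this revision's public API.

```python
import os

import torch.nn as nn
from colossalai.checkpoint_io import GeneralCheckpointIO

model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 8))
ckpt_io = GeneralCheckpointIO()

# Sharded save: writes weight shards plus an index file into the directory.
os.makedirs("ckpt", exist_ok=True)
ckpt_io.save_model(model, "ckpt", shard=True, size_per_shard=1024)

# Load back into a freshly built model; the index file is discovered from the directory.
new_model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 8))
ckpt_io.load_model(new_model, "ckpt")
```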