ColossalAI/colossalai/tensor
Hongxin Liu da39d21b71 [moe] support mixtral (#5309)
* [moe] add mixtral block for single expert

* [moe] mixtral block fwd support uneven ep

* [moe] mixtral block bwd support uneven ep

* [moe] add mixtral moe layer

* [moe] simplify replace

* [moe] support save sharded mixtral

* [moe] support load sharded mixtral

* [moe] support save sharded optim

* [moe] integrate moe manager into plugin

* [moe] fix optimizer load

* [moe] fix mixtral layer
2024-02-07 19:21:02 +08:00
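The commit bullets above note that the Mixtral block's forward and backward passes support "uneven ep", i.e. expert parallelism where the number of experts is not evenly divisible by the expert-parallel group size. The following is a minimal, hypothetical sketch of that partitioning idea only; the function name split_experts_uneven and the example sizes are made up for illustration and do not reflect ColossalAI's actual API.

# Hypothetical sketch (not ColossalAI's implementation): assign expert indices
# to expert-parallel (EP) ranks when num_experts % ep_size != 0.
def split_experts_uneven(num_experts: int, ep_size: int) -> list[list[int]]:
    """The first (num_experts % ep_size) ranks each receive one extra expert."""
    base, remainder = divmod(num_experts, ep_size)
    assignment, start = [], 0
    for rank in range(ep_size):
        count = base + (1 if rank < remainder else 0)
        assignment.append(list(range(start, start + count)))
        start += count
    return assignment

if __name__ == "__main__":
    # Mixtral-style config: 8 experts over 3 EP ranks -> local expert counts 3, 3, 2.
    print(split_experts_uneven(8, 3))  # [[0, 1, 2], [3, 4, 5], [6, 7]]

Each rank then owns only its local slice of experts, and the MoE layer routes tokens across ranks accordingly; the sharded checkpoint support mentioned in the later bullets saves and loads only each rank's local experts.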
d_tensor
moe_tensor              [moe] support mixtral (#5309)                              2024-02-07 19:21:02 +08:00
__init__.py
colo_parameter.py       [gemini] fix param op hook when output is tuple (#5355)    2024-02-04 11:58:26 +08:00
colo_tensor.py
comm_spec.py
param_op_hook.py        [gemini] fix param op hook when output is tuple (#5355)    2024-02-04 11:58:26 +08:00
shape_consistency.py
sharding_spec.py
utils.py