mirror of https://github.com/hpcaitech/ColossalAI
da39d21b71
* [moe] add mixtral block for single expert
* [moe] mixtral block fwd support uneven ep
* [moe] mixtral block bwd support uneven ep
* [moe] add mixtral moe layer
* [moe] simplify replace
* [moe] support save sharded mixtral
* [moe] support load sharded mixtral
* [moe] support save sharded optim
* [moe] integrate moe manager into plugin
* [moe] fix optimizer load
* [moe] fix mixtral layer
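The sharded save/load items above are normally driven through ColossalAI's Booster API. The following is a minimal sketch, not code from this commit: the MoE-aware plugin the commit integrates is not shown, and `HybridParallelPlugin`, the toy `MixtralConfig` values, the checkpoint paths, and the optimizer hyperparameters are illustrative assumptions.

```python
# Sketch only: sharded model/optimizer checkpointing via ColossalAI's Booster API.
# The MoE plugin from this commit is not used here; HybridParallelPlugin and the
# tiny MixtralConfig are stand-ins so the example runs without large downloads.
import torch
import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import HybridParallelPlugin
from transformers import MixtralConfig, MixtralForCausalLM

# Requires a torchrun launch; older ColossalAI releases expect launch_from_torch(config={}).
colossalai.launch_from_torch()

# Tiny Mixtral-style MoE model (assumed sizes) instead of pretrained weights.
config = MixtralConfig(
    vocab_size=1000,
    hidden_size=128,
    intermediate_size=256,
    num_hidden_layers=2,
    num_attention_heads=4,
    num_key_value_heads=2,
    num_local_experts=4,
    num_experts_per_tok=2,
)
model = MixtralForCausalLM(config)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

plugin = HybridParallelPlugin(tp_size=1, pp_size=1, zero_stage=1)
booster = Booster(plugin=plugin)
model, optimizer, *_ = booster.boost(model, optimizer)

# Save weights and optimizer state as sharded checkpoints.
booster.save_model(model, "ckpt/model", shard=True)
booster.save_optimizer(optimizer, "ckpt/optimizer", shard=True)

# Restore from the sharded checkpoints, e.g. to resume training.
booster.load_model(model, "ckpt/model")
booster.load_optimizer(optimizer, "ckpt/optimizer")
```

With `shard=True` the checkpoint is written as multiple shard files plus an index rather than a single state dict, which keeps per-file size bounded for large MoE weights and optimizer states.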
Directory contents:

* d_tensor/
* moe_tensor/
* __init__.py
* colo_parameter.py
* colo_tensor.py
* comm_spec.py
* param_op_hook.py
* shape_consistency.py
* sharding_spec.py
* utils.py