mirror of https://github.com/hpcaitech/ColossalAI
Commit log:

* fix: add warning for EP different behavior
* fix: use shard_data in ep & tp model
* to: add used_capacity
* fix: fix router test
* feat: add create_ep_node_group
* feat: add create_ep_hierarchical_group fn
* feat: add HierarchicalAllToAll
* test: add hierarchical all2all test
* fix: fix test errors
* fix: simplify create_ep_hierarchical_group
* fix: add hierarchical_alltoall arg
* fix: fix environ typo
* revert: revert process mesh order
* to: add todo mark
* fix: skip hierarchical_comm if torch < 1.13.1
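The commit log above references a `create_ep_hierarchical_group` helper and a `HierarchicalAllToAll` op, which replace one flat all-to-all across all expert-parallel ranks with a two-hop exchange: tokens first move inside each node over the fast intra-node links, and only aggregated traffic crosses the slower inter-node network. Below is a minimal sketch of that idea, assuming a `torch.distributed` backend; the function names, group layout, and shapes are illustrative assumptions, not the repository's actual API.

```python
import torch
import torch.distributed as dist


def create_ep_hierarchical_groups(gpus_per_node: int):
    """Build one intra-node group per node and one inter-node group per
    local-rank "column". Every rank must enter every dist.new_group() call,
    even for groups it does not belong to."""
    world_size = dist.get_world_size()
    rank = dist.get_rank()
    intra_group = inter_group = None
    for node in range(world_size // gpus_per_node):
        ranks = list(range(node * gpus_per_node, (node + 1) * gpus_per_node))
        group = dist.new_group(ranks)
        if rank in ranks:
            intra_group = group
    for local_rank in range(gpus_per_node):
        ranks = list(range(local_rank, world_size, gpus_per_node))
        group = dist.new_group(ranks)
        if rank in ranks:
            inter_group = group
    return intra_group, inter_group


def hierarchical_all_to_all(tokens: torch.Tensor, intra_group, inter_group):
    """Exchange `tokens` in two hops: first inside the node (fast links),
    then across nodes along the same-local-rank group. A production
    implementation would also permute chunks between the two hops so that
    every (source, destination) pair is routed correctly; that step is
    omitted here to keep the sketch short."""
    staged = torch.empty_like(tokens)
    dist.all_to_all_single(staged, tokens, group=intra_group)  # intra-node hop
    out = torch.empty_like(tokens)
    dist.all_to_all_single(out, staged, group=inter_group)     # inter-node hop
    return out
```

Note that, per the commit log, the repository skips the hierarchical communication path entirely on torch versions older than 1.13.1.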
Files:

* __init__.py
* convert_openmoe_ckpt.py
* convert_openmoe_ckpt.sh
* modeling_openmoe.py
* openmoe_8b_config.json
* openmoe_base_config.json
* openmoe_policy.py