mirror of https://github.com/hpcaitech/ColossalAI
Latest commit:

* fix: add warning for EP different behavior
* fix: use shard_data in ep & tp model
* to: add used_capacity
* fix: fix router test
* feat: add create_ep_node_group
* feat: add create_ep_hierarchical_group fn
* feat: add HierarchicalAllToAll
* test: add hierarchical all2all test
* fix: fix test errors
* fix: simplify create_ep_hierarchical_group
* fix: add hierarchical_alltoall arg
* fix: fix environ typo
* revert: revert process mesh order
* to: add todo mark
* fix: skip hierarchical_comm if torch < 1.13.1
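The commit log above introduces a HierarchicalAllToAll and gates it on torch >= 1.13.1. Below is a minimal, hypothetical sketch of those two ideas using plain `torch.distributed`: a version gate, and a two-phase (intra-node, then inter-node) all-to-all. The function names, the group layout, and the omission of the inter-phase buffer reordering are assumptions for illustration, not ColossalAI's actual implementation.

```python
# Hypothetical sketch, NOT ColossalAI's implementation. It illustrates:
# (1) gating hierarchical comm on torch >= 1.13.1, as in the commit log, and
# (2) the two-phase idea behind a hierarchical all-to-all: exchange among
#     ranks on the same node first (fast links), then across nodes.
from packaging import version

import torch
import torch.distributed as dist


def hierarchical_comm_supported() -> bool:
    # The commit log skips hierarchical_comm when torch < 1.13.1.
    return version.parse(torch.__version__) >= version.parse("1.13.1")


def hierarchical_all_to_all(x: torch.Tensor,
                            intra_node_group: dist.ProcessGroup,
                            inter_node_group: dist.ProcessGroup) -> torch.Tensor:
    # Phase 1: all-to-all among ranks on the same node (high-bandwidth links).
    intra_out = torch.empty_like(x)
    dist.all_to_all_single(intra_out, x, group=intra_node_group)
    # Phase 2: all-to-all across nodes. A real implementation reorders the
    # buffer between phases so each chunk reaches its final owner; that
    # permutation is omitted here for brevity.
    inter_out = torch.empty_like(intra_out)
    dist.all_to_all_single(inter_out, intra_out, group=inter_node_group)
    return inter_out
```

In practice the intra- and inter-node process groups would come from something like the `create_ep_hierarchical_group` mentioned in the log; here they are simply assumed to exist.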
Files in this directory:

* moe_utils.py
* test_grad_handler.py
* test_kernel.py
* test_moe_checkpoint.py
* test_moe_ep_tp.py
* test_moe_group.py
* test_moe_hybrid_zero.py
* test_moe_load_balance.py
* test_moe_router.py
* test_moe_zero_fwd_bwd.py
* test_moe_zero_optim.py