ColossalAI/colossalai/moe
Latest commit: 3c08f17348 by Wenhao Chen, 2023-11-17 10:53:00 +08:00
[hotfix]: modify create_ep_hierarchical_group and add test (#5032)
* feat: modify create_ep_hierarchical_group args
* test: add ep tests
* fix: remove get_process_group_ranks
* fix: fix src_rank
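The hotfix above changes create_ep_hierarchical_group to take the EP ranks directly instead of deriving them via get_process_group_ranks. As a rough illustration of the underlying idea, not ColossalAI's actual code, the sketch below builds one intra-node subgroup per node plus an inter-node "leaders" subgroup from a list of EP ranks, using only plain torch.distributed; the function name, arguments, and node layout (contiguous global ranks per node) are all assumptions.

```python
# Hypothetical sketch of hierarchical EP group creation; names and layout
# are assumptions, not ColossalAI's implementation.
from typing import List, Optional, Tuple

import torch.distributed as dist


def build_hierarchical_ep_groups(
    ep_ranks: List[int], nproc_per_node: int
) -> Tuple[Optional[dist.ProcessGroup], Optional[dist.ProcessGroup]]:
    """Split EP ranks into intra-node and inter-node subgroups.

    dist.new_group() is collective over the default group, so every
    process must call this function with identical arguments.
    """
    rank = dist.get_rank()
    intra_group = None
    inter_group = None

    # One intra-node group per node: EP ranks that share a node
    # (assuming contiguous global ranks within a node).
    nodes = sorted({r // nproc_per_node for r in ep_ranks})
    for node in nodes:
        ranks = [r for r in ep_ranks if r // nproc_per_node == node]
        group = dist.new_group(ranks)
        if rank in ranks:
            intra_group = group

    # One inter-node group linking a single "leader" rank per node.
    leaders = [min(r for r in ep_ranks if r // nproc_per_node == node) for node in nodes]
    group = dist.new_group(leaders)
    if rank in leaders:
        inter_group = group

    return intra_group, inter_group
```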
Name               Last commit                                                           Last updated
__init__.py        [moe] support optimizer checkpoint (#5015)                            2023-11-08 15:07:03 +00:00
_operation.py      [hotfix]: modify create_ep_hierarchical_group and add test (#5032)    2023-11-17 10:53:00 +08:00
checkpoint.py      [moe] support optimizer checkpoint (#5015)                            2023-11-08 15:07:03 +00:00
experts.py         [moe] support optimizer checkpoint (#5015)                            2023-11-08 15:07:03 +00:00
layers.py          [hotfix]: modify create_ep_hierarchical_group and add test (#5032)    2023-11-17 10:53:00 +08:00
load_balance.py    [moe] merge moe into main (#4978)                                     2023-11-02 02:21:24 +00:00
loss.py            [moe] merge moe into main (#4978)                                     2023-11-02 02:21:24 +00:00
manager.py         [moe] support optimizer checkpoint (#5015)                            2023-11-08 15:07:03 +00:00
routers.py         [moe]: fix ep/tp tests, add hierarchical all2all (#4982)              2023-11-09 06:31:00 +00:00
utils.py           [hotfix]: modify create_ep_hierarchical_group and add test (#5032)    2023-11-17 10:53:00 +08:00
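The hierarchical all2all added in #4982 (routers.py) and the src_rank fix in #5032 both concern the same communication pattern: gather tokens onto one leader rank per node, run the expensive exchange only between leaders, then scatter results back inside each node. Below is a minimal sketch of that pattern, assuming uniform token counts per rank and reusing the hypothetical group handles from the sketch above; note that leader_rank must be the leader's global rank, which is the kind of src/dst bookkeeping a src_rank fix addresses.

```python
# Hypothetical sketch of a hierarchical all-to-all; not ColossalAI's code.
from typing import Optional

import torch
import torch.distributed as dist


def hierarchical_all_to_all(
    tokens: torch.Tensor,
    intra_group: dist.ProcessGroup,
    inter_group: Optional[dist.ProcessGroup],
    leader_rank: int,  # GLOBAL rank of this node's leader
) -> torch.Tensor:
    rank = dist.get_rank()
    intra_size = dist.get_world_size(intra_group)

    # Step 1: intra-node gather onto the leader. dist.gather expects
    # gather_list only on the destination rank, and dst is a global rank.
    gather_list = (
        [torch.empty_like(tokens) for _ in range(intra_size)]
        if rank == leader_rank
        else None
    )
    dist.gather(tokens, gather_list, dst=leader_rank, group=intra_group)

    # Step 2: inter-node all-to-all between leaders only, so the slow
    # cross-node link is crossed once per node pair instead of per rank pair.
    if rank == leader_rank:
        staged = torch.cat(gather_list, dim=0)
        exchanged = torch.empty_like(staged)
        dist.all_to_all_single(exchanged, staged, group=inter_group)
        scatter_list = list(exchanged.chunk(intra_size, dim=0))
    else:
        scatter_list = None

    # Step 3: intra-node scatter of exchanged tokens back from the leader.
    out = torch.empty_like(tokens)
    dist.scatter(out, scatter_list, src=leader_rank, group=intra_group)
    return out
```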