ColossalAI/colossalai/moe

Latest commit: 3c08f17348 by Wenhao Chen — [hotfix]: modify create_ep_hierarchical_group and add test (#5032), 1 year ago
__init__.py      — [moe] support optimizer checkpoint (#5015), 1 year ago
_operation.py    — [hotfix]: modify create_ep_hierarchical_group and add test (#5032), 1 year ago
checkpoint.py    — [moe] support optimizer checkpoint (#5015), 1 year ago
experts.py       — [moe] support optimizer checkpoint (#5015), 1 year ago
layers.py        — [hotfix]: modify create_ep_hierarchical_group and add test (#5032), 1 year ago
load_balance.py  — [moe] merge moe into main (#4978), 1 year ago
loss.py          — [moe] merge moe into main (#4978), 1 year ago
manager.py       — [moe] support optimizer checkpoint (#5015), 1 year ago
routers.py       — [moe]: fix ep/tp tests, add hierarchical all2all (#4982), 1 year ago
utils.py         — [hotfix]: modify create_ep_hierarchical_group and add test (#5032), 1 year ago