ColossalAI/colossalai/booster/plugin

Latest commit: f71e63b0f3 [moe] support optimizer checkpoint (#5015) by Xuanlei Zhao, 1 year ago
| File | Last commit | Last updated |
|------|-------------|--------------|
| __init__.py | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| dp_plugin_base.py | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| gemini_plugin.py | [hotfix] fix the bug of repeatedly storing param group (#4951) | 1 year ago |
| hybrid_parallel_plugin.py | [hotfix] Add layer norm gradients all-reduce for sequence parallel (#4926) | 1 year ago |
| low_level_zero_plugin.py | [hotfix] fix the bug of repeatedly storing param group (#4951) | 1 year ago |
| moe_hybrid_parallel_plugin.py | [moe] support optimizer checkpoint (#5015) | 1 year ago |
| plugin_base.py | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| pp_plugin_base.py | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| torch_ddp_plugin.py | [doc] polish shardformer doc (#4779) | 1 year ago |
| torch_fsdp_plugin.py | [doc] polish shardformer doc (#4779) | 1 year ago |
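The files above implement the plugins behind ColossalAI's `Booster` API (DDP, FSDP, Gemini, low-level ZeRO, and the hybrid/MoE parallel variants). As orientation, below is a minimal sketch of how one of these plugins is typically wired into a training script; it assumes the public `colossalai.booster` interface with `TorchDDPPlugin`, and exact launch arguments and method signatures may differ between releases.

```python
# Minimal sketch (assumed API; details may vary across ColossalAI versions):
# construct a plugin from this directory, hand it to the Booster, and let the
# Booster wrap the model, optimizer, and criterion for distributed training.
import torch
import torch.nn as nn

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin  # or GeminiPlugin, TorchFSDPPlugin, ...

colossalai.launch_from_torch(config={})  # assumes the script is started via `torchrun`

model = nn.Linear(16, 2).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

plugin = TorchDDPPlugin()
booster = Booster(plugin=plugin)
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

# One illustrative training step on random data.
inputs = torch.randn(8, 16).cuda()
labels = torch.randint(0, 2, (8,)).cuda()
loss = criterion(model(inputs), labels)
booster.backward(loss, optimizer)
optimizer.step()
optimizer.zero_grad()

# Checkpointing also goes through the booster (cf. the MoE optimizer
# checkpoint support added in #5015).
booster.save_model(model, "model.pt")
booster.save_optimizer(optimizer, "optimizer.pt")
```

Swapping `TorchDDPPlugin` for another plugin from this directory (for example `GeminiPlugin` or `LowLevelZeroPlugin`) changes the parallelism strategy without changing the rest of the training loop, which is the point of the plugin abstraction.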