ColossalAI/colossalai/booster/plugin
Latest commit: dc003c304c "[moe] merge moe into main (#4978)" by Xuanlei Zhao, 2023-11-02
* update moe module
* support openmoe
__init__.py [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
dp_plugin_base.py [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
gemini_plugin.py [hotfix] fix the bug of repeatedly storing param group (#4951) 2023-10-31 14:48:01 +08:00
hybrid_parallel_plugin.py [feature] support no master weights option for low level zero plugin (#4816) 2023-10-13 07:57:45 +00:00
low_level_zero_plugin.py [hotfix] fix the bug of repeatedly storing param group (#4951) 2023-10-31 14:48:01 +08:00
moe_hybrid_parallel_plugin.py [moe] merge moe into main (#4978) 2023-11-02 02:21:24 +00:00
plugin_base.py [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
pp_plugin_base.py [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
torch_ddp_plugin.py [doc] polish shardformer doc (#4779) 2023-09-26 10:57:47 +08:00
torch_fsdp_plugin.py [doc] polish shardformer doc (#4779) 2023-09-26 10:57:47 +08:00
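These files implement the plugin classes consumed by ColossalAI's Booster API (e.g. GeminiPlugin, HybridParallelPlugin, LowLevelZeroPlugin, MoeHybridParallelPlugin, TorchDDPPlugin, TorchFSDPPlugin). Below is a minimal sketch of wiring one plugin into a training step, assuming the documented Booster interface of this release (launch_from_torch, Booster.boost, Booster.backward) and a toy model/optimizer chosen only for illustration:

```python
# Minimal sketch: boosting a PyTorch training setup with TorchDDPPlugin.
# Assumes ColossalAI's Booster API as of late 2023; launch the script with `torchrun`.
import torch
import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin

colossalai.launch_from_torch(config={})  # initialize the distributed environment

# Toy model/optimizer purely for illustration.
model = torch.nn.Linear(128, 10).cuda()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
criterion = torch.nn.CrossEntropyLoss()

plugin = TorchDDPPlugin()  # any plugin from this directory could be swapped in here
booster = Booster(plugin=plugin)

# boost() wraps the objects so that backward, checkpointing, etc. are routed through the plugin.
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion=criterion)

x = torch.randn(32, 128, device="cuda")
y = torch.randint(0, 10, (32,), device="cuda")
loss = criterion(model(x), y)
booster.backward(loss, optimizer)  # plugin-aware replacement for loss.backward()
optimizer.step()
optimizer.zero_grad()
```

The same skeleton applies to the other plugins; only the plugin construction (and its distributed-memory or parallelism options) changes.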