ColossalAI/colossalai/zero

Latest commit: f7f2248771 by HELSON, 2022-09-22 13:56:30 +08:00
[moe] fix MoE bugs (#1628)
* remove forced FP32 modules
* correct no_shard-contexts' positions
init_ctx/          [moe] fix MoE bugs (#1628)                                                       2022-09-22 13:56:30 +08:00
shard_utils/       [gemini] add GeminiMemoryManger (#832)                                           2022-04-24 13:08:48 +08:00
sharded_model/     [NFC] polish colossalai/zero/sharded_model/reduce_scatter.py code style (#1554)  2022-09-08 22:11:04 +08:00
sharded_optim/     fix move fp32 shards (#1604)                                                     2022-09-16 17:33:16 +08:00
sharded_param/     [gemini] add GeminiMemoryManger (#832)                                           2022-04-24 13:08:48 +08:00
utils/             [hotfix] remove potiential circle import (#1307)                                 2022-07-14 13:44:26 +08:00
__init__.py        [zero] add zero optimizer for ColoTensor (#1046)                                 2022-06-02 12:13:15 +08:00
zero_optimizer.py  [utils] Impl clip_grad_norm for ColoTensor and ZeroOptimizer (#1442)             2022-08-11 22:58:58 +08:00