ColossalAI/colossalai/zero

Latest commit: c6a1a62636 by HELSON, 2022-11-02 16:11:34 +08:00
[hotfix] fix zero's incompatibility with checkpoint in torch-1.12 (#1786)

* [hotfix] fix zero's incompatibility with checkpoint in torch-1.12
* [zero] add cpu shard init
* [zero] add tiny example test
* [colo_tensor] fix bugs for torch-1.11
init_ctx           [moe] fix MoE bugs (#1628)                                                               2022-09-22 13:56:30 +08:00
shard_utils        [gemini] add GeminiMemoryManger (#832)                                                   2022-04-24 13:08:48 +08:00
sharded_model      [NFC] polish colossalai/zero/sharded_model/reduce_scatter.py code style (#1554)          2022-09-08 22:11:04 +08:00
sharded_optim      fix move fp32 shards (#1604)                                                             2022-09-16 17:33:16 +08:00
sharded_param      [NFC] polish colossalai/zero/sharded_param/__init__.py code style (#1717)                2022-10-19 12:20:51 +08:00
utils              [zero] add constant placement policy (#1705)                                             2022-10-14 17:53:16 +08:00
__init__.py        [zero] add zero optimizer for ColoTensor (#1046)                                         2022-06-02 12:13:15 +08:00
zero_optimizer.py  [hotfix] fix zero's incompatibility with checkpoint in torch-1.12 (#1786)                2022-11-02 16:11:34 +08:00