ColossalAI/colossalai/zero
Latest commit: dd92b90a68 by ver217 (2022-04-19 16:16:48 +08:00)
[DO NOT MERGE] [zero] init fp16 params directly in ZeroInitContext (#808)
* init fp16 param directly
* polish code
init_ctx        [DO NOT MERGE] [zero] init fp16 params directly in ZeroInitContext (#808)   2022-04-19 16:16:48 +08:00
shard_utils     Revert "[zero] add ZeroTensorShardStrategy (#793)" (#806)                    2022-04-19 14:40:02 +08:00
sharded_model   [refactor] moving memtracer to gemini (#801)                                 2022-04-19 10:13:08 +08:00
sharded_optim   [refactor] moving memtracer to gemini (#801)                                 2022-04-19 10:13:08 +08:00
sharded_param   [hotfix] fix memory leak in zero (#781)                                      2022-04-18 13:57:03 +08:00
utils           [refactor] moving memtracer to gemini (#801)                                 2022-04-19 10:13:08 +08:00
__init__.py     [refactor] remove old zero code (#517)                                       2022-03-25 14:54:39 +08:00