ColossalAI/colossalai/zero
Jiarui Fang 3af13a2c3e [zero] polish ShardedOptimV2 unittest (#385)
* place params on CPU after zero init context
* polish code
* bucketed CPU-GPU tensor transfer
* found a bug in the sharded optim unittest
* add an offload unittest for ShardedOptimV2
* polish code and make it more robust
2022-03-11 15:50:28 +08:00
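The "bucketed CPU-GPU tensor transfer" bullet refers to a standard optimization: instead of issuing one host-device copy per parameter shard, many small tensors are packed into a larger flat buffer and moved in a single transfer, amortizing per-copy launch overhead. Below is a minimal sketch of the general technique in PyTorch; `bucketed_copy` and `_flush` are hypothetical names for illustration, not ColossalAI's actual implementation.

```python
import torch

def bucketed_copy(tensors, device, bucket_size_mb=32):
    """Copy a list of tensors to `device` by packing them into
    flat buckets instead of issuing one transfer per tensor.
    Assumes all tensors share a dtype. Hypothetical helper."""
    bucket_cap = bucket_size_mb * 1024 * 1024
    out, bucket, bucket_bytes = [], [], 0
    for t in tensors:
        bucket.append(t)
        bucket_bytes += t.numel() * t.element_size()
        if bucket_bytes >= bucket_cap:
            out.extend(_flush(bucket, device))
            bucket, bucket_bytes = [], 0
    if bucket:
        out.extend(_flush(bucket, device))
    return out

def _flush(bucket, device):
    # Pack the bucket into one contiguous buffer, move it in a
    # single copy, then split it back into per-tensor views.
    flat = torch.cat([t.reshape(-1) for t in bucket]).to(device, non_blocking=True)
    views, offset = [], 0
    for t in bucket:
        n = t.numel()
        views.append(flat[offset:offset + n].view_as(t))
        offset += n
    return views
```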
init_ctx        [bug] shard params while initializing ShardedModelV2 (#381)          2022-03-11 15:50:28 +08:00
shard_utils     [zero] able to place params on cpu after zero init context (#365)    2022-03-11 15:50:28 +08:00
sharded_model   [zero] polish ShardedOptimV2 unittest (#385)                         2022-03-11 15:50:28 +08:00
sharded_optim   [zero] polish ShardedOptimV2 unittest (#385)                         2022-03-11 15:50:28 +08:00
sharded_param   [zero] find missing code (#378)                                      2022-03-11 15:50:28 +08:00
__init__.py     added buffer sync to naive amp model wrapper (#291)                  2022-03-11 15:50:28 +08:00
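Taken together, the directory names suggest the intended flow: parameters are sharded as the model is built (init_ctx), wrapped for sharded execution (sharded_model), and paired with a sharded optimizer that can offload its state to CPU (sharded_optim). The sketch below shows how these pieces might be wired up; the import paths follow the directory names above, but every constructor signature and argument name (including `cpu_offload`) is an assumption, not a verified ColossalAI API.

```python
import torch
import torch.nn as nn
from torch.optim import Adam

# Assumed import paths, inferred from the directory listing above.
from colossalai.zero.init_ctx import ZeroInitContext
from colossalai.zero.shard_utils import TensorShardStrategy
from colossalai.zero.sharded_model import ShardedModelV2
from colossalai.zero.sharded_optim import ShardedOptimV2

shard_strategy = TensorShardStrategy()

# Shard parameters while the model is being built, and place the
# shards on CPU after the context exits (cf. #381 and #365).
with ZeroInitContext(target_device=torch.device('cpu'),
                     shard_strategy=shard_strategy,
                     shard_param=True):
    model = nn.Linear(1024, 1024)

model = ShardedModelV2(model, shard_strategy)

# cpu_offload would exercise the offload path that the
# ShardedOptimV2 unittest in #385 covers (argument name assumed).
optimizer = ShardedOptimV2(model, Adam(model.parameters(), lr=1e-3),
                           cpu_offload=True)
```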