ColossalAI/colossalai/zero
Latest commit: 0bebda6ea5 by Jiarui Fang, "[zero] fix init device bug in zero init context unittest (#516)", 3 years ago
init_ctx         [zero] fix init device bug in zero init context unittest (#516)            3 years ago
shard_utils      [zero] fix init device bug in zero init context unittest (#516)            3 years ago
sharded_model    [zero] show model data cuda memory usage after zero context init. (#515)   3 years ago
sharded_optim    [zero] use colo model data api in optimv2 (#511)                           3 years ago
sharded_param    [zero] sharded model support the reuse of fp16 shard (#495)                3 years ago
__init__.py      [zero] Update initialize for ZeRO (#458)                                   3 years ago
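The submodules roughly divide the ZeRO implementation as follows: shard_utils provides the shard strategies, init_ctx shards parameters as the model is constructed, and sharded_model / sharded_optim wrap the model and optimizer for sharded training. A minimal usage sketch follows, assuming the ZeroInitContext and TensorShardStrategy names exposed by this era of the package; the exact constructor keywords (target_device, shard_strategy, shard_param) are assumptions and may differ between versions.

```python
import torch
import torch.nn as nn

# Assumed import paths and constructor arguments for this era of ColossalAI;
# treat them as a sketch, not a definitive API reference.
from colossalai.zero.init_ctx import ZeroInitContext
from colossalai.zero.shard_utils import TensorShardStrategy

# Strategy from shard_utils that decides how each parameter tensor is split
# across data-parallel ranks.
shard_strategy = TensorShardStrategy()

# Parameters created inside the context are sharded at initialization time,
# so the full fp32 model never has to be materialized on a single device.
with ZeroInitContext(target_device=torch.device('cuda'),
                     shard_strategy=shard_strategy,
                     shard_param=True):
    model = nn.Linear(1024, 1024)
```

The sharded model and optimizer wrappers in sharded_model and sharded_optim would then consume the model built under this context, but their constructor signatures vary across versions and are omitted here.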