ColossalAI/colossalai/engine/ophooks

Latest commit: e6d50ec107 by HELSON, 2022-03-31 18:34:11 +08:00
[zero] adapt zero for unsharded parameters (#561)
* support existing sharded and unsharded parameters in zero
* add unit test for moe-zero model init
* polish moe gradient handler
__init__.py             [zero] adapt zero for unsharded parameters (#561)                        2022-03-31 18:34:11 +08:00
_base_ophook.py         add pytorch hooks (#179)                                                 2022-01-25 22:20:54 +08:00
_memtracer_ophook.py    Refactored docstring to google style                                     2022-03-29 17:17:47 +08:00
_shard_grad_ophook.py   [zero] add sharded grad and refactor grad hooks for ShardedModel (#287)  2022-03-11 15:50:28 +08:00
_shard_param_ophook.py  fix sharded param hook and unit test                                     2022-03-11 15:50:28 +08:00
zero_hook.py            [polish] rename col_attr -> colo_attr (#558)                             2022-03-31 12:25:45 +08:00
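The files above implement per-operator hooks: callbacks that run before and after each module's forward pass, used here for memory tracing and for sharded-parameter/gradient bookkeeping in ZeRO. A minimal sketch of the general pattern, built only on standard PyTorch `register_forward_pre_hook` / `register_forward_hook` (the names `MemTracerHook` and `register_ophook` are hypothetical illustrations, not ColossalAI's actual API):

```python
import torch
import torch.nn as nn

class MemTracerHook:
    """Hypothetical op-hook: records an event before and after each module's forward."""

    def __init__(self):
        self.records = []

    def pre_fwd(self, module, inputs):
        # Called before module.forward; a real memory tracer would snapshot
        # torch.cuda.memory_allocated() here.
        self.records.append(("pre", type(module).__name__))

    def post_fwd(self, module, inputs, output):
        # Called after module.forward with the produced output.
        self.records.append(("post", type(module).__name__))

def register_ophook(model, hook):
    """Attach the hook's pre/post callbacks to every submodule; return the handles."""
    handles = []
    for m in model.modules():
        handles.append(m.register_forward_pre_hook(hook.pre_fwd))
        handles.append(m.register_forward_hook(hook.post_fwd))
    return handles

model = nn.Sequential(nn.Linear(4, 4), nn.ReLU())
hook = MemTracerHook()
handles = register_ophook(model, hook)
model(torch.randn(2, 4))
# hook.records now holds a pre/post pair per module traversed
# (Sequential container plus its two children).
```

Removing the returned handles (`h.remove()`) detaches the hook, which is how such tracers are typically torn down after profiling.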