ColossalAI/colossalai/auto_parallel/tensor_shard

Latest commit: ab38aebace by Boyuan Yao
[autoparallel] Hook all meta information on ResNet nodes for auto activation checkpoint (#2248)
* [autoparallel] hook node meta on graph nodes for checkpoint solver

* [autoparallel] polish code

* [autoparallel] restore some node handlers

* colossalai/auto_parallel/passes/meta_info_prop.py

* [autoparallel] remove some unused import

* [autoparallel] hook bwd_mem_out
Committed: 2023-01-02 16:25:18 +08:00
Name                  Last commit                                                                                                      Date
deprecated/           [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/operator_handler.py code style (#1845)  2022-11-09 12:08:47 +08:00
node_handler/         [autoparallel] Hook all meta information on ResNet nodes for auto activation checkpoint (#2248)                  2023-01-02 16:25:18 +08:00
solver/               [autoparallel] new metainfoprop based on metainfo class (#2179)                                                  2022-12-28 13:35:08 +08:00
utils/                [autoparallel] adapt solver with self attention (#2037)                                                          2022-12-01 17:53:15 +08:00
__init__.py           [autoparallel] init new folder structure (#1696)                                                                 2022-10-13 14:18:55 +08:00
constants.py          [autoparallel] adapt solver with self attention (#2037)                                                          2022-12-01 17:53:15 +08:00
initialize.py         [autoparallel] autoparallel initialize (#2238)                                                                   2022-12-31 01:02:14 +08:00
sharding_strategy.py  [autoparallel] memory estimation for shape consistency (#2144)                                                   2022-12-21 10:39:37 +08:00
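The commits above revolve around hooking meta information (e.g. `bwd_mem_out`) onto graph nodes so an activation-checkpoint solver can read per-node memory costs. As a rough illustration of that idea only: the `Node` class, `meta_info_prop` function, and the cost model below are hypothetical stand-ins for torch.fx nodes (which do expose a `node.meta` dict) and ColossalAI's MetaInfoProp pass, not the project's actual API.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List

# Hypothetical stand-in for a torch.fx Node: just a name, its input
# nodes, an output element count, and a `meta` dict to annotate.
@dataclass
class Node:
    name: str
    inputs: List["Node"] = field(default_factory=list)
    numel: int = 0                      # number of elements in the output
    meta: Dict[str, Any] = field(default_factory=dict)

def meta_info_prop(nodes: List[Node], bytes_per_elem: int = 4) -> None:
    """Annotate each node with rough forward/backward memory estimates.

    Deliberately simplified cost model (activation memory = output
    size; bwd_mem_out = gradient of the same size), shown only to
    convey what "hooking meta information on nodes" means.
    """
    for node in nodes:                  # assumes topological order
        fwd_out = node.numel * bytes_per_elem
        node.meta["fwd_mem_out"] = fwd_out   # bytes held by the activation
        node.meta["bwd_mem_out"] = fwd_out   # bytes held by its gradient
        node.meta["fwd_in"] = [n.name for n in node.inputs]

# Tiny two-node graph: an input feeding a conv-like op.
x = Node("x", numel=1024)
conv = Node("conv1", inputs=[x], numel=4096)
meta_info_prop([x, conv])
print(conv.meta["fwd_mem_out"])   # 16384 (4096 elements * 4 bytes)
```

A checkpoint solver would then sum these `meta` entries over candidate recomputation regions instead of re-profiling the model.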