ColossalAI/colossalai/fx/passes
Latest commit: d5c5bc219e by Boyuan Yao, 2022-11-11 23:17:25 +08:00
[SC] add GPT example for auto checkpoint (#1889)
* [sc] SC tutorial for auto checkpoint
* [sc] polish examples
* [sc] polish readme
* [sc] polish readme and help information
Name | Last commit message | Last commit date
---- | ------------------- | ----------------
algorithms | [fx] refactor memory utils and extend shard utils. (#1754) | 2022-10-26 14:24:41 +08:00
experimental | [autoparallel] refactor the runtime apply pass and add docstring to passes (#1757) | 2022-10-25 14:32:22 +08:00
__init__.py | [fx] metainfo_trace as an API. (#1873) | 2022-11-10 20:58:37 +08:00
adding_split_node_pass.py | [fx] support module with bias addition (#1780) | 2022-11-01 22:53:51 +08:00
concrete_info_prop.py | [fx] refactor memory utils and extend shard utils. (#1754) | 2022-10-26 14:24:41 +08:00
meta_info_prop.py | [SC] add GPT example for auto checkpoint (#1889) | 2022-11-11 23:17:25 +08:00
passes_for_gpt2_test.py | [hotfix] fix some bugs during gpt2 testing (#1379) | 2022-07-28 17:21:07 +08:00
shard_1d_pass.py | [Doc] add more doc for ColoTensor. (#1458) | 2022-08-16 10:38:41 +08:00
split_module.py | [fx] fixed compatiblity issue with torch 1.10 (#1331) | 2022-07-18 11:41:27 +08:00
utils.py | [autoparallel] added liveness analysis (#1516) | 2022-08-30 15:54:37 +08:00
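
The modules listed above implement graph-rewriting passes over `torch.fx` traced models (e.g. `adding_split_node_pass.py`, `shard_1d_pass.py`). As a rough orientation only, and not a reproduction of any pass in this directory, the sketch below shows the general shape of such a pass using plain `torch.fx`: trace a module into a `GraphModule`, mutate its graph nodes, then recompile. `TinyModel` and the ReLU-to-GELU rewrite are illustrative assumptions, not ColossalAI code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.fx import GraphModule, symbolic_trace


class TinyModel(nn.Module):
    """Illustrative module, not part of ColossalAI."""

    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 4)

    def forward(self, x):
        return torch.relu(self.linear(x))


def relu_to_gelu_pass(gm: GraphModule) -> GraphModule:
    """A toy fx pass: rewrite every torch.relu call into F.gelu."""
    for node in gm.graph.nodes:
        if node.op == "call_function" and node.target is torch.relu:
            node.target = F.gelu
    gm.graph.lint()   # check the mutated graph is still well-formed
    gm.recompile()    # regenerate forward() from the edited graph
    return gm


if __name__ == "__main__":
    gm = symbolic_trace(TinyModel())   # module -> fx GraphModule
    gm = relu_to_gelu_pass(gm)
    print(gm.code)                     # generated forward now calls gelu
    _ = gm(torch.randn(2, 4))          # the transformed module still runs
```

The real passes here follow the same trace-mutate-recompile pattern but operate on sharding, node splitting, and meta-information propagation rather than simple operator substitution.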