ColossalAI/colossalai/fx/passes
Latest commit b231430bcb by Boyuan Yao:
[fx] Fix wrong index in annotation and minimal flops in ckpt solver (#1521)
* [fx] fix wrong variable name in solver rotor
* [fx] fix the discretize bug
* [fx] fix the first op in activation checkpoint codegen
* [fx] fix some bugs of ckpt solver
* [fx] modify test_ckpt_torchvision
* [fx] set sequence to __sequence__ attr of GraphModule
* [fx] docstring modification
* [fx] remove performance test
Committed 2022-08-31 18:10:48 +08:00
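One of the commits above notes that the checkpoint solver's sequence is stored on the traced module as a `__sequence__` attribute. As a minimal sketch only (assuming standard torch.fx tracing; the solver invocation itself is not shown, and `TwoLayer` is a hypothetical toy model used purely for illustration), the attribute can be read back like this:

```python
import torch
import torch.fx


# Hypothetical toy model, used only to produce a GraphModule for illustration.
class TwoLayer(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = torch.nn.Linear(8, 8)
        self.fc2 = torch.nn.Linear(8, 8)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))


gm = torch.fx.symbolic_trace(TwoLayer())

# Per the commit message above, the ckpt solver attaches its chosen sequence
# to the GraphModule under `__sequence__`; before any solver runs it is absent.
sequence = getattr(gm, "__sequence__", None)
print(sequence)  # None until a solver has annotated the module
```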
Name                       Last commit date            Latest commit message
algorithms/                2022-08-31 18:10:48 +08:00  [fx] Fix wrong index in annotation and minimal flops in ckpt solver (#1521)
__init__.py                2022-06-15 16:36:46 +08:00  [fx] add autoparallel passes (#1121)
adding_split_node_pass.py  2022-07-27 13:40:54 +08:00  [fx] update split module pass and add customized policy (#1373)
meta_info_prop.py          2022-08-31 16:30:16 +08:00  [fx] hack __torch_dispatch__ for meta tensor and autograd. (#1515)
passes_for_gpt2_test.py    2022-07-28 17:21:07 +08:00  [hotfix] fix some bugs during gpt2 testing (#1379)
shard_1d_pass.py           2022-08-16 10:38:41 +08:00  [Doc] add more doc for ColoTensor. (#1458)
split_module.py            2022-07-18 11:41:27 +08:00  [fx] fixed compatibility issue with torch 1.10 (#1331)
utils.py                   2022-08-30 15:54:37 +08:00  [autoparallel] added liveness analysis (#1516)