ColossalAI/colossalai/fx/passes

Latest commit: YuliangLiu0306 51b89d2202 — [autoparallel] runtime_backward_apply (#1720), 2 years ago
Name                      | Last commit                                                                                                                            | Age
algorithms                | [fx/meta/rpc] move _meta_registration.py to fx folder / register fx functions with compatibility checks / remove color debug (#1710)   | 2 years ago
experimental              | [autoparallel] runtime_backward_apply (#1720)                                                                                          | 2 years ago
__init__.py               | [fx] Add concrete info prop (#1677)                                                                                                    | 2 years ago
adding_split_node_pass.py | [fx] update split module pass and add customized policy (#1373)                                                                        | 2 years ago
concrete_info_prop.py     | [fx/meta/rpc] move _meta_registration.py to fx folder / register fx functions with compatibility checks / remove color debug (#1710)   | 2 years ago
meta_info_prop.py         | [fx/meta/rpc] move _meta_registration.py to fx folder / register fx functions with compatibility checks / remove color debug (#1710)   | 2 years ago
passes_for_gpt2_test.py   | [hotfix] fix some bugs during gpt2 testing (#1379)                                                                                     | 2 years ago
shard_1d_pass.py          | [Doc] add more doc for ColoTensor. (#1458)                                                                                             | 2 years ago
split_module.py           | [fx] fixed compatiblity issue with torch 1.10 (#1331)                                                                                  | 2 years ago
utils.py                  | [autoparallel] added liveness analysis (#1516)                                                                                         | 2 years ago