ColossalAI/colossalai/fx
Latest commit: fee2af8610 by YuliangLiu0306 — [autoparallel] adapt autoparallel with new analyzer (#3261), 2 years ago
Name               Last commit message                                                               Age
codegen/           [autoparallel] move ckpt solvers to autoparallel folder / refactor code (#1764)   2 years ago
passes/            [NFC] polish code style (#3273)                                                   2 years ago
profiler/          [hotfix] meta_tensor_compatibility_with_torch2                                    2 years ago
tracer/            [NFC] polish colossalai/fx/tracer/_tracer_utils.py (#3323)                        2 years ago
__init__.py        [fx] metainfo_trace as an API. (#1873)                                            2 years ago
_compatibility.py  [hotfix] meta_tensor_compatibility_with_torch2                                    2 years ago
_meta_regist_12.py [autoparallel] adapt autoparallel with new analyzer (#3261)                       2 years ago
_meta_regist_13.py [fx] meta registration compatibility (#3253)                                      2 years ago
graph_module.py    [fx] allow control of ckpt_codegen init (#2498)                                   2 years ago
proxy.py           [NFC] polish colossalai/fx/proxy.py code style (#3269)                            2 years ago