ColossalAI/colossalai/fx

Latest commit: 90a9fdd91d by Boyuan Yao, 2 years ago
[autoparallel] Patch meta information of `torch.matmul` (#2584)
Name                     Last commit message                                                                                                                      Last commit date
codegen                  [autoparallel] move ckpt solvers to autoparallel folder / refactor code (#1764)                                                          2 years ago
passes                   add avg partition (#2483)                                                                                                                2 years ago
profiler                 [autoparallel] Patch meta information of `torch.matmul` (#2584)                                                                          2 years ago
tracer                   [fx] allow native ckpt trace and codegen. (#2438)                                                                                        2 years ago
__init__.py              [fx] metainfo_trace as an API. (#1873)                                                                                                   2 years ago
_compatibility.py        [fx/meta/rpc] move _meta_registration.py to fx folder / register fx functions with compatibility checks / remove color debug (#1710)    2 years ago
_meta_registrations.py   support unet metainfo prop (#2544)                                                                                                       2 years ago
graph_module.py          [fx] allow control of ckpt_codegen init (#2498)                                                                                          2 years ago
proxy.py                 [hotfix] fix coloproxy typos. (#1519)                                                                                                    2 years ago