ColossalAI/colossalai/fx
Latest commit: f6032ddb17 by YuliangLiu0306, "[autoparallel] fix bias addition module (#1800)", 2 years ago
Name                      Last commit message                                                                Last commit date
codegen                   [autoparallel] move ckpt solvers to autoparallel folder / refactor code (#1764)    2 years ago
passes                    [fx] support module with bias addition (#1780)                                     2 years ago
profiler                  [fx] Add linear metainfo class for auto parallel (#1783)                           2 years ago
tracer                    [autoparallel] fix bias addition module (#1800)                                    2 years ago
__init__.py               [fx] add a symbolic_trace api. (#1812)                                             2 years ago
_compatibility.py         [fx/meta/rpc] move _meta_registration.py to fx folder / register fx functions with compatibility checks / remove color debug (#1710)    2 years ago
_meta_registrations.py    [fx/meta/rpc] move _meta_registration.py to fx folder / register fx functions with compatibility checks / remove color debug (#1710)    2 years ago
graph_module.py           [fx] Add offload codegen (#1598)                                                   2 years ago
proxy.py                  [hotfix] fix coloproxy typos. (#1519)                                              2 years ago