ColossalAI/colossalai/fx/profiler

Latest commit: fbd2a9e05b "[hotfix] meta_tensor_compatibility_with_torch2" by YuliangLiu0306, 2 years ago
experimental/      [NFC] polish colossalai/fx/profiler/experimental/profiler_module/embedding.py code style (#3256)   2 years ago
__init__.py        [fx] refactor memory utils and extend shard utils. (#1754)                                          2 years ago
constants.py       [fx/profiler] debug the fx.profiler / add an example test script for fx.profiler (#1730)           2 years ago
dataflow.py        [fx] refactor memory utils and extend shard utils. (#1754)                                          2 years ago
memory_utils.py    [autoparallel] move ckpt solvers to autoparallel folder / refactor code (#1764)                     2 years ago
opcount.py         [hotfix] meta_tensor_compatibility_with_torch2                                                      2 years ago
profiler.py        [autoparallel] refactor and add rotorc. (#1789)                                                     2 years ago
shard_utils.py     [autoparallel] modify comm nodes' memory cost in construct chain (#2263)                            2 years ago
tensor.py          [hotfix] meta tensor default device. (#2510)                                                        2 years ago