ColossalAI/colossalai/fx/passes

Latest commit: YuliangLiu0306 5542816690 [fx]add gpt2 passes for pipeline performance test (#1366) (2 years ago)
File                        Last commit                                                    Last updated
__init__.py                 [fx]add autoparallel passes (#1121)                            2 years ago
adding_split_node_pass.py   [fx] add balanced policy v2 (#1251)                            2 years ago
meta_info_prop.py           [fx]add gpt2 passes for pipeline performance test (#1366)      2 years ago
passes_for_gpt2_test.py     [fx]add gpt2 passes for pipeline performance test (#1366)      2 years ago
shard_1d_pass.py            [fx] tested the complete workflow for auto-parallel (#1336)    2 years ago
split_module.py             [fx] fixed compatiblity issue with torch 1.10 (#1331)          2 years ago
utils.py                    [fx] methods to get fx graph property. (#1246)                 2 years ago