ColossalAI/colossalai/auto_parallel/tensor_shard
Latest commit: ffcdbf0f65 by YuliangLiu0306, "[autoparallel]integrate auto parallel feature with new tracer (#3408)", 2 years ago
| Name | Last commit | Last updated |
|------|-------------|--------------|
| node_handler | [autoparallel]integrate auto parallel feature with new tracer (#3408) | 2 years ago |
| solver | [autoparallel]integrate auto parallel feature with new tracer (#3408) | 2 years ago |
| utils | [autoparallel] find repeat blocks (#2854) | 2 years ago |
| __init__.py | [autoparallel] init new folder structure (#1696) | 2 years ago |
| constants.py | [autoparallel] adapt solver with self attention (#2037) | 2 years ago |
| initialize.py | [autoparallel]integrate auto parallel feature with new tracer (#3408) | 2 years ago |
| options.py | [autoparallel] add shard option (#2696) | 2 years ago |
| sharding_strategy.py | [autoparallel] memory estimation for shape consistency (#2144) | 2 years ago |