ColossalAI/colossalai/auto_parallel/tensor_shard

Latest commit: 2059fdd6b0 by YuliangLiu0306 — [hotfix] add copyright for solver and device mesh (#2803), 2 years ago
Name                   Last commit message                                                Last update
node_handler           [autoparallel] distinguish different parallel strategies (#2699)   2 years ago
solver                 [hotfix] add copyright for solver and device mesh (#2803)          2 years ago
utils                  Revert "[NFC] polish code format" (#2372)                          2 years ago
__init__.py
constants.py           [autoparallel] adapt solver with self attention (#2037)            2 years ago
initialize.py          [autoparallel] add shard option (#2696)                            2 years ago
options.py             [autoparallel] add shard option (#2696)                            2 years ago
sharding_strategy.py   [autoparallel] memory estimation for shape consistency (#2144)     2 years ago