ColossalAI/colossalai/auto_parallel/tensor_shard
Latest commit d344313533 by ziyuhuang123: [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/embedding_handler.py code style (#2725), 2 years ago
Name                  Last commit                                                                                                              Last updated
deprecated            [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/embedding_handler.py code style (#2725)         2 years ago
node_handler          [autoparallel] Patch meta information of `torch.nn.LayerNorm` (#2647)                                                    2 years ago
solver                [autoparallel] adapt autoparallel tests with latest api (#2626)                                                          2 years ago
utils                 Revert "[NFC] polish code format" (#2372)                                                                                2 years ago
__init__.py           [autoparallel] init new folder structure (#1696)                                                                         2 years ago
constants.py          [autoparallel] adapt solver with self attention (#2037)                                                                   2 years ago
initialize.py         add overlap option (#2613)                                                                                               2 years ago
sharding_strategy.py  [autoparallel] memory estimation for shape consistency (#2144)                                                           2 years ago