ColossalAI/colossalai/auto_parallel/tensor_shard/node_handler/strategy
Latest commit: aa0f6686f9 by YuliangLiu0306 - [autoparallel] accelerate gpt2 training (#2495), 2 years ago
__init__.py - [autoparallel] implement softmax handler (#2132), 2 years ago
batch_norm_generator.py
binary_elementwise_generator.py
conv_strategy_generator.py
embedding_generator.py
getattr_generator.py - [autoparallel] update_getattr_handler (#2193), 2 years ago
getitem_generator.py - [autoparallel] update getitem handler (#2207), 2 years ago
layer_norm_generator.py
matmul_strategy_generator.py - [autoparallel] accelerate gpt2 training (#2495), 2 years ago
normal_pooling_generator.py
output_generator.py
placeholder_generator.py
reshape_generator.py
softmax_generator.py - [autoparallel] implement softmax handler (#2132), 2 years ago
strategy_generator.py
sum_generator.py
tensor_constructor_generator.py
unary_elementwise_generator.py
where_generator.py