ColossalAI/colossalai/auto_parallel/tensor_shard
Latest commit: aa0f6686f9 by YuliangLiu0306 · [autoparallel] accelerate gpt2 training (#2495) · 2 years ago

deprecated/
node_handler/           [autoparallel] accelerate gpt2 training (#2495)                              2 years ago
solver/
utils/
__init__.py
constants.py
initialize.py           [autoparallel] support origin activation ckpt on autoprallel system (#2468)  2 years ago
sharding_strategy.py