ColossalAI/colossalai/shardformer/shard
Latest commit: cb01c0d5ce by hxwang, "[moe] refactor mesh assignment" (4 months ago)
File                 Last commit                                                                                                  Age
__init__.py          [shardformer, pipeline] add `gradient_checkpointing_ratio` and heterogenous shard policy for llama (#5508)   8 months ago
grad_ckpt_config.py  [shardformer] refactor pipeline grad ckpt config (#5646)                                                     7 months ago
shard_config.py      [moe] refactor mesh assignment                                                                               4 months ago
sharder.py           [nfc] fix typo colossalai/shardformer/ (#5133)                                                               11 months ago
shardformer.py       [Hoxfix] Fix CUDA_DEVICE_MAX_CONNECTIONS for comm overlap                                                    5 months ago
utils.py
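
The files in this directory form ShardFormer's entry point: `shard_config.py` holds the configuration object, `shardformer.py` the user-facing wrapper, and `sharder.py` the machinery that rewrites the model according to a policy. The snippet below is a minimal sketch of how they are typically wired together, assuming the public `ShardConfig`/`ShardFormer` interface shown in the ColossalAI examples; the chosen model, process group, and configuration fields are illustrative assumptions and may differ across versions.

```python
# Minimal sketch (assumption: public colossalai.shardformer API with ShardConfig
# and ShardFormer; field names and return values may vary between versions).
import torch.distributed as dist
from transformers import GPT2LMHeadModel

from colossalai.shardformer import ShardConfig, ShardFormer

# Assumption: a distributed environment has already been initialized
# (e.g. via colossalai.launch_from_torch), so a process group is available.
tp_group = dist.group.WORLD

# shard_config.py: ShardConfig carries the parallel settings passed to the sharder.
shard_config = ShardConfig(
    tensor_parallel_process_group=tp_group,
    enable_tensor_parallelism=True,
)

# shardformer.py: ShardFormer applies a model-specific sharding policy;
# the actual parameter splitting is driven by sharder.py under the hood.
model = GPT2LMHeadModel.from_pretrained("gpt2")
shard_former = ShardFormer(shard_config=shard_config)
sharded_model, shared_params = shard_former.optimize(model)
```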