ColossalAI/colossalai/shardformer/shard
Latest commit: [moe] implement transit between non moe tp and ep (botbw, 4 months ago)
File                 | Last commit                                                                                                      | Age
__init__.py          | [shardformer, pipeline] add `gradient_checkpointing_ratio` and heterogenous shard policy for llama (#5508)       | 8 months ago
grad_ckpt_config.py  | [shardformer] refactor pipeline grad ckpt config (#5646)                                                         | 7 months ago
shard_config.py      | [moe] implement transit between non moe tp and ep                                                                | 4 months ago
sharder.py           | [nfc] fix typo colossalai/shardformer/ (#5133)                                                                   | 11 months ago
shardformer.py       | [Hoxfix] Fix CUDA_DEVICE_MAX_CONNECTIONS for comm overlap                                                        | 5 months ago
utils.py             | [pipeline] update shardformer policy                                                                             | 1 year ago