ColossalAI/colossalai/shardformer/shard

Latest commit: [shardformer] refactor pipeline grad ckpt config (#5646) by Hongxin Liu (1b387ca9fe), 7 months ago
File                | Last commit                                                                                                  | Last updated
__init__.py         | [shardformer, pipeline] add `gradient_checkpointing_ratio` and heterogenous shard policy for llama (#5508)  | 8 months ago
grad_ckpt_config.py | [shardformer] refactor pipeline grad ckpt config (#5646)                                                     | 7 months ago
shard_config.py     | [shardformer] refactor pipeline grad ckpt config (#5646)                                                     | 7 months ago
sharder.py          | [nfc] fix typo colossalai/shardformer/ (#5133)                                                               | 11 months ago
shardformer.py      | [example] add gpt2 benchmark example script. (#5295)                                                         | 9 months ago
utils.py            | [pipeline] update shardformer policy                                                                         | 1 year ago
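For context on how the files in this directory fit together, below is a minimal usage sketch. It assumes the `ShardConfig` and `ShardFormer` classes exported from `colossalai.shardformer` (backed by shard_config.py and shardformer.py here); the specific config fields and the model used are illustrative and may differ between versions.

```python
from transformers import BertForMaskedLM

from colossalai.shardformer import ShardConfig, ShardFormer

# This sketch assumes the distributed environment has already been
# initialized (e.g. via colossalai.launch_from_torch under torchrun).

# ShardConfig (shard_config.py) describes how the model should be
# sharded; the fields shown are commonly documented ones and may
# vary across releases.
shard_config = ShardConfig(
    enable_tensor_parallelism=True,   # shard linear/embedding layers across the tensor-parallel group
    enable_fused_normalization=True,  # swap in the fused LayerNorm kernel
)

# ShardFormer (shardformer.py) applies the policy resolved for the
# model class and returns the sharded model together with any
# parameters shared across pipeline stages.
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
shard_former = ShardFormer(shard_config=shard_config)
sharded_model, shared_params = shard_former.optimize(model)
```

The gradient-checkpointing settings referenced in the commit messages above (e.g. `gradient_checkpointing_ratio` from #5508, refactored in #5646) live in grad_ckpt_config.py and are passed to `ShardConfig` rather than to the model directly; see that file for the current field names.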