ColossalAI/colossalai/shardformer/policies
Latest commit: flybird11111 7486ed7d3a — [shardformer] update llama2/opt finetune example and fix llama2 policy (#4645), 1 year ago
__init__.py    — [shardformer] init shardformer code structure (#3731), 1 year ago
auto_policy.py — rename chatglm to chatglm2 (#4484), 1 year ago
base_policy.py — [shardformer] Add overlap support for gpt2 (#4535), 1 year ago
bert.py        — [shardformer/fix overlap bug] fix overlap bug, add overlap as an option in shardco… (#4516), 1 year ago
blip2.py       — [shardformer] chatglm support sequence parallel (#4482), 1 year ago
bloom.py       — [shardformer/fix overlap bug] fix overlap bug, add overlap as an option in shardco… (#4516), 1 year ago
chatglm2.py    — [shardformer] Pytree fix (#4533), 1 year ago
gpt2.py        — [shardformer] Add overlap support for gpt2 (#4535), 1 year ago
llama.py       — [shardformer] update llama2/opt finetune example and fix llama2 policy (#4645), 1 year ago
opt.py         — [shardformer] fix opt test hanging (#4521), 1 year ago
sam.py         — [shardformer] chatglm support sequence parallel (#4482), 1 year ago
t5.py          — [shardformer] fix opt test hanging (#4521), 1 year ago
vit.py         — [shardformer] vit/llama/t5 ignore the sequence parallelism flag and some fix. (#4498), 1 year ago
whisper.py     — [shardformer] fix opt test hanging (#4521), 1 year ago