ColossalAI/colossalai/shardformer/policies

Latest commit: flybird11111 79718fae04 — [shardformer] llama support DistCrossEntropy (#5176), 12 months ago
File           Last commit                                                                                    Age
__init__.py    [shardformer] init shardformer code structure (#3731)                                          1 year ago
auto_policy.py [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)    1 year ago
base_policy.py [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)    1 year ago
bert.py        [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)    1 year ago
blip2.py       [hotfix] Add layer norm gradients all-reduce for sequence parallel (#4926)                     1 year ago
bloom.py       [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)    1 year ago
chatglm2.py    [Inference] Fix bug in ChatGLM2 Tensor Parallelism (#5014)                                     1 year ago
falcon.py      [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)    1 year ago
gpt2.py        [hotfix] Add layer norm gradients all-reduce for sequence parallel (#4926)                     1 year ago
gptj.py        [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)    1 year ago
llama.py       [shardformer] llama support DistCrossEntropy (#5176)                                           12 months ago
mistral.py     [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)    1 year ago
opt.py         [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)    1 year ago
sam.py         [hotfix] Add layer norm gradients all-reduce for sequence parallel (#4926)                     1 year ago
t5.py          [gemini] gemini support tensor parallelism. (#4942)                                            1 year ago
vit.py         [hotfix] Add layer norm gradients all-reduce for sequence parallel (#4926)                     1 year ago
whisper.py     [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)    1 year ago