ColossalAI/colossalai/shardformer/modeling
Latest commit 5c2ebbfd48 by duanjunwen, 2 weeks ago: [fix] fix mixtral modeling & policy; update wait handles; doing benchmarking for llama hybrid;
File          | Last commit | Age
chatglm2_6b   | [pre-commit.ci] pre-commit autoupdate (#5572) | 5 months ago
__init__.py   | [shardformer] added development protocol for standardization (#4149) | 1 year ago
bert.py       | [zerobubble] rebase main (#6075) | 2 months ago
blip2.py      | [shardformer] support bias_gelu_jit_fused for models (#5647) | 7 months ago
bloom.py      | [zerobubble] rebase main (#6075) | 2 months ago
chatglm2.py   | [zerobubble] rebase main (#6075) | 2 months ago
command.py    | [zerobubble] rebase main (#6075) | 2 months ago
deepseek.py   | [zerobubble] rebase main (#6075) | 2 months ago
falcon.py     | [shardformer] fix modeling of bloom and falcon (#5796) | 6 months ago
gpt2.py       | fix | 1 month ago
gptj.py       | [zerobubble] rebase main (#6075) | 2 months ago
jit.py        | [misc] update pre-commit and run all files (#4752) | 1 year ago
llama.py      | [feat] support no_tp Linear for sharderformer.llama | 3 weeks ago
mistral.py    | [zerobubble] rebase main (#6075) | 2 months ago
mixtral.py    | [fix] fix mixtral modeling & policy; update wait handles; doing benchmarking for llama hybrid; | 2 weeks ago
opt.py        | [zerobubble] rebase main (#6075) | 2 months ago
qwen2.py      | [zerobubble] rebase main (#6075) | 2 months ago
sam.py        | [shardformer]delete xformers (#5859) | 5 months ago
t5.py         | [shardformer] Support the T5ForTokenClassification model (#5816) | 5 months ago
vit.py        | [shardformer] support bias_gelu_jit_fused for models (#5647) | 7 months ago
whisper.py    | [shardformer] upgrade transformers to 4.39.3 (#5815) | 6 months ago
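The files above hold the per-model sharded forward implementations that ShardFormer's policies (in the sibling policies package) substitute into the corresponding Hugging Face models. For orientation, here is a minimal sketch of the entry point that ends up using them, following the usage pattern in the ShardFormer README; the checkpoint name is only an example, and ShardConfig options and the launch call vary across ColossalAI versions:

```python
# Minimal sketch, assuming a ColossalAI version where launch_from_torch()
# takes no arguments; older releases required launch_from_torch(config={}).
import colossalai
from colossalai.shardformer import ShardConfig, ShardFormer
from transformers import LlamaForCausalLM

# Assumes the script is started with a distributed launcher (e.g. torchrun)
# so that the process-group environment variables are already set.
colossalai.launch_from_torch()

model = LlamaForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

# ShardFormer looks up the policy registered for the model class; that policy
# rebinds the model's forwards to the sharded implementations in llama.py.
shard_former = ShardFormer(shard_config=ShardConfig())
sharded_model, shared_params = shard_former.optimize(model)
```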