ColossalAI/colossalai/shardformer/modeling
Latest commit: 50b4c8e8cf by flybird11111, "[hotfix] fix llama flash attention forward (#5777)", 6 months ago
Name          Last commit message                                                                Last commit date
chatglm2_6b/
__init__.py
bert.py       [shardformer] support bias_gelu_jit_fused for models (#5647)                      7 months ago
blip2.py      [shardformer] support bias_gelu_jit_fused for models (#5647)                      7 months ago
bloom.py      [Shardformer] Add parallel output for shardformer models(bloom, falcon) (#5702)   6 months ago
chatglm2.py
falcon.py     [Shardformer] Add parallel output for shardformer models(bloom, falcon) (#5702)   6 months ago
gpt2.py       [Shardformer] Add parallel output for shardformer models(bloom, falcon) (#5702)   6 months ago
gptj.py
jit.py
llama.py      [hotfix] fix llama flash attention forward (#5777)                                6 months ago
mistral.py    [Shardformer] Add parallel output for shardformer models(bloom, falcon) (#5702)   6 months ago
opt.py        [Shardformer] Add parallel output for shardformer models(bloom, falcon) (#5702)   6 months ago
qwen2.py      [Shardformer]fix the num_heads assert for llama model and qwen model (#5704)      7 months ago
sam.py
t5.py
vit.py        [shardformer] support bias_gelu_jit_fused for models (#5647)                      7 months ago
whisper.py
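
The files above hold the replacement forward functions (flash attention, fused JIT kernels, parallel output, and so on) that ShardFormer's policies graft onto the matching Hugging Face models at optimization time. Below is a minimal sketch of how they are reached through the public API; the checkpoint name and the specific ShardConfig flags are illustrative assumptions, not taken from this listing, and may vary across ColossalAI releases.

```python
# A minimal sketch (not the repo's own example): pushing a Hugging Face model
# through ShardFormer so the patched forwards in this directory take effect.
# Run under torchrun; the model name and ShardConfig flags are illustrative.
import colossalai
from colossalai.shardformer import ShardConfig, ShardFormer
from transformers import LlamaForCausalLM

colossalai.launch_from_torch()  # older releases take launch_from_torch(config={})

model = LlamaForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

shard_config = ShardConfig(
    enable_tensor_parallelism=True,  # shard attention/MLP weights across ranks
    enable_flash_attention=True,     # e.g. the llama.py flash-attention forward (#5777)
    enable_jit_fused=True,           # e.g. the fused JIT kernels in jit.py (#5647)
)
shard_former = ShardFormer(shard_config=shard_config)
sharded_model, shared_params = shard_former.optimize(model)
```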