Directory: ColossalAI/colossalai/shardformer/modeling
Latest commit: 172f7fa3cf by Hongxin Liu, [misc] resolve code factor issues (#4433), 2023-08-15 23:25:14 +08:00
Name          Last commit                                                              Date
chatglm2_6b   [pipeline] add chatglm (#4363)                                           2023-08-15 23:25:14 +08:00
__init__.py   [shardformer] added development protocol for standardization (#4149)     2023-07-04 16:05:01 +08:00
bert.py       [misc] resolve code factor issues (#4433)                                2023-08-15 23:25:14 +08:00
blip2.py      [shardformer] update shardformer to use flash attention 2 (#4392)        2023-08-15 23:25:14 +08:00
bloom.py      [misc] resolve code factor issues (#4433)                                2023-08-15 23:25:14 +08:00
chatglm.py    [misc] resolve code factor issues (#4433)                                2023-08-15 23:25:14 +08:00
gpt2.py       [misc] resolve code factor issues (#4433)                                2023-08-15 23:25:14 +08:00
jit.py        [Shardformer] Merge flash attention branch to pipeline branch (#4362)    2023-08-15 23:25:14 +08:00
llama.py      [misc] resolve code factor issues (#4433)                                2023-08-15 23:25:14 +08:00
opt.py        [misc] resolve code factor issues (#4433)                                2023-08-15 23:25:14 +08:00
sam.py        [Shardformer] Merge flash attention branch to pipeline branch (#4362)    2023-08-15 23:25:14 +08:00
t5.py         [misc] resolve code factor issues (#4433)                                2023-08-15 23:25:14 +08:00
vit.py        [misc] resolve code factor issues (#4433)                                2023-08-15 23:25:14 +08:00
whisper.py    [shardformer] update shardformer to use flash attention 2 (#4392)        2023-08-15 23:25:14 +08:00