ColossalAI/colossalai/shardformer/modeling

Latest commit: eb24fcd914 by Edenzzzz: [Hotfix] Fix OPT gradient checkpointing forward (5 months ago)
Name             Last commit message                                                              Age
chatglm2_6b      [pre-commit.ci] pre-commit autoupdate (#5572)                                    5 months ago
__init__.py
bert.py          [shardformer]delete xformers (#5859)                                             5 months ago
blip2.py         [shardformer] support bias_gelu_jit_fused for models (#5647)                     7 months ago
bloom.py         [shardformer]delete xformers (#5859)                                             5 months ago
chatglm2.py
command.py       change 'xxx if xxx else None' to 'xxx or None'                                   5 months ago
falcon.py        [shardformer] fix modeling of bloom and falcon (#5796)                           6 months ago
gpt2.py          [shardformer] upgrade transformers to 4.39.3 (#5815)                             6 months ago
gptj.py          [shardformer] upgrade transformers to 4.39.3 (#5815)                             6 months ago
jit.py
llama.py         Support 4d parallel + flash attention (#5789)                                    5 months ago
mistral.py       [shardformer] upgrade transformers to 4.39.3 (#5815)                             6 months ago
mixtral.py       [MoE/ZeRO] Moe refactor with zero refactor (#5821)                               5 months ago
opt.py           [Hotfix] Fix OPT gradient checkpointing forward                                  5 months ago
qwen2.py         [Shardformer] change qwen2 modeling into gradient checkpointing style (#5874)    5 months ago
sam.py           [shardformer]delete xformers (#5859)                                             5 months ago
t5.py            [shardformer] Support the T5ForTokenClassification model (#5816)                 5 months ago
vit.py           [shardformer] support bias_gelu_jit_fused for models (#5647)                     7 months ago
whisper.py       [shardformer] upgrade transformers to 4.39.3 (#5815)                             6 months ago