Making large AI models cheaper, faster and more accessible
Latest commit: [Coati] Train DPO using PP (#6054) by Tong Li (4c8e85ee0d), 1 month ago

| File | Latest commit | Last modified |
|---|---|---|
| chatglm2_6b | [pre-commit.ci] pre-commit autoupdate (#5572) | 5 months ago |
| __init__.py | [shardformer] added development protocol for standardization (#4149) | 1 year ago |
| bert.py | [fp8] support hybrid parallel plugin (#5982) | 3 months ago |
| blip2.py | [shardformer] support bias_gelu_jit_fused for models (#5647) | 7 months ago |
| bloom.py | [Feature] Split cross-entropy computation in SP (#5959) | 2 months ago |
| chatglm2.py | [Feature] Split cross-entropy computation in SP (#5959) | 2 months ago |
| command.py | [Feature] Split cross-entropy computation in SP (#5959) | 2 months ago |
| deepseek.py | [moe] add parallel strategy for shared_expert && fix test for deepseek (#6063) | 2 months ago |
| falcon.py | [shardformer] fix modeling of bloom and falcon (#5796) | 5 months ago |
| gpt2.py | [Feature] Split cross-entropy computation in SP (#5959) | 2 months ago |
| gptj.py | [fp8] support hybrid parallel plugin (#5982) | 3 months ago |
| jit.py | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| llama.py | [Coati] Train DPO using PP (#6054) | 1 month ago |
| mistral.py | [Feature] Split cross-entropy computation in SP (#5959) | 2 months ago |
| mixtral.py | [fp8] fix missing fp8_comm flag in mixtral (#6057) | 2 months ago |
| opt.py | [Feature] Split cross-entropy computation in SP (#5959) | 2 months ago |
| qwen2.py | [Feature] Split cross-entropy computation in SP (#5959) | 2 months ago |
| sam.py | [shardformer]delete xformers (#5859) | 5 months ago |
| t5.py | [shardformer] Support the T5ForTokenClassification model (#5816) | 5 months ago |
| vit.py | [shardformer] support bias_gelu_jit_fused for models (#5647) | 7 months ago |
| whisper.py | [shardformer] upgrade transformers to 4.39.3 (#5815) | 5 months ago |