ColossalAI/colossalai/shardformer/policies
Latest commit: fbf33ecd01 by Edenzzzz, 2024-07-09 18:05:20 +08:00
[Feature] Enable PP + SP for llama (#5868)

* fix cross-PP-stage position id length diff bug
* fix typo
* fix typo
* [pre-commit.ci] auto fixes from pre-commit.com hooks (for more information, see https://pre-commit.ci)
* use one cross entropy function for all shardformer models

Co-authored-by: Edenzzzz <wtan45@wisc.edu>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
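The headline commit above combines pipeline parallelism (PP) with sequence parallelism (SP) for llama. As a rough, non-authoritative illustration of what that combination looks like from the user side, the sketch below requests both through ColossalAI's HybridParallelPlugin; the argument names and values (tp_size, pp_size, num_microbatches, enable_sequence_parallelism, sequence_parallelism_mode) are assumptions based on the plugin's documented options and may differ across versions, and the model/optimizer setup is a placeholder.

```python
# Hypothetical sketch only: enabling PP + SP for a llama model via ColossalAI's
# HybridParallelPlugin. Argument names/values are assumptions and may differ by version.
# Launch with, e.g.: torchrun --nproc_per_node=4 this_script.py
import colossalai
import torch
from colossalai.booster import Booster
from colossalai.booster.plugin import HybridParallelPlugin
from transformers import LlamaConfig, LlamaForCausalLM

colossalai.launch_from_torch()  # init the distributed environment from torchrun env vars

# 2-way tensor parallelism x 2-way pipeline parallelism on 4 ranks, with sequence
# parallelism layered on top of the tensor-parallel group.
plugin = HybridParallelPlugin(
    tp_size=2,
    pp_size=2,
    num_microbatches=4,                        # pipeline schedule granularity
    enable_sequence_parallelism=True,          # turn on SP (assumed flag name)
    sequence_parallelism_mode="split_gather",  # assumed mode name; other modes exist
)
booster = Booster(plugin=plugin)

# Tiny placeholder llama model; the per-model shardformer policies (llama.py below)
# decide how its layers are split across the TP/SP/PP groups during boost().
model = LlamaForCausalLM(LlamaConfig(num_hidden_layers=4, hidden_size=256, num_attention_heads=8))
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
model, optimizer, *_ = booster.boost(model, optimizer)
```

The point of the sketch is only the shape of the configuration: TP, PP and SP are requested together on one plugin, and the per-model policy files listed below determine how each architecture is actually sharded.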
__init__.py | [shardformer] init shardformer code structure (#3731) | 2023-07-04 16:05:01 +08:00
auto_policy.py | [shardformer] DeepseekMoE support (#5871) | 2024-07-05 16:13:58 +08:00
base_policy.py | [shardformer] refactor pipeline grad ckpt config (#5646) | 2024-04-25 15:19:30 +08:00
bert.py | [shardformer]delete xformers (#5859) | 2024-06-28 11:20:04 +08:00
blip2.py | [Shardformer] add assert for num of attention heads divisible by tp_size (#5670) | 2024-04-29 18:47:47 +08:00
bloom.py | [shardformer]delete xformers (#5859) | 2024-06-28 11:20:04 +08:00
chatglm2.py | change 'xxx if xxx else None' to 'xxx or None' | 2024-06-18 03:32:42 +00:00
command.py | change 'xxx if xxx else None' to 'xxx or None' | 2024-06-18 03:32:42 +00:00
deepseek.py | [shardformer] DeepseekMoE support (#5871) | 2024-07-05 16:13:58 +08:00
falcon.py | [Shardformer] Add parallel output for shardformer models(bloom, falcon) (#5702) | 2024-05-21 11:07:13 +08:00
gpt2.py | change 'xxx if xxx else None' to 'xxx or None' | 2024-06-18 03:32:42 +00:00
gptj.py | [shardformer] upgrade transformers to 4.39.3 (#5815) | 2024-06-14 10:59:33 +08:00
llama.py | [Feature] Enable PP + SP for llama (#5868) | 2024-07-09 18:05:20 +08:00
mistral.py | [shardformer] upgrade transformers to 4.39.3 (#5815) | 2024-06-14 10:59:33 +08:00
mixtral.py | [shardformer] DeepseekMoE support (#5871) | 2024-07-05 16:13:58 +08:00
opt.py | [pre-commit.ci] auto fixes from pre-commit.com hooks | 2024-05-07 07:07:09 +00:00
qwen2.py | [Shardformer]fix the num_heads assert for llama model and qwen model (#5704) | 2024-05-10 15:33:39 +08:00
sam.py | [shardformer]delete xformers (#5859) | 2024-06-28 11:20:04 +08:00
t5.py | [shardformer] Support the T5ForTokenClassification model (#5816) | 2024-06-27 16:40:38 +08:00
vit.py | [Shardformer] add assert for num of attention heads divisible by tp_size (#5670) | 2024-04-29 18:47:47 +08:00
whisper.py | [Shardformer] add assert for num of attention heads divisible by tp_size (#5670) | 2024-04-29 18:47:47 +08:00