ColossalAI/colossalai/shardformer/modeling
Edenzzzz f5c84af0b0
[Feature] Zigzag Ring attention (#5905)
* halfway

* fix position id length mismatch across PP stages

* fix typo

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* unified cross entropy func for all shardformer models (minimal loss sketch after the commit log)

* remove redundant lines

* add basic ring attn; debug cross entropy (zigzag split sketched below)

* fwd/bwd logic complete; add experimental Triton rescale (LSE combine sketched below)

* precision tests passed

* fix typos and remove misc files

* add sp_mode to benchmark; fix varlen interface

* update softmax_lse shape to match the new interface

* change tester name

* remove buffer clone; support packed seq layout (cu_seqlens sketch below)

* add varlen tests

* fix typo

* all tests passed

* add dkv_group; fix mask

* remove debug statements

---------

Co-authored-by: Edenzzzz <wtan45@wisc.edu>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-16 13:56:38 +08:00
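
A few of the techniques in the commit log above are worth a sketch. First, the zigzag split from the PR title: the sequence is cut into 2 * world_size chunks and rank i keeps chunks i and 2 * world_size - 1 - i, so every rank sees a similar share of the triangular causal-attention work. A minimal sketch, assuming this common zigzag scheme (the function name and signature are illustrative, not ColossalAI's actual API):

import torch

def zigzag_split(x: torch.Tensor, world_size: int, rank: int, dim: int = 1) -> torch.Tensor:
    # Cut the sequence into 2 * world_size chunks and keep a "low" and a
    # "high" chunk per rank; pairing chunk i with chunk 2W - 1 - i balances
    # the triangular cost of causal attention across ranks.
    chunks = x.chunk(2 * world_size, dim=dim)
    return torch.cat([chunks[rank], chunks[2 * world_size - 1 - rank]], dim=dim)

# 8 tokens on 2 ranks: rank 0 holds tokens [0, 1, 6, 7], rank 1 holds [2, 3, 4, 5].
x = torch.arange(8).view(1, 8)
print(zigzag_split(x, world_size=2, rank=0))  # tensor([[0, 1, 6, 7]])
print(zigzag_split(x, world_size=2, rank=1))  # tensor([[2, 3, 4, 5]])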
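
Next, the "experimental Triton rescale": each ring step produces a partial attention output plus its log-sum-exp (softmax_lse), and the partials are merged with the standard LSE combine. The fused Triton kernel does this on-GPU; the plain-PyTorch math, with illustrative names, is:

import torch

def merge_attn_outputs(out_a, lse_a, out_b, lse_b):
    # out_*: (batch, heads, seq, head_dim) partial attention outputs
    # lse_*: (batch, heads, seq) log-sum-exp of the corresponding logits
    lse = torch.logaddexp(lse_a, lse_b)          # normalizer over both key blocks
    w_a = torch.exp(lse_a - lse).unsqueeze(-1)   # rescale weight for block a
    w_b = torch.exp(lse_b - lse).unsqueeze(-1)   # rescale weight for block b
    return w_a * out_a + w_b * out_b, lse

Carrying softmax_lse through the attention interface (the shape update above) is what makes this blockwise merge possible.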
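
The packed ("varlen") sequence layout drops padding: sequences are concatenated along one token axis and described by cumulative lengths (cu_seqlens), the format flash-attn's varlen kernels consume. A small sketch with illustrative tensor names:

import torch
import torch.nn.functional as F

seq_lens = torch.tensor([5, 3, 7])  # three sequences packed back to back
cu_seqlens = F.pad(seq_lens.cumsum(0), (1, 0)).to(torch.int32)
# tensor([ 0,  5,  8, 15], dtype=torch.int32)

total_tokens = int(cu_seqlens[-1])
q = torch.randn(total_tokens, 8, 64)  # (tokens, heads, head_dim), no padding rows
# Sequence i occupies rows cu_seqlens[i] : cu_seqlens[i + 1].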
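
Finally, the "unified cross entropy func" factors the usual causal-LM loss out of each model's forward. A minimal single-device sketch (the real helper also has to handle tensor- and sequence-parallel shards, which is omitted here):

import torch
import torch.nn.functional as F

def causal_lm_loss(logits: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    # Shift so token t predicts token t + 1, then flatten for cross entropy.
    shift_logits = logits[:, :-1, :].contiguous()
    shift_labels = labels[:, 1:].contiguous()
    return F.cross_entropy(
        shift_logits.view(-1, shift_logits.size(-1)),
        shift_labels.view(-1),
        ignore_index=-100,  # standard Hugging Face padding convention
    )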
Name         Last commit
chatglm2_6b  [pre-commit.ci] pre-commit autoupdate (#5572)  2024-07-01 17:16:41 +08:00
__init__.py  [shardformer] added development protocol for standardization (#4149)  2023-07-04 16:05:01 +08:00
bert.py      [shardformer] delete xformers (#5859)  2024-06-28 11:20:04 +08:00
blip2.py     [shardformer] support bias_gelu_jit_fused for models (#5647)  2024-04-29 15:33:51 +08:00
bloom.py     [Feature] Enable PP + SP for llama (#5868)  2024-07-09 18:05:20 +08:00
chatglm2.py  [ShardFormer] Add Ulysses Sequence Parallelism support for Command-R, Qwen2 and ChatGLM (#5897)  2024-07-10 11:34:25 +08:00
command.py   [Feature] Zigzag Ring attention (#5905)  2024-08-16 13:56:38 +08:00
deepseek.py  [misc] Bypass the huggingface bug to solve the mask mismatch problem (#5991)  2024-08-15 14:40:26 +08:00
falcon.py    [shardformer] fix modeling of bloom and falcon (#5796)  2024-06-11 17:43:50 +08:00
gpt2.py      [Feature] Enable PP + SP for llama (#5868)  2024-07-09 18:05:20 +08:00
gptj.py      [shardformer] upgrade transformers to 4.39.3 (#5815)  2024-06-14 10:59:33 +08:00
jit.py       [misc] update pre-commit and run all files (#4752)  2023-09-19 14:20:26 +08:00
llama.py     [Feature] Zigzag Ring attention (#5905)  2024-08-16 13:56:38 +08:00
mistral.py   [shardformer] hotfix attn mask (#5945)  2024-07-29 13:58:27 +08:00
mixtral.py   [chore] remove redundant test case, print string & reduce test tokens  2024-08-01 10:06:59 +08:00
opt.py       [Feature] Enable PP + SP for llama (#5868)  2024-07-09 18:05:20 +08:00
qwen2.py     [shardformer] hotfix attn mask (#5945)  2024-07-29 13:58:27 +08:00
sam.py       [shardformer] delete xformers (#5859)  2024-06-28 11:20:04 +08:00
t5.py        [shardformer] Support the T5ForTokenClassification model (#5816)  2024-06-27 16:40:38 +08:00
vit.py       [shardformer] support bias_gelu_jit_fused for models (#5647)  2024-04-29 15:33:51 +08:00
whisper.py   [shardformer] upgrade transformers to 4.39.3 (#5815)  2024-06-14 10:59:33 +08:00