ColossalAI/colossalai/shardformer/modeling
Latest commit: Haze188 3420921101
[shardformer] DeepseekMoE support (#5871)
* [Feature] DeepSeek MoE expert parallel implementation
* [misc] fix typo, remove redundant file (#5867)
  * [misc] fix typo
  * [pre-commit.ci] auto fixes from pre-commit.com hooks (for more information, see https://pre-commit.ci)
* [Feature] deepseek support & unit test
* [misc] remove debug code & useless print
* [misc] fix typos (#5872)
* [Feature] remove modeling file, use auto config (#5884)
  * [misc] fix typos
  * [Feature] deepseek support via auto model, remove modeling file (see the sketch below)
  * [misc] delete useless file
  * [misc] fix typos
* [Deepseek] remove redundant code (#5888)
  * [misc] fix typos
  * [Feature] deepseek support via auto model, remove modeling file
  * [misc] delete useless file
  * [misc] fix typos
  * [misc] remove redundant code
* [Feature/deepseek] resolve review comments (#5889)
  * [misc] fix typos
  * [Feature] deepseek support via auto model, remove modeling file
  * [misc] delete useless file
  * [misc] fix typos
  * [misc] remove redundant code
  * [misc] move module replacement into the if branch
  * [misc] add warning messages and adjust some unit-test code
  * [misc] fix typos

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-07-05 16:13:58 +08:00
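The "deepseek support via auto model, remove modeling file" commits swap a vendored DeepSeek modeling file for classes resolved through the Hugging Face auto classes. Below is a minimal sketch of that loading path, assuming only the standard transformers auto API; the checkpoint name and keyword arguments are illustrative and not taken from this repository.

    # Sketch: build a DeepSeek-MoE model through the Hugging Face auto classes
    # instead of a locally vendored modeling file. trust_remote_code=True lets
    # transformers fetch the checkpoint's own modeling code from its repository.
    from transformers import AutoConfig, AutoModelForCausalLM

    # Illustrative public checkpoint; the tests in this repo may use another one.
    config = AutoConfig.from_pretrained(
        "deepseek-ai/deepseek-moe-16b-base", trust_remote_code=True
    )
    # from_config instantiates the architecture without downloading the weights.
    model = AutoModelForCausalLM.from_config(config, trust_remote_code=True)

Because the module classes then come from the checkpoint's remote code, shardformer can presumably replace them at runtime without keeping a copy of the modeling file in this directory.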
Files (name, last commit message, last commit date):
chatglm2_6b [pre-commit.ci] pre-commit autoupdate (#5572) 2024-07-01 17:16:41 +08:00
__init__.py [shardformer] added development protocol for standardization (#4149) 2023-07-04 16:05:01 +08:00
bert.py [shardformer]delete xformers (#5859) 2024-06-28 11:20:04 +08:00
blip2.py [shardformer] support bias_gelu_jit_fused for models (#5647) 2024-04-29 15:33:51 +08:00
bloom.py [shardformer]delete xformers (#5859) 2024-06-28 11:20:04 +08:00
chatglm2.py [shardformer] fix chatglm implementation (#5644) 2024-04-25 14:41:17 +08:00
command.py change 'xxx if xxx else None' to 'xxx or None' 2024-06-18 03:32:42 +00:00
deepseek.py [shardformer] DeepseekMoE support (#5871) 2024-07-05 16:13:58 +08:00
falcon.py [shardformer] fix modeling of bloom and falcon (#5796) 2024-06-11 17:43:50 +08:00
gpt2.py [shardformer] upgrade transformers to 4.39.3 (#5815) 2024-06-14 10:59:33 +08:00
gptj.py [shardformer] upgrade transformers to 4.39.3 (#5815) 2024-06-14 10:59:33 +08:00
jit.py [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
llama.py Support 4d parallel + flash attention (#5789) 2024-06-17 17:40:47 +08:00
mistral.py [shardformer] upgrade transformers to 4.39.3 (#5815) 2024-06-14 10:59:33 +08:00
mixtral.py [MoE/ZeRO] Moe refactor with zero refactor (#5821) 2024-06-28 14:00:08 +08:00
opt.py [Hotfix] Fix OPT gradient checkpointing forward 2024-07-03 14:57:57 +08:00
qwen2.py [Shardformer] change qwen2 modeling into gradient checkpointing style (#5874) 2024-07-01 16:45:09 +08:00
sam.py [shardformer]delete xformers (#5859) 2024-06-28 11:20:04 +08:00
t5.py [shardformer] Support the T5ForTokenClassification model (#5816) 2024-06-27 16:40:38 +08:00
vit.py [shardformer] support bias_gelu_jit_fused for models (#5647) 2024-04-29 15:33:51 +08:00
whisper.py [shardformer] upgrade transformers to 4.39.3 (#5815) 2024-06-14 10:59:33 +08:00