ColossalAI/colossalai/shardformer/modeling
Latest commit: 29695cf70c by flybird11111
[example] Add GPT-2 benchmark example script (#5295)
* Add a GPT-2 benchmark script and fix issues found while benchmarking
* [doc] Fix a typo in Colossal-LLaMA-2/README.md (#5247)
* [workflow] Fix the build CI and polish the workflow (#5240)
* [ci] Fix the booster test (#5251)
* [ci] Fix the DDP test and polish it (#5254)
* Fix a typo in applications/ColossalEval/README.md (#5250)
* [ci] Fix ShardFormer tests: revert the p2p changes, add the enable_metadata_cache option (sketched after this changelog), and re-enable the T5 tests (#5255). Co-authored-by: Wenhao Chen <cwher@outlook.com>
* [doc] Fix doc typos: annotation display and the LLaMA-2 doc (#5256)
* [hotfix] Add a pipeline-parallel sanity check (including 1F1B) and fix the misleading mbs argument (#5268)
* [workflow] Fix an incomplete bash command (#5272)
* [workflow] Fix the OOM tests and polish them (#5275)
* [ci] Fix test_hybrid_parallel_plugin_checkpoint_io.py: revert the p2p changes, add the enable_metadata_cache option, and re-enable the T5 tests (#5276). Co-authored-by: Wenhao Chen <cwher@outlook.com>
* [shardformer] Support gradient accumulation in HybridParallelPlugin (#5246); see the sketch after this changelog
* [hotfix] Fix the ShardFormer test execution path when using sequence parallelism (#5230)
* Fix auto-loading of the GPT-2 tokenizer (#5279)
* [doc] Add the LLaMA2-13B display to README.md and fix a 13B typo (#5285). Co-authored-by: binmakeswell <binmakeswell@gmail.com>
* Fix LLaMA pretraining (#5287)
* Update shardformer.py

Co-authored-by: digger yu <digger-yu@outlook.com>
Co-authored-by: Frank Lee <somerlee.9@gmail.com>
Co-authored-by: Wenhao Chen <cwher@outlook.com>
Co-authored-by: binmakeswell <binmakeswell@gmail.com>
Co-authored-by: Zhongkai Zhao <kanezz620@gmail.com>
Co-authored-by: Michelle <97082656+MichelleMa8@users.noreply.github.com>
Co-authored-by: Desperado-Jia <502205863@qq.com>
2024-03-04 16:18:13 +08:00
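
Two entries in the changelog above touch runtime behavior of HybridParallelPlugin: gradient accumulation support (#5246) and the enable_metadata_cache option for pipeline p2p communication (#5255). The snippet below is a minimal sketch of how those two features are exercised, assuming the 0.3.x-era Booster/HybridParallelPlugin interface that this commit targets; the argument names (tp_size, pp_size, precision, enable_metadata_cache) and the dummy dataloader are illustrative assumptions, and this is not the benchmark script added by #5295.

```python
import torch
from torch.optim import AdamW
from transformers import GPT2Config, GPT2LMHeadModel

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import HybridParallelPlugin

# Distributed init; the 0.3.x releases this commit targets still take a config dict.
colossalai.launch_from_torch(config={})

plugin = HybridParallelPlugin(
    tp_size=2,                   # tensor-parallel degree (world size must be divisible by it)
    pp_size=1,                   # no pipeline stages in this sketch
    precision="bf16",
    enable_metadata_cache=True,  # cache p2p metadata between pipeline stages (#5255); assumed name
)
booster = Booster(plugin=plugin)

model = GPT2LMHeadModel(GPT2Config())
optimizer = AdamW(model.parameters(), lr=1e-4)
model, optimizer, *_ = booster.boost(model, optimizer)


def dummy_batches(num_batches=8, batch_size=2, seq_len=128, vocab=50257):
    """Random token batches standing in for a real GPT-2 dataloader."""
    for _ in range(num_batches):
        ids = torch.randint(0, vocab, (batch_size, seq_len))
        yield {"input_ids": ids, "attention_mask": torch.ones_like(ids), "labels": ids}


ACCUM_STEPS = 4  # accumulate gradients over 4 micro-batches before each optimizer step (#5246)
for step, batch in enumerate(dummy_batches()):
    batch = {k: v.cuda() for k, v in batch.items()}
    loss = model(**batch).loss / ACCUM_STEPS
    booster.backward(loss, optimizer)  # backward goes through the booster so mixed precision is handled
    if (step + 1) % ACCUM_STEPS == 0:
        optimizer.step()
        optimizer.zero_grad()
```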
File        | Last commit                                                                                  | Date
chatglm2_6b | [hotfix/hybridengine] Fix init model with random parameters in benchmark (#5074)            | 2023-11-20 20:15:25 +08:00
__init__.py | [shardformer] added development protocol for standardization (#4149)                        | 2023-07-04 16:05:01 +08:00
bert.py     | [misc] update pre-commit and run all files (#4752)                                          | 2023-09-19 14:20:26 +08:00
blip2.py    | [feat] refactored extension module (#5298)                                                  | 2024-01-25 17:01:48 +08:00
bloom.py    | [gemini] gemini support tensor parallelism. (#4942)                                         | 2023-11-10 10:15:16 +08:00
chatglm2.py | [feat] refactored extension module (#5298)                                                  | 2024-01-25 17:01:48 +08:00
falcon.py   | [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088) | 2023-11-28 16:54:42 +08:00
gpt2.py     | [example] add gpt2 benchmark example script. (#5295)                                        | 2024-03-04 16:18:13 +08:00
gptj.py     | [feat] refactored extension module (#5298)                                                  | 2024-01-25 17:01:48 +08:00
jit.py      | [misc] update pre-commit and run all files (#4752)                                          | 2023-09-19 14:20:26 +08:00
llama.py    | [shardformer] gather llama logits (#5398)                                                   | 2024-02-27 22:44:07 +08:00
mistral.py  | [feat] refactored extension module (#5298)                                                  | 2024-01-25 17:01:48 +08:00
opt.py      | [feat] refactored extension module (#5298)                                                  | 2024-01-25 17:01:48 +08:00
sam.py      | [misc] update pre-commit and run all files (#4752)                                          | 2023-09-19 14:20:26 +08:00
t5.py       | [tests] fix t5 test. (#5322)                                                                | 2024-01-29 17:38:46 +08:00
vit.py      | [feat] refactored extension module (#5298)                                                  | 2024-01-25 17:01:48 +08:00
whisper.py  | [feat] refactored extension module (#5298)                                                  | 2024-01-25 17:01:48 +08:00
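
The files listed above hold the per-model forward replacements that ShardFormer policies swap in when a model is sharded. The rough sketch below shows how that wiring happens for GPT-2 (so the functions in modeling/gpt2.py take effect); it assumes the 0.3.x ShardConfig/ShardFormer interface, and the exact field names and the (model, shared_params) return value of optimize() should be treated as assumptions.

```python
import torch.distributed as dist
from transformers import GPT2Config, GPT2LMHeadModel

import colossalai
from colossalai.shardformer import ShardConfig, ShardFormer

# Distributed init (0.3.x releases take a config dict).
colossalai.launch_from_torch(config={})
tp_group = dist.new_group(list(range(dist.get_world_size())))

shard_config = ShardConfig(
    tensor_parallel_process_group=tp_group,  # group over which attention/MLP weights are split
    enable_tensor_parallelism=True,
    enable_fused_normalization=False,
    enable_flash_attention=False,
)
shard_former = ShardFormer(shard_config=shard_config)

model = GPT2LMHeadModel(GPT2Config())
# optimize() looks up the GPT-2 policy, which binds the customized forward
# functions defined in modeling/gpt2.py to the sharded submodules.
sharded_model, shared_params = shard_former.optimize(model)
```

A Booster configured with HybridParallelPlugin performs this step internally; calling ShardFormer directly like this is mainly useful when testing an individual policy or forward replacement from this directory.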