ColossalAI/tests/test_shardformer
Latest commit: fbf33ecd01 (Edenzzzz)
[Feature] Enable PP + SP for llama (#5868)
* fix position id length mismatch across pipeline-parallel (PP) stages

* fix typo

* fix typo

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* use one shared cross entropy function for all shardformer models

---------

Co-authored-by: Edenzzzz <wtan45@wisc.edu>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-07-09 18:05:20 +08:00
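
The last bullet above refers to consolidating per-model loss code into a single distributed cross-entropy helper. Below is a minimal sketch of the underlying technique (Megatron-style cross entropy over vocab-parallel logits), not ColossalAI's actual implementation; the name `dist_cross_entropy`, its arguments, and the equal-shard-per-rank assumption are all illustrative.

```python
import torch
import torch.distributed as dist


def dist_cross_entropy(logits, labels, process_group=None, ignore_index=-100):
    """Cross entropy when the vocab dimension of logits is sharded across ranks.

    logits: (N, V_local) local shard of the vocabulary dimension
    labels: (N,) indices into the *global* vocabulary
    """
    rank = dist.get_rank(process_group)
    v_local = logits.size(-1)
    vocab_start = rank * v_local  # assumes equal shards on every rank

    # Which targets live in this rank's vocab shard.
    in_shard = (labels >= vocab_start) & (labels < vocab_start + v_local)
    local_labels = (labels - vocab_start).clamp(0, v_local - 1)

    # Numerically stable log-sum-exp needs the global max and global sum.
    local_max = logits.max(dim=-1, keepdim=True).values
    dist.all_reduce(local_max, op=dist.ReduceOp.MAX, group=process_group)
    sum_exp = (logits - local_max).exp().sum(dim=-1, keepdim=True)
    dist.all_reduce(sum_exp, op=dist.ReduceOp.SUM, group=process_group)

    # Per-token target logit: zero on ranks that don't own the target,
    # then all-reduce so every rank holds the full value.
    target_logit = logits.gather(-1, local_labels.unsqueeze(-1)).squeeze(-1)
    target_logit = torch.where(in_shard, target_logit, torch.zeros_like(target_logit))
    dist.all_reduce(target_logit, op=dist.ReduceOp.SUM, group=process_group)

    # -log p(target) = max + log(sum exp(logit - max)) - target_logit
    loss = local_max.squeeze(-1) + sum_exp.log().squeeze(-1) - target_logit
    valid = labels != ignore_index
    return loss[valid].mean()
```

A single helper like this lets every shardformer model delegate its loss computation instead of duplicating the masking and all-reduce logic per architecture, which is what the PP + SP refactor in #5868 is aiming at.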
Name                                  Last commit                                                  Date
test_hybrid_parallel_grad_clip_norm   [MoE/ZeRO] Moe refactor with zero refactor (#5821)           2024-06-28 14:00:08 +08:00
test_layer                            [misc] refactor launch API and tensor constructor (#5666)    2024-04-29 10:40:11 +08:00
test_model                            [Feature] Enable PP + SP for llama (#5868)                   2024-07-09 18:05:20 +08:00
__init__.py                           [shardformer] adapted T5 and LLaMa test to use kit (#4049)   2023-07-04 16:05:01 +08:00
test_flash_attention.py               [coloattention] modify coloattention (#5627)                 2024-04-25 10:47:14 +08:00
test_shard_utils.py                   [misc] update pre-commit and run all files (#4752)           2023-09-19 14:20:26 +08:00
test_with_torch_ddp.py                [misc] refactor launch API and tensor constructor (#5666)    2024-04-29 10:40:11 +08:00