ColossalAI/tests/test_shardformer
Latest commit: flybird11111, 148506c828
[coloattention]modify coloattention (#5627)
* modify coloattention

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* fix

* fix

* fix

fix

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-04-25 10:47:14 +08:00
test_hybrid_parallel_grad_clip_norm    [gemini] gemini support extra-dp (#5043)                    2023-11-16 21:03:04 +08:00
test_layer                             [shardformer] refactor embedding resize (#5603)             2024-04-18 16:10:18 +08:00
test_model                             [shardformer] update transformers (#5583)                   2024-04-24 22:51:50 +08:00
__init__.py                            [shardformer] adapted T5 and LLaMa test to use kit (#4049)  2023-07-04 16:05:01 +08:00
test_flash_attention.py                [coloattention]modify coloattention (#5627)                 2024-04-25 10:47:14 +08:00
test_shard_utils.py                    [misc] update pre-commit and run all files (#4752)          2023-09-19 14:20:26 +08:00
test_with_torch_ddp.py                 [ci] fixed ddp test (#5254)                                 2024-01-11 17:16:32 +08:00