ColossalAI/tests/test_shardformer
Latest commit 8795bb2e80 by Edenzzzz, 2024-06-17 17:40:47 +08:00
Support 4d parallel + flash attention (#5789)
    * support tp + sp + pp
    * remove comments
Co-authored-by: Edenzzzz <wtan45@wisc.edu>
(A configuration sketch for this change follows the file listing below.)
Name                                  | Last commit                                                | Date
test_hybrid_parallel_grad_clip_norm/  | [misc] refactor launch API and tensor constructor (#5666)  | 2024-04-29 10:40:11 +08:00
test_layer/                           | [misc] refactor launch API and tensor constructor (#5666)  | 2024-04-29 10:40:11 +08:00
test_model/                           | Support 4d parallel + flash attention (#5789)              | 2024-06-17 17:40:47 +08:00
__init__.py                           | [shardformer] adapted T5 and LLaMa test to use kit (#4049) | 2023-07-04 16:05:01 +08:00
test_flash_attention.py               | [coloattention]modify coloattention (#5627)                | 2024-04-25 10:47:14 +08:00
test_shard_utils.py                   | [misc] update pre-commit and run all files (#4752)         | 2023-09-19 14:20:26 +08:00
test_with_torch_ddp.py                | [misc] refactor launch API and tensor constructor (#5666)  | 2024-04-29 10:40:11 +08:00
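
Commit #5789 above extends the shardformer tests to cover 4D parallelism, i.e. tensor, sequence, and pipeline parallelism combined with data parallelism, together with flash attention. Below is a minimal sketch of how such a configuration might be assembled with ColossalAI's HybridParallelPlugin; the specific argument names and values used here (sp_size, sequence_parallelism_mode="all_to_all", enable_flash_attention) are assumptions based on the plugin's options around this time and may differ across versions.

    # Hypothetical sketch: 4D parallel (dp x tp x pp x sp) plus flash attention,
    # in the spirit of what tests/test_shardformer/test_model exercises after #5789.
    # Run under torchrun with world_size == dp_size * tp_size * pp_size * sp_size.
    import colossalai
    from colossalai.booster import Booster
    from colossalai.booster.plugin import HybridParallelPlugin

    colossalai.launch_from_torch()  # reads rank/world size from torchrun env vars

    plugin = HybridParallelPlugin(
        tp_size=2,                               # tensor parallel degree
        pp_size=2,                               # pipeline parallel degree
        sp_size=2,                               # sequence parallel degree (assumed name)
        enable_sequence_parallelism=True,
        sequence_parallelism_mode="all_to_all",  # assumed mode string
        enable_flash_attention=True,             # route attention through the flash-attention path
        precision="fp16",
    )
    booster = Booster(plugin=plugin)

    # The data-parallel degree is the remaining dimension:
    # dp_size = world_size // (tp_size * pp_size * sp_size).
    # A model, optimizer, criterion, and dataloader would then be wrapped as in the tests:
    # model, optimizer, criterion, dataloader, _ = booster.boost(
    #     model, optimizer, criterion=criterion, dataloader=dataloader
    # )

With a plugin like this, booster.boost is expected to shard the model through the shardformer policies these tests exercise, while the flash-attention flag swaps the attention implementation; for example, on 16 GPUs the configuration above would leave dp_size = 2.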