ColossalAI/tests/test_shardformer
Hongxin Liu 19e1a5cf16
[shardformer] update colo attention to support custom mask (#5510)
* [feature] refactor colo attention (#5462)

* [extension] update api

* [feature] add colo attention

* [feature] update sdpa

* [feature] update npu attention

* [feature] update flash-attn

* [test] add flash attn test

* [test] update flash attn test

* [shardformer] update modeling to fit colo attention (#5465)

* [misc] refactor folder structure

* [shardformer] update llama flash-attn

* [shardformer] fix llama policy

* [devops] update tensornvme install

* [test] update llama test

* [shardformer] update colo attn kernel dispatch

* [shardformer] update blip2

* [shardformer] update chatglm

* [shardformer] update gpt2

* [shardformer] update gptj

* [shardformer] update opt

* [shardformer] update vit

* [shardformer] update colo attention mask prep

* [shardformer] update whisper

* [test] fix shardformer tests (#5514)

* [test] fix shardformer tests
2024-03-27 11:19:32 +08:00
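
The headline change above is support for a custom attention mask: the unified attention entry point can take a caller-supplied mask rather than assuming a causal pattern, and dispatch to SDPA, FlashAttention, or the NPU kernel as appropriate. As a rough sketch of the idea only (this is not ColossalAI's actual ColoAttention API; the function name and mask layout below are illustrative), PyTorch's built-in `scaled_dot_product_attention` already exposes this shape of interface:

```python
# Illustrative only: a thin wrapper over PyTorch SDPA that accepts a
# custom mask, standing in for the idea behind the ColoAttention change.
import torch
import torch.nn.functional as F

def attention_with_custom_mask(q, k, v, attn_mask=None, is_causal=False):
    """q, k, v: (batch, num_heads, seq_len, head_dim).

    attn_mask: broadcastable to (batch, num_heads, q_len, kv_len);
    boolean (True = attend) or an additive float mask.
    """
    # PyTorch dispatches to a fused backend (flash / memory-efficient /
    # math) when the inputs and mask permit, else falls back to math.
    return F.scaled_dot_product_attention(
        q, k, v, attn_mask=attn_mask, is_causal=is_causal
    )

# Example: a padding-style mask that hides the last two key positions.
B, H, S, D = 2, 4, 8, 16
q, k, v = (torch.randn(B, H, S, D) for _ in range(3))
mask = torch.ones(B, 1, S, S, dtype=torch.bool)
mask[..., -2:] = False  # mask out the last two keys for every query
out = attention_with_custom_mask(q, k, v, attn_mask=mask)
print(out.shape)  # torch.Size([2, 4, 8, 16])
```

The automatic backend selection in SDPA mirrors the "colo attn kernel dispatch" bullet above: the same call site serves fused and fallback kernels, with the mask deciding which backends are eligible.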
test_hybrid_parallel_grad_clip_norm  [gemini] gemini support extra-dp (#5043)                            2023-11-16 21:03:04 +08:00
test_layer                           [example] add gpt2 benchmark example script (#5295)                 2024-03-04 16:18:13 +08:00
test_model                           [shardformer] update colo attention to support custom mask (#5510)  2024-03-27 11:19:32 +08:00
__init__.py                          [shardformer] adapted T5 and LLaMa test to use kit (#4049)          2023-07-04 16:05:01 +08:00
test_flash_attention.py              [shardformer] update colo attention to support custom mask (#5510)  2024-03-27 11:19:32 +08:00
test_shard_utils.py                  [misc] update pre-commit and run all files (#4752)                  2023-09-19 14:20:26 +08:00
test_with_torch_ddp.py               [ci] fixed ddp test (#5254)                                         2024-01-11 17:16:32 +08:00
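
test_flash_attention.py is the file touched by the custom-mask change. A minimal sketch of what such a correctness test can look like, assuming nothing about the repo's actual test beyond its name (naive_attention and the tolerances here are illustrative): compare a fused kernel's output against a naive reference under a random boolean mask.

```python
# Illustrative correctness test: fused SDPA vs. a naive reference
# implementation under a custom boolean mask. Not the repo's actual test.
import math
import torch
import torch.nn.functional as F

def naive_attention(q, k, v, attn_mask):
    # Plain softmax(QK^T / sqrt(d)) V with masked positions set to -inf.
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    scores = scores.masked_fill(~attn_mask, float("-inf"))
    return torch.softmax(scores, dim=-1) @ v

def test_custom_mask_matches_reference():
    torch.manual_seed(0)
    B, H, S, D = 2, 4, 8, 16
    q, k, v = (torch.randn(B, H, S, D) for _ in range(3))
    mask = torch.rand(B, 1, S, S) > 0.2  # random keep-mask (~80% True)
    mask[..., 0] = True  # ensure every query row attends to >=1 key
    expected = naive_attention(q, k, v, mask)
    actual = F.scaled_dot_product_attention(q, k, v, attn_mask=mask)
    torch.testing.assert_close(actual, expected, atol=1e-5, rtol=1e-4)

test_custom_mask_matches_reference()
```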