ColossalAI/tests/test_infer/test_models
Li Xingjian 8554585a5f
[Inference] Fix flash-attn import and add model test (#5794)
* Fix torch int32 dtype

* Fix flash-attn import

* Add generalized model test

* Remove exposed path to model

* Add default value for use_flash_attn

* Rename model test

---------

Signed-off-by: char-1ee <xingjianli59@gmail.com>
2024-06-12 14:13:50 +08:00
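The "Fix flash-attn import" and "Add default value for use_flash_attn" changes suggest guarding an optional flash-attn dependency behind a try/except with a safe fallback. A minimal sketch of that pattern is below; the names `HAS_FLASH_ATTN`, `use_flash_attn`, and `select_attention_backend` are illustrative assumptions, not ColossalAI's actual API.

```python
# Hypothetical guarded import of the optional flash-attn package.
# If it is missing, fall back to the plain PyTorch attention path
# instead of failing at import time.
try:
    from flash_attn import flash_attn_func  # optional dependency
    HAS_FLASH_ATTN = True
except ImportError:
    flash_attn_func = None
    HAS_FLASH_ATTN = False


def select_attention_backend(use_flash_attn: bool = False) -> str:
    """Pick an attention backend; default keeps flash-attn off."""
    if use_flash_attn and HAS_FLASH_ATTN:
        return "flash_attn"
    return "torch"
```

With a default of `use_flash_attn=False`, callers that never opt in are unaffected by whether flash-attn is installed, which is the usual rationale for defaulting such a flag off.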
test_attention.py [Fix] Fix Inference Example, Tests, and Requirements (#5688) 2024-05-08 11:30:15 +08:00
test_baichuan.py Pass inference model shard configs for module init 2024-06-07 08:33:52 +00:00
test_custom_model.py [Inference] Fix flash-attn import and add model test (#5794) 2024-06-12 14:13:50 +08:00