ColossalAI/extensions/flash_attention
Frank Lee 7cfed5f076
[feat] refactored extension module (#5298)
* [feat] refactored extension module

* polish
2024-01-25 17:01:48 +08:00
__init__.py
flash_attention_dao_cuda.py
flash_attention_npu.py
flash_attention_xformers_cuda.py
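The file names indicate one flash-attention backend per hardware stack: Dao's CUDA kernels (`flash_attention_dao_cuda.py`), Ascend NPU (`flash_attention_npu.py`), and xformers on CUDA (`flash_attention_xformers_cuda.py`). A minimal sketch of how such an extension module might pick a backend by availability — the priority order, function names, and availability flags here are illustrative assumptions, not ColossalAI's actual API:

```python
# Hypothetical backend selection for a flash-attention extension module.
# Backend names mirror the files above; the selection logic is an assumption.

BACKEND_PRIORITY = ["dao_cuda", "xformers_cuda", "npu"]


def select_backend(available: dict) -> str:
    """Return the highest-priority backend marked available.

    `available` maps backend name -> bool (e.g. whether the matching
    package imported and the target device is present).
    """
    for name in BACKEND_PRIORITY:
        if available.get(name):
            return name
    raise RuntimeError("no flash-attention backend available")


# CUDA machine with the Dao kernels installed: picks the fastest option.
print(select_backend({"dao_cuda": True, "xformers_cuda": True, "npu": False}))
# Only xformers present: falls back to it.
print(select_backend({"xformers_cuda": True}))
```

In practice the availability flags would be filled in by probing the environment (e.g. trying to import `flash_attn` or `xformers`, or checking for an NPU runtime) before any attention call is dispatched.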