ColossalAI/extensions/pybind/flash_attention
flybird11111 aaafb38851
[Device]Support npu (#6159)
* support npu

* support pretrain

* support lora

* support chatglm

* Update train.py

* assorted fixes, plus auto fixes from pre-commit.com hooks (for more information, see https://pre-commit.ci)

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-12-17 15:42:39 +08:00
__init__.py [Inference/Refactor] Refactor compilation mechanism and unified multi hw (#5613) 2024-04-24 14:17:54 +08:00
flash_attention_dao_cuda.py [Feature] Zigzag Ring attention (#5905) 2024-08-16 13:56:38 +08:00
flash_attention_npu.py [Device]Support npu (#6159) 2024-12-17 15:42:39 +08:00
flash_attention_sdpa_cuda.py [Inference/Refactor] Refactor compilation mechanism and unified multi hw (#5613) 2024-04-24 14:17:54 +08:00
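The directory keeps one attention backend per hardware target (Dao-AILab flash-attn on CUDA, PyTorch SDPA on CUDA, and a fused kernel on Ascend NPU), which lets a loader probe availability and fall back gracefully. As a rough illustration of the dispatch pattern such a layout implies, here is a minimal sketch; the function name `flash_attention`, the tensor shapes, and the `npu` branch are illustrative assumptions, while `torch.nn.functional.scaled_dot_product_attention` (what the sdpa_cuda variant presumably wraps) is a real PyTorch 2.x API.

```python
import torch
import torch.nn.functional as F


def flash_attention(q, k, v, is_causal=False):
    """Dispatch to a backend-appropriate attention kernel.

    This is a sketch, not the ColossalAI implementation: the real
    extension selects a backend module (dao_cuda / sdpa_cuda / npu)
    at load time rather than branching per call.
    """
    if q.device.type == "npu":
        # Hypothetical branch: on Ascend hardware the npu module
        # would call the fused attention op provided by torch_npu.
        raise NotImplementedError("requires torch_npu; see flash_attention_npu.py")
    # Known PyTorch 2.x API; it picks flash or memory-efficient
    # kernels automatically when the hardware supports them.
    return F.scaled_dot_product_attention(q, k, v, is_causal=is_causal)


# Example usage with (batch, heads, seq_len, head_dim) tensors.
q = k = v = torch.randn(2, 8, 128, 64)
out = flash_attention(q, k, v, is_causal=True)
print(out.shape)  # torch.Size([2, 8, 128, 64])
```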