ColossalAI/extensions/pybind/flash_attention
Latest commit: f5c84af0b0 by Edenzzzz, "[Feature] Zigzag Ring attention (#5905)", 3 months ago
__init__.py                   [Inference/Refactor] Refactor compilation mechanism and unified multi hw (#5613)   7 months ago
flash_attention_dao_cuda.py   [Feature] Zigzag Ring attention (#5905)                                            3 months ago
flash_attention_npu.py        [Inference/Refactor] Refactor compilation mechanism and unified multi hw (#5613)   7 months ago
flash_attention_sdpa_cuda.py  [Inference/Refactor] Refactor compilation mechanism and unified multi hw (#5613)   7 months ago
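The listing shows one module per flash attention backend: the Dao et al. CUDA kernels (flash_attention_dao_cuda.py), an Ascend NPU path (flash_attention_npu.py), and PyTorch's SDPA on CUDA (flash_attention_sdpa_cuda.py), with __init__.py tying them together. The sketch below is an illustrative assumption of how such per-backend modules are commonly structured (an availability check plus a loader, with fallback ordering); the class and function names are hypothetical and do not reproduce ColossalAI's actual extension interface.

```python
# Illustrative sketch only: class and method names are assumptions, not
# ColossalAI's real extension API.
from abc import ABC, abstractmethod


class FlashAttentionBackend(ABC):
    """Minimal interface a per-backend module might expose."""

    @abstractmethod
    def is_available(self) -> bool:
        """Return True if this backend can run on the current system."""

    @abstractmethod
    def load(self):
        """Return a callable implementing attention for this backend."""


class SdpaCudaBackend(FlashAttentionBackend):
    """Hypothetical wrapper around torch.nn.functional.scaled_dot_product_attention."""

    def is_available(self) -> bool:
        try:
            import torch
            return hasattr(torch.nn.functional, "scaled_dot_product_attention")
        except ImportError:
            return False

    def load(self):
        import torch
        return torch.nn.functional.scaled_dot_product_attention


class DaoCudaBackend(FlashAttentionBackend):
    """Hypothetical wrapper around the flash-attn package (Dao et al. CUDA kernels)."""

    def is_available(self) -> bool:
        try:
            import flash_attn  # noqa: F401
            return True
        except ImportError:
            return False

    def load(self):
        from flash_attn import flash_attn_func
        return flash_attn_func


def pick_backend(backends):
    """Return the first backend whose requirements are met; a loader such as
    __init__.py might implement a similar fallback order."""
    for backend in backends:
        if backend.is_available():
            return backend.load()
    raise RuntimeError("No flash attention backend is available on this system.")


# Example usage: prefer the dedicated CUDA kernels, fall back to SDPA.
# attn_fn = pick_backend([DaoCudaBackend(), SdpaCudaBackend()])
```

In this pattern the preference order encodes performance expectations (specialized kernels first, generic fallbacks last), and each backend keeps its heavyweight imports inside its own methods so that merely importing the package does not require every dependency to be installed.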