ColossalAI/examples/language/llama2/attn.py

Symbolic link to:

../../../applications/Colossal-LLaMA-2/colossal_llama2/utils/flash_attention_patch.py