mirror of https://github.com/hpcaitech/ColossalAI
[triton] added copyright information for flash attention (#2835)
* [triton] added copyright information for flash attention
* polish code
parent 7ea6bc7f69
commit 918bc94b6b
20  LICENSE
@@ -201,17 +201,31 @@ Copyright 2021- HPC-AI Technology Inc. All rights reserved.
 See the License for the specific language governing permissions and
 limitations under the License.
 
-## Some of colossal-ai's code is derived from Alpa, which is subject to the following copyright notice:
+## Some of colossal-ai's code is derived from other projects, which is subject to the following copyright notice:
 
 Copyright 2021 The Alpa team.
 
 Licensed under the Apache License, Version 2.0 (the "License");
 you may not use this file except in compliance with the License.
 You may obtain a copy of the License at
 
 https://github.com/alpa-projects/alpa/blob/979a45a3e6187df941ef4a4c4c6eea664527d68d/LICENSE
 
 Unless required by applicable law or agreed to in writing, software
 distributed under the License is distributed on an "AS IS" BASIS,
 WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 See the License for the specific language governing permissions and
 limitations under the License.
 
+-------------------------------------------------
+
+Copyright 2018-2020 Philippe Tillet
+Copyright 2020-2022 OpenAI
+
+Permission is hereby granted, free of charge, to any person obtaining
+a copy of this software and associated documentation files
+(the "Software"), to deal in the Software without restriction,
+including without limitation the rights to use, copy, modify, merge,
+publish, distribute, sublicense, and/or sell copies of the Software,
+and to permit persons to whom the Software is furnished to do so,
+subject to the following conditions:
@@ -1,8 +1,12 @@
 """
-Fused Attention
-===============
-This is a Triton implementation of the Flash Attention algorithm
-(see: Dao et al., https://arxiv.org/pdf/2205.14135v2.pdf; Rabe and Staats https://arxiv.org/pdf/2112.05682v2.pdf; Triton https://github.com/openai/triton)
+The triton-based flash attention implementation is copied from the OpenAI/triton repository.
+
+You can find the repository at https://github.com/openai/triton
+You can find the source file at https://github.com/openai/triton/blob/main/python/tutorials/06-fused-attention.py
+
+Reference:
+1. Dao et al., https://arxiv.org/pdf/2205.14135v2.pdf
+2. Rabe and Staats, https://arxiv.org/pdf/2112.05682v2.pdf
 """
 
 import math
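For context on what the fused kernel in this file computes, the following is a minimal, unfused PyTorch sketch of the same scaled-dot-product attention, softmax(Q @ K^T * sm_scale) @ V. The tensor layout and the sm_scale name follow the OpenAI Triton tutorial linked above; the function itself is illustrative and not part of ColossalAI's API.

```python
import torch

def naive_attention(q, k, v, sm_scale):
    # q, k, v: (batch, heads, seq_len, head_dim); sm_scale is typically 1/sqrt(head_dim).
    # This materializes the full (seq_len, seq_len) score matrix, which is exactly the
    # memory traffic the fused Triton kernel avoids by tiling over K/V blocks.
    scores = torch.matmul(q, k.transpose(-2, -1)) * sm_scale
    probs = torch.softmax(scores.float(), dim=-1).to(v.dtype)
    return torch.matmul(probs, v)
```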
@@ -56,7 +60,8 @@ except ImportError:
     print('please install xformers from https://github.com/facebookresearch/xformers')
 
 if HAS_TRITON:
-
+    # the following functions are adapted from the OpenAI Triton tutorial
+    # https://github.com/openai/triton/blob/main/python/tutorials/06-fused-attention.py
     @triton.jit
     def _fwd_kernel(
         Q,
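The except ImportError: context and the if HAS_TRITON: guard above follow a common optional-dependency pattern: try the import, record the outcome in a flag, and only define the @triton.jit kernels when the import succeeded, so the module still loads on machines without triton. A minimal sketch of that pattern follows; only the HAS_TRITON name and the @triton.jit usage come from this diff, and the tiny copy kernel is purely illustrative rather than the file's _fwd_kernel.

```python
import torch

# Guard the optional triton dependency behind a flag (a sketch, not ColossalAI's exact code).
try:
    import triton
    import triton.language as tl
    HAS_TRITON = True
except ImportError:
    HAS_TRITON = False
    print('please install triton from https://github.com/openai/triton')

if HAS_TRITON:

    @triton.jit
    def _copy_kernel(src_ptr, dst_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
        # Each program instance handles one BLOCK_SIZE-wide tile, masked at the tail.
        pid = tl.program_id(axis=0)
        offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
        mask = offsets < n_elements
        tl.store(dst_ptr + offsets, tl.load(src_ptr + offsets, mask=mask), mask=mask)

    # Launch with a 1-D grid sized to cover the tensor.
    x = torch.randn(1000, device='cuda')
    y = torch.empty_like(x)
    grid = (triton.cdiv(x.numel(), 256),)
    _copy_kernel[grid](x, y, x.numel(), BLOCK_SIZE=256)
```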