ColossalAI/colossalai/kernel/cuda_native

Latest commit: 918bc94b6b by Frank Lee, "[triton] added copyright information for flash attention (#2835)", 2 years ago
csrc                    [hotfix] fix error for torch 2.0 (#2243)                                2 years ago
__init__.py             [kernel] fixed repeated loading of kernels (#2549)                      2 years ago
flash_attention.py      [triton] added copyright information for flash attention (#2835)        2 years ago
layer_norm.py           [kernel] fixed repeated loading of kernels (#2549)                      2 years ago
multihead_attention.py  [setup] support pre-build and jit-build of cuda kernels (#2374)         2 years ago
scaled_softmax.py       [kernel] fixed repeated loading of kernels (#2549)                      2 years ago