ColossalAI/colossalai/kernel/cuda_native
Latest commit: 85178a397a by xcnick, "[hotfix] fix error for torch 2.0 (#2243)", 2022-12-30 23:11:55 +08:00
Name                      Last commit message                                                Last commit date
csrc                      [hotfix] fix error for torch 2.0 (#2243)                           2022-12-30 23:11:55 +08:00
__init__.py               updated flash attention api                                        2022-11-15 15:25:39 +08:00
flash_attention.py        updated attention kernel (#2133)                                   2022-12-16 10:54:03 +08:00
layer_norm.py             [kernel] move all symlinks of kernel to `colossalai._C` (#1971)    2022-11-17 13:42:33 +08:00
multihead_attention.py    [builder] multihead attn runtime building (#2203)                  2022-12-27 16:06:09 +08:00
scaled_softmax.py         [builder] builder for scaled_upper_triang_masked_softmax (#2234)   2022-12-30 09:58:00 +08:00