ColossalAI/colossalai/kernel/cuda_native
Latest commit: 501a9e9cd2 by oahzxl, [hotfix] polish flash attention (#1802), 2022-11-07 14:30:22 +08:00
csrc [hotfix] fix CPUAdam kernel nullptr (#1410) 2022-08-05 19:45:45 +08:00
__init__.py refactor kernel (#142) 2022-01-13 16:47:17 +08:00
flash_attention.py [hotfix] polish flash attention (#1802) 2022-11-07 14:30:22 +08:00
layer_norm.py [NFC] polish colossalai/kernel/cuda_native/layer_norm.py code style (#980) 2022-05-17 10:25:06 +08:00
multihead_attention.py [format] format fixed for kernel/cuda_native code (#335) 2022-03-11 15:50:28 +08:00
scaled_softmax.py [NFC] polish colossalai/kernel/cuda_native/scaled_softmax.py code style (#955) 2022-05-17 10:25:06 +08:00
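
This directory collects ColossalAI's fused CUDA kernels (layer norm, multi-head attention, scaled softmax, and a flash attention wrapper), with the C++/CUDA sources kept under csrc. A minimal usage sketch, assuming the package re-exports a fused LayerNorm from layer_norm.py (the exact export name is an assumption; torch.nn.LayerNorm is used as a fallback so the sketch runs even without the CUDA extension built):

```python
import torch

try:
    # Assumed export: a fused CUDA LayerNorm provided by this package.
    from colossalai.kernel.cuda_native import LayerNorm
except ImportError:
    # Fallback if the extension is not built or the export name differs.
    from torch.nn import LayerNorm

device = "cuda" if torch.cuda.is_available() else "cpu"
x = torch.randn(8, 512, 1024, device=device)   # (batch, seq_len, hidden)
norm = LayerNorm(1024).to(device)              # normalize over the hidden dim
y = norm(x)
print(y.shape)                                 # torch.Size([8, 512, 1024])
```

The fused kernel is intended as a drop-in replacement for torch.nn.LayerNorm, so swapping it in should not require changes to surrounding model code.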