ColossalAI/colossalai/kernel/cuda_native
Latest commit: zbian 6877121377 "updated flash attention api" (2022-11-15 15:25:39 +08:00)
csrc                      [hotfix] fix build error when torch version >= 1.13 (#1803)                      2022-11-08 09:40:24 +08:00
__init__.py               updated flash attention api                                                      2022-11-15 15:25:39 +08:00
flash_attention.py        updated flash attention api                                                      2022-11-15 15:25:39 +08:00
layer_norm.py             [NFC] polish colossalai/kernel/cuda_native/layer_norm.py code style (#980)       2022-05-17 10:25:06 +08:00
multihead_attention.py
scaled_softmax.py         [NFC] polish colossalai/kernel/cuda_native/scaled_softmax.py code style (#955)   2022-05-17 10:25:06 +08:00