ColossalAI/colossalai/kernel/cuda_native
Latest commit: 077a66dd81 "updated attention kernel (#2133)" by アマデウス, 2022-12-16 10:54:03 +08:00
Name                     Last commit message                                               Last commit date
csrc                     [optimizer] add div_scale for optimizers (#2117)                  2022-12-12 17:58:57 +08:00
__init__.py              updated flash attention api                                       2022-11-15 15:25:39 +08:00
flash_attention.py       updated attention kernel (#2133)                                  2022-12-16 10:54:03 +08:00
layer_norm.py            [kernel] move all symlinks of kernel to `colossalai._C` (#1971)   2022-11-17 13:42:33 +08:00
multihead_attention.py   [kernel] move all symlinks of kernel to `colossalai._C` (#1971)   2022-11-17 13:42:33 +08:00
scaled_softmax.py        [kernel] move all symlinks of kernel to `colossalai._C` (#1971)   2022-11-17 13:42:33 +08:00
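
For orientation, below is a minimal plain-PyTorch sketch of the scaled dot-product attention that the attention kernels in this directory accelerate. This is a reference implementation only, not ColossalAI's API: the function name, tensor shapes, and the causal-mask option are illustrative assumptions.

# Reference (unfused) scaled dot-product attention in plain PyTorch.
# Illustrative sketch only; not the API exposed by flash_attention.py.
import math
import torch

def reference_attention(q, k, v, causal=False):
    # q, k, v: (batch, heads, seq_len, head_dim) -- assumed layout for this sketch
    scores = torch.matmul(q, k.transpose(-2, -1)) / math.sqrt(q.size(-1))
    if causal:
        seq_len = q.size(-2)
        mask = torch.triu(
            torch.ones(seq_len, seq_len, dtype=torch.bool, device=q.device),
            diagonal=1,
        )
        scores = scores.masked_fill(mask, float("-inf"))
    # scale + mask + softmax is the step a fused scaled-softmax kernel would combine
    probs = torch.softmax(scores, dim=-1)
    return torch.matmul(probs, v)

# Example usage with small random tensors
q = torch.randn(2, 4, 16, 32)
k = torch.randn(2, 4, 16, 32)
v = torch.randn(2, 4, 16, 32)
out = reference_attention(q, k, v, causal=True)  # shape (2, 4, 16, 32)

A fused CUDA kernel computes the same result without materializing the full attention-score matrix in global memory, which is where the speed and memory savings come from.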