ColossalAI/colossalai/kernel/cuda_native
Latest commit: updated attention kernel (#2133) by アマデウス (077a66dd81), 2 years ago
Name                     Last commit                                                         Age
csrc                     [optimizer] add div_scale for optimizers (#2117)                    2 years ago
__init__.py              updated flash attention api                                         2 years ago
flash_attention.py       updated attention kernel (#2133)                                    2 years ago
layer_norm.py            [kernel] move all symlinks of kernel to `colossalai._C` (#1971)     2 years ago
multihead_attention.py   [kernel] move all symlinks of kernel to `colossalai._C` (#1971)     2 years ago
scaled_softmax.py        [kernel] move all symlinks of kernel to `colossalai._C` (#1971)     2 years ago
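The files above hold ColossalAI's CUDA-native kernel wrappers, with the compiled sources under csrc. As a rough orientation only, the sketch below shows how such kernels are typically consumed from Python. The import path mirrors the directory listing, but the exported names (LayerNorm, MultiHeadAttention) and their constructor arguments are assumptions for illustration, not an API verified against this pinned revision.

```python
import torch

# Hedged sketch: the module path follows the listing above; the exported
# names and signatures below are assumptions, not a verified API.
from colossalai.kernel.cuda_native import LayerNorm, MultiHeadAttention

batch, seq_len, hidden, heads = 8, 128, 1024, 16

# Fused CUDA layer norm, assumed to be a drop-in for torch.nn.LayerNorm.
norm = LayerNorm(hidden).half().cuda()

# Fused multi-head attention backed by multihead_attention.py
# (the constructor arguments here are illustrative assumptions).
attn = MultiHeadAttention(hidden_size=hidden,
                          nhead=heads,
                          batch_size=batch,
                          max_seq_len=seq_len,
                          dropout=0.1).cuda()

x = torch.randn(batch, seq_len, hidden, device="cuda", dtype=torch.float16)
mask = torch.zeros(batch, seq_len, device="cuda", dtype=torch.float16)

out = attn(norm(x), mask)
print(out.shape)  # expected: (batch, seq_len, hidden)
```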