ColossalAI/colossalai/kernel/cuda_native

Latest commit: 8b7495dd54 by Frank Lee, "[example] integrate seq-parallel tutorial with CI (#2463)", 2 years ago
File                     Last commit message                                               Last updated
csrc                     [hotfix] fix error for torch 2.0 (#2243)                          2 years ago
__init__.py              updated flash attention api                                       2 years ago
flash_attention.py       updated attention kernel (#2133)                                  2 years ago
layer_norm.py            [hotfix] issue #2388                                              2 years ago
multihead_attention.py   [setup] support pre-build and jit-build of cuda kernels (#2374)   2 years ago
scaled_softmax.py        [example] integrate seq-parallel tutorial with CI (#2463)         2 years ago