ColossalAI/colossalai/kernel/cuda_native
Latest commit: 7bc0afc901 by zbian ("updated flash attention usage"), 2023-03-20 17:57:04 +08:00
csrc/                    [doc] add deepspeed citation and copyright (#2996)                 2023-03-04 20:08:11 +08:00
__init__.py              [kernel] fixed repeated loading of kernels (#2549)                 2023-02-03 09:47:13 +08:00
flash_attention.py       updated flash attention usage                                      2023-03-20 17:57:04 +08:00
layer_norm.py            [kernel] fixed repeated loading of kernels (#2549)                 2023-02-03 09:47:13 +08:00
multihead_attention.py   [setup] support pre-build and jit-build of cuda kernels (#2374)    2023-01-06 20:50:26 +08:00
scaled_softmax.py        [kernel] added kernel loader to softmax autograd function (#3093)  2023-03-10 14:27:09 +08:00