# ColossalAI/colossalai/kernel/cuda_native/csrc

Latest commit `ae7c338105` by zhengzangw (3 years ago): [NFC] polish colossalai/kernel/cuda_native/csrc/colossal_C_frontend.cpp code style
| File | Last commit message | Last updated |
| --- | --- | --- |
| `kernels/` | [kernel] fixed the include bug in dropout kernel (#999) | 3 years ago |
| `colossal_C_frontend.cpp` | [NFC] polish colossalai/kernel/cuda_native/csrc/colossal_C_frontend.cpp code style | 3 years ago |
| `compat.h` | refactor kernel (#142) | 3 years ago |
| `cpu_adam.cpp` | [NFC] polish colossalai/kernel/cuda_native/csrc/cpu_adam.cpp code style (#936) | 3 years ago |
| `cpu_adam.h` | [NFC] polish colossalai/kernel/cuda_native/csrc/cpu_adam.h code style (#945) | 3 years ago |
| `layer_norm_cuda.cpp` | [NFC] polish colossalai/kernel/cuda_native/csrc/layer_norm_cuda.cpp code style (#973) | 3 years ago |
| `layer_norm_cuda_kernel.cu` | [NFC] polish colossalai/kernel/cuda_native/csrc/layer_norm_cuda_kernel.cu code style (#661) | 3 years ago |
| `moe_cuda.cpp` | [NFC] polish colossalai/kernel/cuda_native/csrc/moe_cuda.cpp code style (#942) | 3 years ago |
| `moe_cuda_kernel.cu` | [NFC] polish moe_cuda_kernel.cu code style (#940) | 3 years ago |
| `multi_tensor_adam.cu` | [NFC] polish colossalai/kernel/cuda_native/csrc/multi_tensor_adam.cu code style (#667) | 3 years ago |
| `multi_tensor_apply.cuh` | refactor kernel (#142) | 3 years ago |
| `multi_tensor_l2norm_kernel.cu` | [NFC] polish colossalai/kernel/cuda_native/csrc/multi_tensor_l2norm_kernel.cu code style (#958) | 3 years ago |
| `multi_tensor_lamb.cu` | [NFC] polish colossalai/kernel/cuda_native/csrc/multi_tensor_lamb.cu code style (#937) | 3 years ago |
| `multi_tensor_scale_kernel.cu` | [NFC] polish colossalai/kernel/cuda_native/csrc/multi_tensor_scale_kernel.cu code style (#977) | 3 years ago |
| `multi_tensor_sgd_kernel.cu` | [NFC] polish colossalai/kernel/cuda_native/csrc/multi_tensor_sgd_kernel.cu code style (#978) | 3 years ago |
| `multihead_attention_1d.cpp` | [NFC] polish colossalai/kernel/cuda_native/csrc/multihead_attention_1d.cpp code style (#952) | 3 years ago |
| `multihead_attention_1d.h` | [NFC] polish colossalai/kernel/cuda_native/csrc/multihead_attention_1d.h code style (#962) | 3 years ago |
| `scaled_masked_softmax.cpp` | add colossalai kernel module (#55) | 3 years ago |
| `scaled_masked_softmax.h` | add colossalai kernel module (#55) | 3 years ago |
| `scaled_masked_softmax_cuda.cu` | [NFC] polish colossalai/kernel/cuda_native/csrc/scaled_masked_softmax_cuda.cu code style (#949) | 3 years ago |
| `scaled_upper_triang_masked_softmax.cpp` | [NFC] polish colossalai/kernel/cuda_native/csrc/scaled_upper_triang_masked_softmax.cpp code style (#959) | 3 years ago |
| `scaled_upper_triang_masked_softmax.h` | add colossalai kernel module (#55) | 3 years ago |
| `scaled_upper_triang_masked_softmax_cuda.cu` | [NFC] polish colossalai/kernel/cuda_native/csrc/scaled_upper_triang_masked_softmax_cuda.cu code style (#943) | 3 years ago |
| `type_shim.h` | [cuda] modify the fused adam, support hybrid of fp16 and fp32 (#497) | 3 years ago |
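The naming convention is informative: the `*_cuda.cu` / `*_kernel.cu` files hold the device code, the `.cpp` files are the PyTorch extension frontends that declare those kernels and expose them to Python, and `multi_tensor_apply.cuh` provides the chunking helper that lets the fused optimizer kernels (Adam, LAMB, SGD, L2-norm, scale) process many tensors in a single launch. As a rough sketch of how such a frontend is wired up (the function name and signature below are illustrative assumptions modeled on Apex-style multi-tensor frontends, not copied from `colossal_C_frontend.cpp`):

```cpp
// Hypothetical frontend sketch in the style of colossal_C_frontend.cpp.
// The declaration below is an illustrative assumption; consult the actual
// source for the real parameter list.
#include <torch/extension.h>
#include <vector>

// Implemented in a .cu file (e.g. multi_tensor_adam.cu): applies a fused
// Adam update to every tensor in `tensor_lists` (e.g. {grads, params,
// exp_avgs, exp_avg_sqs}), chunked through the multi_tensor_apply helper.
void multi_tensor_adam_cuda(int chunk_size, at::Tensor noop_flag,
                            std::vector<std::vector<at::Tensor>> tensor_lists,
                            const float lr, const float beta1,
                            const float beta2, const float epsilon,
                            const int step, const int mode,
                            const int bias_correction,
                            const float weight_decay);

PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {
  m.def("multi_tensor_adam", &multi_tensor_adam_cuda,
        "Fused Adam update over a list of tensors (CUDA)");
}
```

A frontend like this is compiled together with its `.cu` sources into a single importable module, typically via PyTorch's `torch.utils.cpp_extension` tooling (`CUDAExtension` in `setup.py`, or `load()` for JIT builds).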