mirror of https://github.com/hpcaitech/ColossalAI
Latest commit: 121d7ad629 by 傅剑寒 (7 months ago)
| Name | Last updated |
|---|---|
| attention | 7 months ago |
| utils | 7 months ago |
| activation_kernel.cu | 7 months ago |
| context_kv_cache_memcpy_kernel.cu | 7 months ago |
| convert_fp8_kernel.cu | 7 months ago |
| decode_kv_cache_memcpy_kernel.cu | 7 months ago |
| flash_decoding_attention_kernel.cu | 7 months ago |
| fused_rotary_emb_and_cache_kernel.cu | 7 months ago |
| get_cos_and_sin_kernel.cu | 7 months ago |
| layer_norm_kernel.cu | 7 months ago |
| moe_kernel.cu | 7 months ago |
| multi_tensor_adam_kernel.cu | 7 months ago |
| multi_tensor_apply.cuh | 7 months ago |
| multi_tensor_l2norm_kernel.cu | 7 months ago |
| multi_tensor_lamb_kernel.cu | 7 months ago |
| multi_tensor_scale_kernel.cu | 7 months ago |
| multi_tensor_sgd_kernel.cu | 7 months ago |
| rms_layernorm_kernel.cu | 7 months ago |
| scaled_masked_softmax_kernel.cu | 7 months ago |
| scaled_upper_triang_masked_softmax_kernel.cu | 7 months ago |