mirror of https://github.com/hpcaitech/ColossalAI
Latest commit: bce919708f by Cuiqing Li (李崔卿), 1 year ago
| File | Last updated |
|---|---|
| __init__.py | 1 year ago |
| context_attention.py | 1 year ago |
| copy_kv_cache_dest.py | 1 year ago |
| custom_autotune.py | 1 year ago |
| flash_decoding.py | 1 year ago |
| fused_layernorm.py | 1 year ago |
| gptq_triton.py | 1 year ago |
| int8_rotary_embedding_kernel.py | 1 year ago |
| llama_act_combine_kernel.py | 1 year ago |
| qkv_matmul_kernel.py | 1 year ago |
| self_attention_nofusion.py | 1 year ago |
| smooth_attention.py | 1 year ago |
| softmax.py | 1 year ago |
| token_attention_kernel.py | 1 year ago |