ColossalAI/colossalai/kernel/triton
Latest commit: 32e7f99416 by Xuanlei Zhao — [kernel] update triton init #4740 (#4740), 2023-09-18 09:44:27 +08:00
| File | Last commit | Date |
| --- | --- | --- |
| __init__.py | [kernel] update triton init #4740 (#4740) | 2023-09-18 09:44:27 +08:00 |
| context_attention.py | [Feature] The first PR to Add TP inference engine, kv-cache manager and related kernels for our inference system (#4577) | 2023-09-12 01:22:56 +08:00 |
| copy_kv_cache_dest.py | [Feature] The first PR to Add TP inference engine, kv-cache manager and related kernels for our inference system (#4577) | 2023-09-12 01:22:56 +08:00 |
| fused_layernorm.py | [Feature] The first PR to Add TP inference engine, kv-cache manager and related kernels for our inference system (#4577) | 2023-09-12 01:22:56 +08:00 |
| qkv_matmul_kernel.py | [Kernels] added triton-implemented of self attention for colossal-ai (#4241) | 2023-07-18 23:53:38 +08:00 |
| rms_norm.py | [Feature] The first PR to Add TP inference engine, kv-cache manager and related kernels for our inference system (#4577) | 2023-09-12 01:22:56 +08:00 |
| rotary_embedding_kernel.py | [Feature] The first PR to Add TP inference engine, kv-cache manager and related kernels for our inference system (#4577) | 2023-09-12 01:22:56 +08:00 |
| self_attention_nofusion.py | [Feature] The first PR to Add TP inference engine, kv-cache manager and related kernels for our inference system (#4577) | 2023-09-12 01:22:56 +08:00 |
| softmax.py | [Feature] The first PR to Add TP inference engine, kv-cache manager and related kernels for our inference system (#4577) | 2023-09-12 01:22:56 +08:00 |
| token_attention_kernel.py | [Feature] The first PR to Add TP inference engine, kv-cache manager and related kernels for our inference system (#4577) | 2023-09-12 01:22:56 +08:00 |
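
Each of these modules wraps one or more Triton JIT kernels used by the TP inference engine (attention, KV-cache copy, normalization, rotary embedding, softmax). As an illustration of the general pattern such files follow, below is a minimal sketch of a row-wise softmax kernel in Triton. This is not the implementation in softmax.py; the kernel name, the launch wrapper, and the assumption of a contiguous row-major 2-D input are all illustrative.

```python
import torch
import triton
import triton.language as tl


@triton.jit
def _softmax_kernel(out_ptr, in_ptr, n_cols, BLOCK_SIZE: tl.constexpr):
    # Each program instance handles one row of the input matrix.
    row = tl.program_id(0)
    col_offsets = tl.arange(0, BLOCK_SIZE)
    mask = col_offsets < n_cols
    # Load one row; out-of-range columns are padded with -inf so they do not
    # affect the max or the sum.
    x = tl.load(in_ptr + row * n_cols + col_offsets, mask=mask, other=-float("inf"))
    x = x - tl.max(x, axis=0)  # subtract the row max for numerical stability
    num = tl.exp(x)
    denom = tl.sum(num, axis=0)
    tl.store(out_ptr + row * n_cols + col_offsets, num / denom, mask=mask)


def softmax(x: torch.Tensor) -> torch.Tensor:
    # Illustrative launch wrapper (hypothetical, not the ColossalAI API):
    # one program per row, with BLOCK_SIZE rounded up to the next power of
    # two so a single tl.arange covers every column and masking handles
    # the tail.
    n_rows, n_cols = x.shape
    out = torch.empty_like(x)
    BLOCK_SIZE = triton.next_power_of_2(n_cols)
    _softmax_kernel[(n_rows,)](out, x, n_cols, BLOCK_SIZE=BLOCK_SIZE)
    return out
```

The one-program-per-row layout keeps the reduction (max and sum) entirely within a single program instance, which is the typical design choice for row-wise kernels like softmax, layer norm, and RMS norm in this directory.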