ColossalAI/colossalai/kernel
Latest commit: 7a3dfd0c64 by flybird1111, "[shardformer] update shardformer to use flash attention 2 (#4392)", 1 year ago
Name           Last commit                                                                      Last updated
cuda_native    [shardformer] update shardformer to use flash attention 2 (#4392)               1 year ago
jit            fix Tensor is not defined (#4129)                                                1 year ago
triton         [Kernels] added triton-implemented of self attention for colossal-ai (#4241)    1 year ago
__init__.py    [setup] support pre-build and jit-build of cuda kernels (#2374)                  2 years ago
op_builder     [builder] reconfig op_builder for pypi install (#2314)                          2 years ago