ColossalAI/colossalai/kernel
flybird1111 7a3dfd0c64 [shardformer] update shardformer to use flash attention 2 (#4392)
* cherry-pick flash attention 2

* [shardformer] update shardformer to use flash attention 2, fix
2023-08-15 23:25:14 +08:00
cuda_native [shardformer] update shardformer to use flash attention 2 (#4392) 2023-08-15 23:25:14 +08:00
jit fix Tensor is not defined (#4129) 2023-07-03 17:10:18 +08:00
triton [Kernels] added triton-implemented of self attention for colossal-ai (#4241) 2023-07-18 23:53:38 +08:00
__init__.py [setup] support pre-build and jit-build of cuda kernels (#2374) 2023-01-06 20:50:26 +08:00
op_builder [builder] reconfig op_builder for pypi install (#2314) 2023-01-04 16:32:32 +08:00