ColossalAI/colossalai/kernel

Latest commit: flybird1111 7a3dfd0c64 — [shardformer] update shardformer to use flash attention 2 (#4392), 1 year ago

Contents:
- cuda_native/ — [shardformer] update shardformer to use flash attention 2 (#4392), 1 year ago
- jit/
- triton/ — [Kernels] added triton implementation of self attention for colossal-ai (#4241), 1 year ago
- op_builder/
- __init__.py